Two good pieces of news today. Both Apple and Google have announced that the most recent versions of their mobile operating systems will encrypt user data by default. Google:
The next generation of Google’s Android operating system, due for release next month, will encrypt data by default for the first time, the company said Thursday, raising yet another barrier to police gaining access to the troves of personal data typically kept on smartphones.
Android has offered optional encryption on some devices since 2011, but security experts say few users have known how to turn on the feature. Now Google is designing the activation procedures for new Android devices so that encryption happens automatically; only somebody who enters a device’s password will be able to see the pictures, videos and communications stored on those smartphones.
Rather than comply with binding court orders, Apple has reworked its latest encryption in a way that prevents the company – or anyone but the device’s owner – from gaining access to the vast troves of user data typically stored on smartphones or tablet computers.
The key is the encryption that Apple mobile devices automatically put in place when a user selects a passcode, making it difficult for anyone who lacks that passcode to access the information within, including photos, e-mails and recordings.
This does not stop data leaking from just about every application that you run on either platform. Who needs encryption when your phone readily gives away the data?
Exactly. Google doesn’t care to enable server-to-server encryption in Google Talk, for example (because they are pushing all their users to the non-federated Hangouts). Since most XMPP servers have already made such encryption mandatory, Google Talk contacts were simply cut off for users of other XMPP services.
In addition to any data the apps send, you’d be a fool to think these companies don’t have a master key they can give to law enforcement. They don’t care about you more than they care about their futures, and openly thwarting governments would get them into a world of trouble they don’t want. Hence the need to make users feel safe while not putting themselves in the crosshairs.
Android’s source is open, you can go ahead and check that there’s no such back door. Of course you then need to run the verified code, not just run a commercial off-the-shelf build, but many privacy-conscious people are already doing that with custom firmware builds. Alternatively, you can check the binary to make sure the built code corresponds to the source, though this is more labor intensive and error-prone.
As for iOS, I agree, all we have to go on is Apple pinky-swearing that it won’t build in a back door. I don’t think anybody who is privacy- or security-conscious will take them at their word, though.
For encryption such as AES to be effective, the keys must not be stored, and they must not be derived from low-entropy sources such as a PIN (realistically, how many mobile passwords have more than a trivial amount of entropy?).
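To make the entropy point concrete, here is a quick Python sketch; the salt and iteration count are illustrative values, not anything a real device uses. A key-derivation function such as PBKDF2 can stretch a PIN into a 256-bit key, but it cannot add entropy: the keyspace an attacker must search stays at 10,000.

```python
import hashlib
import math

# A 4-digit PIN admits only 10,000 values: about 13.3 bits of entropy.
pin_keyspace = 10 ** 4
entropy_bits = math.log2(pin_keyspace)

# PBKDF2 stretches the PIN into a 256-bit key, but the derived key is no
# stronger than the PIN: an attacker still only has 10,000 guesses to try.
salt = b"per-device-salt"   # illustrative; real devices use a random salt
key = hashlib.pbkdf2_hmac("sha256", b"1234", salt, 100_000)
print(f"PIN entropy: {entropy_bits:.1f} bits; derived key: {len(key) * 8} bits")
```

The iteration count only multiplies the cost of each guess; it does not change the size of the search space.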
Does anyone know if either will use hardware-based security chips that delete the keys after too many attempts? It’s a precarious balance. If the keys self-destruct after bad login attempts, that could easily result in data loss, accidental or intentional (oh no, the kids deleted everything).
Both platforms support device wipes after X unsuccessful passcode attempts. And yes, the potential for data loss is high.
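As a rough illustration, the wipe-on-failure policy amounts to logic like the following; MAX_ATTEMPTS is an assumed value, and the real limits and enforcement live in the OS or secure hardware, not in app-level code:

```python
# Sketch of a wipe-after-N-failures passcode gate. This is an assumption of
# the general mechanism, not either platform's actual implementation.
MAX_ATTEMPTS = 10


class PasscodeGate:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0          # a success resets the counter
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True           # keys destroyed; data unrecoverable
        return False
```

Note this only protects data while the attacker is forced through this code path; reading the storage directly sidesteps the counter entirely, which is where the encryption itself has to carry the load.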
leos,
I kind of figured that, but I didn’t know whether this crypto key vault is implemented in silicon or is merely a software feature. If the latter, it wouldn’t surprise me in the least if existing device-cloning technology could be used as-is. Obviously any brute-forcing run against a cloned image faces no limit on the number of attempts.
The reason I’m asking is that the sources don’t make clear whether the encryption is done in software (which only deters casual break-ins) or in hardware, which is needed to be truly effective against well-equipped adversaries who can completely bypass the normal software stack.
Sounds like all this crypto is for nothing.
Against an unskilled attacker, it buys you nothing: you don’t need crypto, just have the device wipe itself after enough failed attempts.
Against a skilled attacker (.gov), it buys you almost nothing: they just copy the phone’s storage and brute-force away.
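That offline attack is easy to sketch. Everything here is illustrative – the salt, the target PIN, and a deliberately low iteration count so the demo runs fast – but the point stands: once the retry limit is bypassed by working on a copy, a 4-digit keyspace falls almost instantly.

```python
import hashlib

# Offline attack sketch: once the encrypted store is cloned, the OS retry
# limit no longer applies, so every PIN can be tried exhaustively.
SALT = b"cloned-device-salt"   # illustrative, not a real device artifact
ITERATIONS = 1_000             # kept low for the demo; real counts are higher,
                               # which slows but cannot stop a 10,000-value search


def derive(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)


target = derive("7291")        # key material recovered from the cloned image

for candidate in (f"{n:04d}" for n in range(10_000)):
    if derive(candidate) == target:
        print(f"PIN recovered: {candidate}")
        break
```

Raising the PBKDF2 iteration count a thousandfold would turn seconds into hours – still well within reach of a motivated attacker.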
I don’t know about you, but I’d rather have my phone wiped if a thief snatches it off a bar, than give them full access to my email and all the associated password reset capabilities of every online service.
This is only going to be real news for most Android users when the next version of Android finally spreads through the user base. No rush.
Although Android’s existing implementation is somewhat flawed in that it mandates that the screen-lock code be the same as the passphrase protecting the master decryption key, this is merely a GUI problem. Internally, the keys are kept completely separate, and on a rooted device you can change it so that you enter your complicated master decryption passphrase only when you boot the device, and then use the short PIN only for screen locking.
That way an adversary who gets your device will be stopped by the screen lock retry limit when the device is powered on and when they power it off to get at the flash, they’ll be stopped by the much more complicated passphrase used to encrypt the decryption key in persistent storage.
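A minimal sketch of that two-tier design, under the assumption of a key-wrapping scheme along these lines; XOR wrapping stands in for a real AES key wrap purely for illustration:

```python
import hashlib
import os

# Two-tier scheme: a random data-encryption key (DEK) encrypts the flash
# contents, and only the DEK -- never the data -- is wrapped under a key
# derived from the long boot passphrase. The short screen-lock PIN is
# enforced separately by the running OS and never touches the DEK.
# XOR "wrapping" below is for illustration only; real systems use AES key wrap.


def kek_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)


def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


salt = os.urandom(16)
dek = os.urandom(32)                              # random, full-entropy key
kek = kek_from_passphrase("long boot passphrase", salt)
wrapped = xor_bytes(dek, kek)                     # this is what sits on flash

# Changing the passphrase only re-wraps the DEK; the data stays untouched.
new_kek = kek_from_passphrase("even longer passphrase", salt)
rewrapped = xor_bytes(xor_bytes(wrapped, kek), new_kek)
```

The design choice matters: because the data is encrypted under the random DEK, changing the boot passphrase only re-wraps 32 bytes instead of re-encrypting the whole flash.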
Apple has a new page about their stance and details on privacy here: http://www.apple.com/privacy/
This is close to useless. Most of the interesting data is either backed up in the cloud – and thus reachable by law enforcement with a warrant, or by the NSA without one; backup is still on by default, so encryption-by-default on the device is worthless for that data – or it is peripheral to the device rather than inside it (e.g. your location, broadcast all the time; see http://www.washingtonpost.com/business/technology/for-sale-systems-… for a good example of how easy it is for anybody with a few thousand dollars to track you around the globe).
Indeed, my thoughts exactly!
Unless this new device-side encryption means that the data is no longer transferred in clear text to the backup service.
This would be a real improvement, but I doubt that is what this is about. More likely a form of security theatre.
Nope. Otherwise it would be unreadable on other devices – except ones with the same password – and unshareable.
Also, what is the password used for that encryption? The 4-digit PIN? Something stronger, more resilient to brute force?
See also:
http://www.theverge.com/2014/9/18/6404767/apple-offers-mixed-signal…
This was meant rhetorically, or sarcastically, since it would be a miracle of biblical proportions if any of the big tech companies did anything that even remotely improved their customers’ safety, let alone “voluntarily”.
Isn’t that already the case? Do any of these cloud services do unauthenticated login?
I am not aware of any mobile platform without a content-sharing framework.
It is one of the things people expect a platform to have, no?
Anyway, what annoys me most is that the media, in this case the Washington Post, lets these companies tout this as an improvement instead of calling them out for its absence until now.
Whether you back up your data to the cloud is completely under your control. You might want to, you know, not do it if you’re concerned.
The NSA, FBI, DEA and DHS spent millions in US taxpayer revenue on multiple full-page articles complaining about how ‘secure’ and ‘unbreakable’ Apple’s iMessage encryption is.
In less than 24 hours I had confirmed that ALL iMessage ‘encrypted’ messages were stored as PLAIN TEXT on Apple’s servers, which were backdoored by the NSA, and that several other domestic and foreign agencies likely had access to that data in plain text. All the glossy articles and appearances by various agency heads were a psyop on US and global citizens, steering them into the false notion that Apple’s iMessage encryption was somehow secure from Obama’s goons and the state-sponsored police-state thugs.
Now it appears Google is attempting a very similar ploy, steering its users away from other asymmetric ciphers and toward dependence on Google’s encryption, when it is entirely likely Google keeps plaintext copies of everything ‘encrypted’ by its new scheme, and that, like Apple’s, Google’s servers are almost certainly backdoored by the NSA et al.
Such chestnut ploys might succeed in roping in a few newbies into thinking that whatever they encrypt with these known NSA-allied companies is actually secure and protects their privacy. In all likelihood, using it is about the same as printing it in plain text on a postcard and mailing it to the NSA yourself.
I don’t think Microsoft, Apple and Google will ever regain users’ trust and credibility, given the mass invasions of citizens’ freedoms, liberties and privacy. Especially Microsoft, which allegedly helped the Mossad and the NSA create the Stuxnet, Duqu and Flame viruses/worms, which infected its Windows customers worldwide, including across America.
If you are concerned about security and encryption, there is no way I would suggest or endorse an in-house encryption scheme from Google or Apple. Obama’s goons have undoubtedly paid a visit to both corporate headquarters armed with various ‘or else’ threats.