In the past, device makers have focused on safeguarding these keys by storing them in secure locations and severely restricting the number of people who have access to them. That’s good, but it leaves those people open to attack by coercion or social engineering. That’s risky for the employees personally, and we believe it creates too much risk for user data.
To mitigate these risks, Google Pixel 2 devices implement insider attack resistance in the tamper-resistant hardware security module that guards the encryption keys for user data. This helps prevent an attacker who manages to produce properly signed malicious firmware from installing it on the security module in a lost or stolen device without the user’s cooperation. Specifically, it is not possible to upgrade the firmware that checks the user’s password unless you present the correct user password. There is a way to “force” an upgrade, for example when a returned device is refurbished for resale, but forcing it wipes the secrets used to decrypt the user’s data, effectively destroying it.
Sounds like Apple’s Secure Enclave from 2013.
Android lagging Apple by five years, people.
Precisely my thoughts on the matter. Though it’s worth noting that Apple’s Secure Enclave is an Apple-style reimplementation of the TPM concept, which is itself an on-board smart-card-like device.
Apple still has the most elegant design of the concept. Being able to do facial and fingerprint recognition in the Secure Enclave itself is a spectacular idea. If you look at the Apple silicon, it’s simply impressive: an ARM chip running a custom OS for the Secure Enclave.
Great. One more embedded SoC with closed firmware that has elevated access to the system and is next to impossible to audit or inspect.
Score minus one? People don’t like hearing facts?