Re: Just a shortcut
Date: 2016-02-25 09:57 pm (UTC)

The actual encryption key is derived from two numbers: one supplied externally, in this case entered by the user, and the other stored internally in the secure hardware. The important thing to keep in mind is that the chip performs all the crypto computations itself, and no key, stored or derived, can ever be retrieved from it.
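A minimal sketch of that two-factor derivation. Apple's actual construction (a hardware AES key "entangled" with the passcode inside the Secure Enclave) is not public in detail; PBKDF2 here merely stands in for some deliberately slow key-derivation function, and all names are illustrative.

```python
import hashlib

def derive_key(user_pin: str, device_secret: bytes) -> bytes:
    """Mix the externally supplied PIN with a secret that, on a real
    device, never leaves the secure hardware. Illustrative only."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        user_pin.encode(),    # factor supplied by the user
        device_secret,        # factor stored inside the chip
        100_000,              # deliberately slow derivation
    )

key = derive_key("1234", b"\x00" * 32)
print(len(key))  # 32-byte derived key
```

The point the comment makes is that on the phone this computation happens entirely inside the chip, so neither factor nor the result is ever observable from outside.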
The secure hardware is built so that it destroys the secret data upon detecting any attempt to tamper with the device, rendering itself unusable. Like any product of human design it might have security holes that would allow access to this data, but so far none appear to have been found in Apple's chip.
This kind of hardware has been used for years in payment processing terminals, where the device is in the physical control of an untrusted party. Apple isn't really doing anything new; it's just another use case for an already well-known technique, so it may well be quite secure by now.
In this case it appears that the secure hardware stores a secret part of the key used to encrypt the data (the other part being the user-entered PIN). No one, not even Apple, can retrieve the stored part or the derived key, and the FBI is not demanding this.
The stock iPhone firmware (which can be replaced by Apple) is built to impose a delay after unsuccessful PIN entry attempts, and can be configured so that after 10 unsuccessful attempts it deletes the keys from the secure hardware. It also provides no way of entering the PIN other than by hand on the screen. But this is firmware, which can be updated, and the FBI is asking for a custom firmware build with these restrictions removed so that it can brute-force the PIN. Since the PIN is 4 digits and the key computation in the secure hardware takes about 80 ms, cracking it would take at most 10,000 × 80 ms = 800 s, or just over 13 minutes.
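The worst-case arithmetic, spelled out:

```python
# Worst-case time to brute-force a 4-digit PIN at 80 ms per attempt,
# once the retry delays and the 10-attempt wipe are removed.
attempts = 10 ** 4       # PINs 0000-9999
per_attempt_s = 0.080    # key-derivation time in the secure hardware
total_s = attempts * per_attempt_s
print(total_s)           # 800.0 seconds, just over 13 minutes
```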
Note that the feds cannot replace the firmware themselves: the iPhone won't install an unsigned build, and only Apple can sign one. Nor can they physically extract the firmware storage from the phone and replace the software directly - the secure chip would destroy the keys. That's why they need Apple's cooperation.
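A toy model of that signed-firmware enforcement. Real iPhones use public-key signatures verified by the boot chain; the HMAC below merely models the property that matters here - only the holder of Apple's secret can produce a tag the device will accept, while the device itself can only check tags, not mint them. Everything in this sketch is illustrative.

```python
import hmac
import hashlib

APPLE_SECRET = b"known only to Apple"  # stands in for Apple's private signing key

def sign_firmware(image: bytes) -> bytes:
    """Only Apple, holding the secret, can produce this tag."""
    return hmac.new(APPLE_SECRET, image, hashlib.sha256).digest()

def install(image: bytes, signature: bytes) -> bool:
    """The device verifies the tag before accepting any firmware."""
    return hmac.compare_digest(sign_firmware(image), signature)

official = b"stock firmware"
print(install(official, sign_firmware(official)))   # accepted
print(install(b"feds' custom build", b"\x00" * 32)) # rejected
```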
Now, most Android devices have no secure hardware at all, so the stored part of the encryption key (if any - the key may also be derived from the password or PIN alone) must live in ordinary hardware and thus be retrievable. For the same reason, the public key used to verify firmware updates must also be replaceable.
Thus, if this device had no secure hardware (I'm not sure whether any Android device on the market today does), the feds could simply extract the storage physically, retrieve the stored key part (if any), and then proceed with the brute-force attack.
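A sketch of that offline attack: with the stored key part read straight out of ordinary flash, candidate keys can be derived on the attacker's own hardware, with no retry delays and no wipe. The KDF choice and all names are illustrative, not any vendor's actual scheme.

```python
import hashlib

def derive(pin: str, stored_part: bytes) -> bytes:
    # Stand-in KDF; the real scheme varies by device.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), stored_part, 1000)

stored_part = b"extracted-from-flash"   # read out of plain hardware
target = derive("4831", stored_part)    # the key protecting the data

# Try all 10,000 PINs offline until one reproduces the target key.
found = next(
    f"{n:04d}" for n in range(10_000)
    if derive(f"{n:04d}", stored_part) == target
)
print(found)  # 4831
```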
I completely disagree that this is an "implementation detail". It is in fact what keeps a device secure while in the physical control of a hostile party.
Now, the OP is obviously right that Apple could and should (but won't) allow the user to replace the update public key. Yet he's wrong to say that, had he replaced that key, not even Google could help the feds; in fact, absent secure hardware, the feds could easily help themselves to it.