The US Government is attempting to force Apple to build a signed image that can be flashed onto an iPhone used by one of the San Bernardino shooters. To their credit, Apple have pushed back against this - there's an explanation of why doing so would be dangerous here. But what's noteworthy is that Apple are arguing that they shouldn't do this, not that they can't do this - Apple (and many other phone manufacturers) have designed their phones such that they can replace the firmware with anything they want.
In order to prevent unauthorised firmware being installed on a device, Apple (and most other vendors) verify that any firmware updates are signed with a trusted key. The FBI don't have access to Apple's firmware signing keys, and as a result they're unable to simply replace the software themselves. That's why they're asking Apple to build a new firmware image, sign it with their private key and provide it to the FBI.
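To make the mechanism concrete, here's a minimal sketch of the check an updater performs before accepting an image. The file names and the RSA-PSS/SHA-256 scheme are illustrative assumptions, not Apple's actual update format.

```python
# Minimal sketch of signed-update verification; paths and the RSA-PSS/SHA-256
# scheme are illustrative assumptions, not any vendor's real format.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.serialization import load_pem_public_key

def update_is_trusted(image_path: str, sig_path: str, vendor_pubkey_path: str) -> bool:
    with open(vendor_pubkey_path, "rb") as f:
        pubkey = load_pem_public_key(f.read())
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        pubkey.verify(signature, image,
                      padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                                  salt_length=padding.PSS.MAX_LENGTH),
                      hashes.SHA256())
    except InvalidSignature:
        return False  # anyone without the vendor's private key fails here
    return True
```

Only someone holding the matching private key can produce a signature that passes this check, which is exactly why the FBI needs Apple's cooperation rather than just Apple's firmware.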
But what do we mean by "unauthorised firmware"? In this case, it's "not authorised by Apple" - Apple can sign whatever they want, and your iPhone will happily accept that update. As owner of the device, there's no way for you to reconfigure it such that it will accept your updates. And, perhaps worse, there's no way to reconfigure it such that it will reject Apple's.
I've previously written about how it's possible to reconfigure a subset of Android devices so that they trust your images and nobody else's. Any attempt to update the phone using the Google-provided image will fail - instead, they must be re-signed using the keys that were installed in the device. No matter what legal mechanisms were used against them, Google would be unable to produce a signed firmware image that could be installed on the device without your consent. The mechanism I proposed is complicated and annoying, but this could be integrated into the standard vendor update process such that you simply type a password to unlock a key for re-signing.
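A rough sketch of the owner-side half of that idea, under the same illustrative assumptions (hypothetical file names, RSA-PSS/SHA-256): the vendor's image only becomes installable once the owner unlocks their own key with a password and re-signs it.

```python
# Illustrative sketch of owner re-signing; file names and key format are assumptions.
import getpass
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.serialization import load_pem_private_key

def resign_vendor_image(image_path: str, user_key_path: str, out_sig_path: str) -> None:
    # The owner's private key stays with the owner; a password unlocks it for signing.
    password = getpass.getpass("Password to unlock signing key: ").encode()
    with open(user_key_path, "rb") as f:
        key = load_pem_private_key(f.read(), password=password)
    with open(image_path, "rb") as f:
        image = f.read()
    signature = key.sign(image,
                         padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                                     salt_length=padding.PSS.MAX_LENGTH),
                         hashes.SHA256())
    with open(out_sig_path, "wb") as f:
        f.write(signature)  # a device enrolled with the owner's public key checks this
```

Neither the vendor nor a court can produce that signature without the owner's key and password.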
Why's this important? Sure, in this case the government is attempting to obtain the contents of a phone that belonged to an actual terrorist. But not all cases governments bring will be as legitimate, and not all manufacturers are Apple. Governments will request that manufacturers build new firmware that allows them to monitor the behaviour of activists. They'll attempt to obtain signing keys and use them directly to build backdoors that let them obtain messages sent to journalists. They'll be able to reflash phones to plant evidence to discredit opposition politicians.
We can't rely on Apple to fight every case - if it becomes politically or financially expedient for them to do so, they may well change their policy. And we can't rely on the US government only seeking to obtain this kind of backdoor in clear-cut cases - there's a risk that these techniques will be used against innocent people. The only way for Apple (and all other phone manufacturers) to protect users is to allow users to remove Apple's validation keys and substitute their own. If Apple genuinely value user privacy over Apple's control of a device, it shouldn't be a difficult decision to make.
no subject
Date: 2016-02-22 10:47 pm (UTC)
Bottom line is, Tim Cook can say all the flowery stuff about freedom, making the world a better place and so on, but that's all a bunch of PR crap if his company can't act where it counts. It's those crucial moments that teach us who really acts for freedom (not just freedom for your company), and who is merely spouting feel-good bravado.
I've been reading your blog for quite a while, Matthew, and I've even given the address to my students so they can get a good view of the industry from an insider. One thing I always point out in class is that the tech sector is one of the biggest today, but its big companies are still pushed around by governments like a bunch of toddlers. You wouldn't see a company from a more established sector being bossed around by the government like that. I don't know, I thought maybe we had already matured enough as an industry for this kind of thing to no longer be commonplace. I recall when I had just started teaching in 2002 and there was that FBI "nudge" for anti-virus companies to allow backdoors in their products, all in the name of finding terrorists. And today, we're still being pushed around the same way.
no subject
Date: 2016-02-23 02:12 am (UTC)
Even if you're then not especially careful about which updates you accept, you'd still avoid any risk of an upgrade being applied while the phone was locked… no?
no subject
Date: 2016-02-23 07:25 am (UTC)
So no, Android isn't more secure here.
A valid thought experiment
Date: 2016-02-23 05:37 pm (UTC)
Looking at the explanation in your Nexus post, why would you want to trigger a signed upgrade from the bootloader (which preserves user data)?
If upgrades are only triggered from the OS, then it can require user authentication, which is the same UX you described in this post (the re-signing solution). In other words, why not require that the iPhone be unlocked before it will approve the installation of a new update?
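To illustrate the gate I mean (made-up names, not Apple's actual update path), something like:

```python
# Hypothetical illustration: an updater that refuses to act on a locked device.
class Device:
    def __init__(self, unlocked: bool):
        self.unlocked = unlocked

    def vendor_signature_valid(self, image: bytes) -> bool:
        return True  # stand-in for the real signature check

def maybe_install_update(device: Device, image: bytes) -> None:
    if not device.unlocked:
        # The owner must have entered their passcode; a seized, locked phone stays untouched.
        raise PermissionError("device is locked; refusing to stage an update")
    if not device.vendor_signature_valid(image):
        raise ValueError("image is not signed by a trusted key")
    print(f"staging {len(image)}-byte image for install on next reboot")

maybe_install_update(Device(unlocked=True), b"\x00" * 1024)
```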
So I think Apple could re-design their phone so that they couldn't comply with this court order. It would be galling and expensive to have to redo it, but I can't see that it would cause any other problem.
I agree with the point you're making in general - "restricted boot" of closed-source code is very concerning. I remain interested in the work you've done, showing that secure boot can still be compatible with Free software.
-- Alan Jenkins
Apple
Date: 2016-02-23 06:08 pm (UTC)
Indeed, we cannot - they would not even have fought this one if the FBI had been so kind as to ask them under seal. They don't care so much about their users' data as they care about their image.
See <http://www.nytimes.com/2016/02/19/technology/how-tim-cook-became-a-bulwark-for-digital-privacy.html?_r=0>:
> Apple had asked the F.B.I. to issue its application for the tool under seal. But the government made it public, prompting Mr. Cook to go into bunker mode to draft a response, according to people privy to the discussions, who spoke on condition of anonymity. The result was the letter that Mr. Cook signed on Tuesday, where he argued that it set a “dangerous precedent” for a company to be forced to build tools for the government that weaken security.
Firmware updates
Date: 2016-02-23 07:44 pm (UTC)
[…] device, i.e., wipe the old secret key. For a legitimate use case, the backup-update-restore cycle is inconvenient, but I doubt it needs to happen often.
Does Android suffer from the same problem
Date: 2016-02-24 03:07 am (UTC)
You mentioned that Android also signs firmware, but that you can load your own keys. But it wasn't clear whether Android (or at least the current Nexus phones) allows you to upgrade the firmware in its implementation of the "secure enclave", be it TrustZone, a TPM or whatever else they use. If it does, it seems just as broken as Apple's implementation.
I'm not sure why they allow upgrading of the secure firmware at all. We've had TPMs for over a decade now. I'm guessing most of them aren't firmware-upgradeable, but that hasn't mattered because they are in the end very well-defined and comparatively simple machines - perhaps simple enough that, with a bit of effort, their makers can be fairly sure they have no bugs, despite the "law" that no non-trivial software is free of bugs.
Also, I notice that Nuvoton advertises that their TPMs can support TPM 2.0 via a firmware upgrade. This smells like it could have the same problem, which would be pretty sad. I guess you could allow a firmware upgrade if doing it destroyed all the secrets it protected. I wonder if they do that?
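What I'm imagining (purely illustrative, not Nuvoton's documented behaviour) is something like:

```python
# Illustrative model: a secure element that only accepts new firmware after
# erasing every secret it was protecting.
class SecureElement:
    def __init__(self):
        self.secrets = {"disk_encryption_key": b"\x13" * 32}
        self.firmware_version = 1

    def apply_firmware_update(self, new_version: int) -> None:
        self.secrets.clear()                  # destroy protected material first...
        self.firmware_version = new_version   # ...then accept the new firmware

se = SecureElement()
se.apply_firmware_update(2)
assert not se.secrets and se.firmware_version == 2
```

That way a coerced or malicious firmware update can never be used to extract keys that were sealed by the old firmware.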
The difference with Android, signing, and freedom.
Date: 2016-03-01 02:42 pm (UTC)
> Any attempt to update the phone using the Google-provided image will fail - instead, they must be re-signed using the keys that were installed in the device.
And the article on the Nexus devices states:
> The OS is (nominally) secure and prevents anything from accessing the raw MTD
> devices. The bootloader will only allow the user to write to partitions if
> it's unlocked. The recovery image will only permit you to install images that
> are signed with a trusted key. In combination, these facts mean that it's
> impossible for an attacker to modify the OS image without unlocking the
> bootloader[1], and unlocking the bootloader wipes all your data. You'll
> probably notice that.
To summarize the iOS situation: the code is non-free, and the phone requires the OS to be signed, using a chain of trust.
Let's simplify things and assume we're dealing with tablets that don't have a modem. The underlying issue that interests me most is resolving the conflict between security and freedom in the context of code signing.
How the hardware works with (ARM) SOCs:
---------------------------------------
In that case, on iOS and Android devices, the boot chain usually works this way: the SOC (system on a chip) has a bootrom. A bootrom is just a fancy term for instructions burned inside the silicon that get executed first at boot. So as soon as the power is stabilized, these instructions are executed.
To mandate code signing, the manufacturer usually burns the hash of some keys inside the SOC fuses[2]. This is irreversible, and if done correctly there is no longer any way to make the bootrom load and run your own code; instead it will only run signed code. The process is well documented for Freescale's i.MX53[1]. With that, you can enforce signatures of what's loaded next.
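A rough Python model of that fuse-anchored check (illustrative only - the hash-of-key-in-fuses layout and RSA-PSS are assumptions, and real bootroms are vendor-specific mask ROM code):

```python
# Toy model of a fuse-anchored boot check; not any particular SOC's scheme.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.primitives.serialization import load_pem_public_key

def bootrom_accepts(fused_key_hash: bytes, pubkey_pem: bytes,
                    image: bytes, signature: bytes) -> bool:
    # 1. The public key shipped with the image must match the hash burned into the fuses.
    if hashlib.sha256(pubkey_pem).digest() != fused_key_hash:
        return False
    # 2. The next boot stage must carry a valid signature from that key.
    try:
        load_pem_public_key(pubkey_pem).verify(
            signature, image,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256())
    except InvalidSignature:
        return False
    return True
```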
Another feature commonly found on SOCs is the ability to boot from a variety of storage devices, such as NAND, SPI flash, SD/MMC, eMMC, and so on. They also often support loading code through various buses such as USB and serial. Which buses and storage devices are tried, and the order in which they are tried, is often configurable through resistors.
On Apple's iOS devices there isn't much variation: a chain of trust is implemented. On Android devices, even though most SOCs support what I described above, it varies among products and manufacturers.
On the Nexus devices the signature is enforced. This is what lets the vendor make sure the user partition is erased (and most probably some other stuff related to DRM too) when the user decides to unlock the bootloader. The vendor is also free to implement whatever it wishes in that bootloader. However, this doesn't give the user the freedom to replace the bootloader with the one she likes.
On some Android devices that signature is not enforced, but this is very rare on phones. I'm not aware of any Allwinner device that enforces the signatures. As for smartphones, the LG Optimus Black[3], the Goldelico GTA04 and the Openmokos don't enforce the signature.
I've left it open to interpretation who "the manufacturer" and "the vendor" are. In some cases it's not that clear: it may refer not to the device vendor/manufacturer but to the SOC's.
I care more about the freedom, for users, to run the code they wish than about having a device that implements a supposedly[4] secure chain of trust. Given the small number of phones that ship with such signature enforcement disabled, it would be great if we had a solution to propose to hardware manufacturers.
From an end user's perspective, code signing can be used to assure the integrity of a device's code, in order to prevent evil-maid attacks. Keeping secrets in hardware doesn't work against wealthy adversaries: they have very expensive labs and equipment that can be used to "patch" a chip, and cheaper attacks also succeed in many cases. Power glitching could probably result in arbitrary code execution if the attacker has physical access to the hardware.
Assuming we only want to detect the lack of integrity, and react properly to it when we can, in order to prevent evil-maid attacks, how do we go from there?
Denis (GNUtoo).