As a developer of open-source technology for health, I’m pretty hardline about personal privacy on mobile devices and on the Internet. Privacy ought to be as absolute as the law is. For example, if someone broke into my therapist’s office, pried open a locked file cabinet, and took photos of his case notes on me, I’d be pretty angry, and the person who did it would be pretty jailable. We all know such a theft could theoretically happen, but it doesn’t happen often enough to be a major factor keeping people who want outside help from getting it. A similar arrangement ought to exist in the digital world. But the reality is that we can’t yet be sure the digital world is comparably secure. In fact, in Beverly Hills this week, an entire hospital information system was held for ransom. For the most part, we’re not nearly as secure, and I want us to be really, actually secure.
So honestly, I was surprised at Apple CEO Tim Cook’s customer letter this week feebly protesting a Federal judge’s direction to create a version of iOS that would enable the FBI to unlock Syed Farook’s phone. His grounds for protest are that once Apple does so, the proverbial cat is out of the bag. The argument left me deeply unimpressed, and thinking this: we don’t get to real security through technical obfuscation and public relations campaigns. We get to real security through good engineering and through acting in good faith in the public interest. Apple definitely has the former (good engineering), but Tim Cook’s letter didn’t move me on the latter (good faith). Cook’s use of the word “backdoor” is an obfuscation, and not just in his letter: his PR team has used it throughout Apple’s privacy policy. But first…
Shut the front door!
For years, Apple has had a wide-open front door through which to exploit our data, and it has used that door to advance its corporate interests. A chunk of its business model, since at least the day the iTunes Store went online, has been leveraging customer data to monetize attention and increase sales revenue. This use of data is built in plain sight into the iOS terms of service. An iPhone 5C like Syed Farook’s ran iOS 7 by default, and buried in its 9-page terms of use are the following:
4b) By using any location-based services on your iOS Device, you agree and consent to Apple's and its partners', licensees' and third party developers’ transmission, collection, maintenance, processing and use of your location data and queries to provide and improve such products and services.
and
(i) Interest-Based Advertising from iAd. Apple may provide mobile, interest-based advertising to you. If you do not want to receive relevant ads on your iOS Device, you can opt out by going to the Limit Ad Tracking setting on your iOS Device. If you opt out, you will continue to receive the same number of mobile ads, but they may be less relevant because they will not be based on your interests. You may still see ads related to the content on a web page or in an application or based on other non-personal information.
Because Apple runs such a closed system, you either agree or you go elsewhere; there is no open-source, or even alternative commercial, operating system for its hardware. Which means the front door is wide open: Apple watches where you are, who you are, and what you like in order to help itself and its partners make money off you, and it has already acknowledged handing such information over to the authorities in Farook’s case.
About that “backdoor”
The difference between software and buildings is that in software there is no front or back, no street-facing or rear part of the “building.” At the bit level, the notion of a “backdoor” is an analogy, not a reality. There is simply:
A) what a user can do from the user interface (the touchscreen, cameras, and microphone on an iOS device);
B) what any engineer can do from all the other external electronic interfaces (Bluetooth, Wi-Fi, and wired connections); and
C) what any engineer with access can do from all the internal interfaces (the wiring between the hardware components).
The reality is that Apple employs the engineers who have designed, built, refined, tested, and modified the circuitry that moves all the data around the internal system. Until quite recently, this internal circuitry was fairly insecure by default, and Apple, especially after Edward Snowden’s revelations, made some major changes in an attempt to lock things down better. For the ordinary end user, those changes have worked. But Apple didn’t remove a backdoor so much as patch up gaping holes in its own security design.
Translating the legal order into technical code is naturally a bit complex, but it can be understood simply. Conceptually, there are just three places in the iOS startup sequence that would need to be tweaked to meet the government’s requirement: a) remove the auto-erase function; b) allow passcodes to be entered by some means other than the screen; and c) remove the delay between retries, so that d) traditional trial-and-error password cracking can be carried out. A good writeup of the technical requirements of the order was just posted on the BBC’s site. The idea is presented here as pseudocode (code lines preceded by |, comment lines by //); the actual code is locked away in Apple’s iOS source code repository, but it is by no means conceptually or technically difficult to implement.
| getUserInput(maxAllowedUnlockRetryAttempts);
// functionally, this sets the allowed number of unlock retries, which defaulted to INFINITE prior to iOS 7
// the current writ orders the limit reset to INFINITE
| unlockAttempts = 0;
| do {allowUnlockRetry(viaUSB); unlockAttempts++; wait();}
| while (unlockAttempts < maxAllowedUnlockRetryAttempts)
// functionally, this repeats unlock attempts up to the allowed number of retries
// the current writ orders removal of the wait between retries; unlock attempts via USB occur trivially here
| if (unlockAttempts == maxAllowedUnlockRetryAttempts) {scrubDeviceData();}
// functionally, this initiates a device data scrub if the maximum number of retries is reached
// the current writ orders removal of the scrub routine, which occurs trivially here
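To make the payoff concrete, here is a minimal sketch of the trial-and-error loop the FBI could run once those three changes are in place, written in Swift rather than pseudocode but shown with the same | and // conventions. Everything in it is hypothetical: submitPasscode stands in for whatever interface the modified build would expose over the wired connection, and none of it is Apple’s actual code.
// hypothetical sketch: brute-forcing a 4-digit passcode once the modified build
// removes the retry delay and the erase-after-maximum-failures routine
| import Foundation
| func bruteForcePasscode(submitPasscode: (String) -> Bool) -> String? {
|     for candidate in 0...9999 {
|         let passcode = String(format: "%04d", candidate)  // "0000" through "9999"
|         if submitPasscode(passcode) { return passcode }   // found it
|     }
|     return nil  // not a 4-digit passcode
| }
With no artificial delay and no auto-erase, each attempt costs only the hardware’s key-derivation time, so walking the entire four-digit space is a matter of minutes, not years.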
Apple could do what the FBI is asking in a day, maybe even in minutes, if it got the right group of people together in a room or on FaceTime. I have to wonder what would happen to Cook’s staunch stance against backdoors if there were word of a credible threat of a major attack tomorrow at a school or office building in Cupertino, where Apple’s Silicon Valley headquarters is located.
1789: All Writs Act & The 4th Amendment
Hopefully we are all fans of the Fourth Amendment to the US Constitution too, which prohibits “unreasonable searches and seizures.” But I am not a fan of hypocrisy. If we want highly private uses of mobile technology to be something people can rely on, we need to move forward through conflicts like this, not wring our hands about creating backdoors that already all but exist. Apple’s move to refuse the writ of “reasonable technical assistance” (for law-student posterity, let’s call it a writ of technici auxilium aequum) is more a calculation about improving market cap and averting bad PR.
Notably, it took another 1789 law, the All Writs Act, part of the Judiciary Act of 1789, to compel Apple to comply, and the reason is simple. The point of the All Writs Act is that courts, when needed, can specify exactly what must be done to satisfy and fulfill the law, because every scenario cannot be foreseen a priori. In this case, the Framers who wrote the 4th Amendment that same year couldn’t possibly have foreseen that a company would dubiously claim to make a device so secure it could not be defeated, such that a judge would have to direct that company to dismantle its own obvious, basic security features so a reasonable search could be conducted. For exactly this reason, courts hold a very broad and general power to fulfill the law of the land.
To wit, Apple well knows there is such a thing as reasonable search and seizure. And it’s my view that if Apple wants to use US currency, US-funded basic research, and US territory to develop and purvey its products, it should find a non-hypocritical response to the judge’s order. If I were related to any of the 14 people dead at Farook’s hands, I’d be hopping mad. As it is, I’m plenty put off just by the complete lie that is the technical argument in Cook’s letter:
"the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession. The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."
This is just malarkey.
I don't see lots of Apple iOS source code dumps wandering around the internet; Apple's iOS code is notoriously tightly controlled already, even within Apple. Should it see fit, Apple could work with NIST and the NSA to create another level of human and technical controls and protocols that preserve chain of custody and keep this particular tweak and compilation of iOS as restricted as nuclear codes, available to its own in-house staff only in special cases like this one. Apple is simply afraid that if it complies, every judge and police department in the world will come asking, and they likely will. That would mean setting up a forensic service that keeps this capability under lock and key, one Apple could charge for just like any other forensic lab providing specialized analysis useful in investigations. It could secure such facilities appropriately, with physical devices sent in, unlocked, and the data made available on site. It could probably run one such facility on each continent and make governments pay for the level of heightened security required. In this case, the government has already been directed to compensate Apple for its technical assistance.
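As one illustration of such a technical control, here is another minimal sketch in Swift, again with the | and // conventions and again under hypothetical names: the court-ordered build could be bound to the single device named in the writ by checking the phone’s unique chip identifier (the ECID Apple already uses when personalizing firmware signatures) before the relaxed unlock path is ever enabled.
// hypothetical sketch, not Apple’s design: the special build refuses to enable
// the relaxed unlock path unless the device’s unique chip ID matches the one ID
// compiled into this particular signed image
| let authorizedECID: UInt64 = 0x000DEADBEEF0  // placeholder for the device named in the writ
| func relaxedUnlockPathEnabled(currentDeviceECID: UInt64) -> Bool {
|     // any other device, even one flashed with a leaked copy of this image,
|     // fails the check and behaves like stock iOS
|     return currentDeviceECID == authorizedECID
| }
Human controls over who may request, sign, and run such a build would sit on top of that, but the point stands: “no way to guarantee such control” is a policy choice, not a technical impossibility.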
The reality is that secure, instant point-to-point communication anywhere on the planet is something of a superpower, and it shouldn’t be left utterly uncontrolled for individuals to run amok against society, as Farook and his wife did. Especially when publicly funded infrastructure and resources are 99.999% of what makes it possible, the companies profiteering off the tools of such powerful communication should be responsible and forthright about security engineering rather than flout the law. We need to know what is really possible when those powers are abused by individuals, and that potential needs to be reined in so the powers are used only when truly necessary and appropriate. It’s a balance. We’re going to see a lot more writs of technici auxilium aequum, and so long as they’re handled forthrightly and honestly, that’s a good thing.