By now, you may have heard that the FBI is asking Apple to help get into the iPhone of one of the San Bernardino shooters. WIRED has a good summary of the situation.
Some thoughts after spending a few days thinking, reading, and arguing the issues involved:
* The government forcing Apple to write special software so that they can brute force the phone's security would be an unusual, possibly unprecedented, power grab. Companies have been compelled to turn over tools and information they possess, but as far as I know they've never been required to actually write new software to support a government investigation or request.
* As WIRED notes, the encryption involved is itself not practically breakable, and Apple has no capability to remotely unlock it, so eliminating the limitations on passcode attempts is the only option left. (The quick math after this list shows why those limitations are doing all the work for a short passcode.)
* I do not agree with Dan Guido (cited in the WIRED piece) that what is being asked is "reasonable." It is likely _doable_, but that isn't the same thing.
* Should the government get its way on this, it's hard to imagine there won't be future efforts to force companies to undermine their devices' security measures. If that happens, what's to stop the eventual outlawing of effective encryption altogether?
* As has been said many other places: a backdoor for the FBI is a backdoor for everyone. No system can be secure if it has a backdoor built into it, no matter how "limited" it is, and no matter how many assurances are given that it will only be used in "exceptional circumstances."
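Why do those passcode-attempt limits matter so much? Apple's iOS Security Guide describes the passcode-to-key derivation as deliberately slow, on the order of 80 ms per guess, so the escalating delays and the optional ten-attempt wipe are the only things standing between a short numeric PIN and a quick brute force. Here's a back-of-the-envelope sketch (the ~80 ms figure comes from Apple's published documentation; the passcode formats are illustrative assumptions, not details from the court filings):

```python
# Rough brute-force math for iPhone passcodes once retry limits are removed.
# Assumption: ~80 ms per guess, the hardware-entangled key-derivation time
# Apple cites in its iOS Security Guide. Passcode formats are illustrative.

ATTEMPT_TIME_S = 0.080  # ~80 ms per passcode attempt

keyspaces = {
    "4-digit PIN": 10**4,
    "6-digit PIN": 10**6,
    "6-char alphanumeric": 36**6,  # lowercase letters + digits
}

for name, size in keyspaces.items():
    worst_case = size * ATTEMPT_TIME_S  # seconds to try every combination
    print(f"{name:>20}: {size:>13,} guesses, "
          f"worst case {worst_case / 3600:,.1f} h ({worst_case / 86400:,.1f} days)")
```

A four-digit PIN falls in under fifteen minutes and a six-digit PIN in under a day; only a long alphanumeric passcode holds up on its own, which is exactly why the attempt limits are the security measure the FBI wants removed.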
It must be noted that, after previous incidents in which it was compelled to unlock iPhones as part of criminal investigations, Apple stepped up its security specifically to prevent that kind of compelled access. Now some of those newer security measures are impeding an FBI investigation, and so the FBI is demanding help accessing the device anyway.
Even if Apple eventually manages to cut off every channel through which the government could force the company to compromise a phone, if this order stands, Apple (and any other company) may simply be forced to write tools to circumvent the very measures they designed to avoid circumvention. If the government concludes that everyday users have easy, unfettered access to strong security that can’t be effectively hacked or backdoored, legislation may follow that compels device manufacturers (and software developers) to intentionally leave backdoors in their devices and programs in order to comply with government requests.

Obviously, if this state of affairs is forced on American companies operating in the US, it’s not a stretch to see it applied to American companies operating overseas as well, particularly under oppressive regimes that may have far more nefarious and abusive intentions with regard to access to their citizens’ data. By no means is the US government innocent in this regard, but thus far it has been unable (though not unwilling) to unilaterally abridge the information security of all citizens’ electronic devices.
This will certainly be a case to watch. I can’t say I’m much of a fan of Apple, but they are standing up for the right principle here. There is an ongoing battle between governments that want limitless access to their citizens’ digital lives, and individuals and companies who consider such access a violation of civil liberties. While I agree that the FBI is entitled to the data on the phone in question, Apple should not be obligated, legally or otherwise, to compromise the phone’s security in order to fulfill that entitlement. The end result would be to compromise many more phones that have never been associated with criminal acts, and there would be no accountability once a backdoor existed: fraudsters could use it just as readily as law enforcement. It would be sacrificing the security of many to satisfy the investigative demands of a single case. That math just doesn’t add up.