Apple has every right to challenge a federal magistrate’s order to help the FBI unlock a terrorist’s cellphone. This is a legal issue that needs to be resolved, and it is a discussion the nation needs to have.
However, the underlying principle is this: No storage device, whether it’s an old-fashioned filing cabinet or the most advanced iPhone, can be immune to a court-approved search. If Apple can provide a “key” to unlock the phone used by a terrorist, Apple should be compelled to do so.
Syed Farook and his wife, Tashfeen Malik, killed 14 people and wounded 22 in the December 2015 attack in San Bernardino, Calif. The FBI has the iPhone supplied to Farook by his employer, San Bernardino County.
Farook’s phone is password protected. The FBI wants Apple to help disable a function that erases the phone’s data after multiple incorrect attempts to enter that password. Then the FBI could use other software to enter passwords until it guesses the correct one.
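For readers wondering why that one erase setting matters so much: a four-digit numeric passcode has only 10,000 possibilities, so once the erase limit is removed, exhaustive guessing becomes trivial. The sketch below illustrates the idea in Python. It is a simulation only; `try_passcode` is a hypothetical stand-in for whatever mechanism submits a guess to a device, not any real Apple interface.

```python
# Illustrative sketch: why disabling auto-erase enables brute force.
# `try_passcode` is a hypothetical callback, not a real Apple API.
from itertools import product

def brute_force_4_digit(try_passcode):
    """Enumerate all 10,000 four-digit passcodes in order.

    With an erase-after-10-failures setting enabled, this loop would
    destroy the data long before finishing. With it disabled, the
    whole space can be searched in at most 10,000 attempts.
    """
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if try_passcode(guess):  # hypothetical: returns True on success
            return guess
    return None

if __name__ == "__main__":
    # Simulated device whose (made-up) passcode is "7403".
    secret = "7403"
    print(brute_force_4_digit(lambda g: g == secret))  # prints 7403
```

Real iPhones also impose growing delays between failed attempts, which is why the FBI sought Apple’s help rather than simply guessing; the point of the sketch is only the arithmetic of a small passcode space.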
Apple says helping the FBI would be an undue burden. Apple further argues that forcing it to write code to unlock the phone compels speech and violates the First Amendment. While code can be deemed speech, this seems more like a high-tech version of requiring Apple to construct a key.
Apple also argues that Congress, rather than the courts, should decide the issue. Congress must get involved, but this case is already in court, and waiting for legislation could take so long that information on the phone becomes useless.
We understand that Apple customers need to trust that the company’s phones are secure. But information on the phone might lead to other suspects or to people planning future attacks. That must be given great weight in balancing the needs of law enforcement against the need to provide reasonable security to iPhone users.
That does not mean Apple should be compelled to unleash a tool that would allow hackers to plunder its customers’ phones, which hold an increasing amount of personal data. The goal is to protect against hackers without depriving law enforcement of evidence that could save lives.
The answer is for the courts — and, increasingly, Congress — to define the scope of cooperation that companies like Apple, Google and Microsoft must provide. Tech companies should be allowed to comply in the narrowest way possible; they should not be compelled to build a “back door” into every device. Apple claims that is what the government is asking it to do in the terrorism case; the feds say that’s not true.
History shows that the potential for government abuse is serious and real. That means oversight and penalties must be serious and real — and they haven’t always been. No wonder civil libertarians are worried about a precedent that would give the government sweeping powers to unlock personal data.
On the other hand, law enforcement officials are rightly just as concerned about a precedent that would preclude the government from gaining access to digital evidence.
This is not just an issue in terrorist cases. Many law enforcement agencies want easier access to digital devices. This is another area for courts and legislative bodies to explore. If companies must provide help in terrorism cases, what about cases of murder or child pornography? What about fraud or stalking or burglary? The questions are extensive and complicated.
The FBI and the county that technically owns the iPhone 5C reportedly made some mistakes. The county, perhaps at the FBI’s request, reset the phone’s iCloud password, unaware that doing so would block access to some data through an automatic backup. And the county had paid for — but failed to install — mobile-device-management software that would have preserved access to the phones it supplied to employees.
Those mistakes should serve as important lessons for jurisdictions across the country, so they can preserve access to data on corporate or publicly owned phones.
But the mistakes do not change the fundamental equation. Digital devices must be subject to lawful search.
Editorial, Sun Sentinel (Tribune Content Agency)