Earlier this year, I did some brain-picking about a hypothetical new encrypted data storage system I wanted to prototype. My issue wasn't so much whether I could build it; the question was whether I should.

While I was describing the project to a friend, we quickly moved beyond my potential use cases to the various uses others might find for my new system:

Our discussion moved quickly beyond movie pirates to even more dangerous criminals. Protected by the same veil of anonymity as the average consumer, anyone could store anything without fear of repercussions. Stolen movies. Stolen credit card data. Espionage.[ref]Ethics in Software Development[/ref]

Ultimately, I held off on building my application. Thanks to some great discussion from the savvy team at AgileBits, I solved my personal ethical dilemma. I have a use case in mind, and that's the use case for which I'm building, optimizing, and testing. If someone takes my software and uses it for other purposes, that's entirely on them.

iPhones

With the latest version of iOS, Apple made waves by announcing that individual users could encrypt more data than before on their devices. More importantly, Apple could no longer unlock a phone for third parties without the user's consent and unlock key.

Apple was so bold as to explain how this impacted law enforcement. Officers could no longer ask Apple to unlock a device or hand over its data - even with a warrant - if the user was unable or unwilling to provide an unlock key.
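The mechanics behind that claim are worth a quick sketch. When the encryption key is derived from a secret only the user holds, there is simply no master key for Apple to surrender. The snippet below is a minimal illustration of the idea, not Apple's actual implementation - the function and label names are hypothetical, and real iOS key derivation also entangles the passcode with a hardware-bound device key inside the Secure Enclave, which is exactly why the unlock can't happen off the device.

```swift
import CryptoKit
import Foundation

// Minimal sketch of passcode-derived encryption. Names here are
// illustrative; real iOS uses hardware-entangled, per-file keys.
func sealData(_ plaintext: Data, passcode: String, deviceSalt: Data) throws -> Data {
    // Derive a 256-bit key from the user's passcode and a per-device salt.
    // If the passcode is never disclosed, this key cannot be reconstructed.
    let key = HKDF<SHA256>.deriveKey(
        inputKeyMaterial: SymmetricKey(data: Data(passcode.utf8)),
        salt: deviceSalt,
        info: Data("file-encryption".utf8),
        outputByteCount: 32
    )
    // Authenticated encryption: without `key`, the ciphertext is opaque
    // to everyone - Apple included.
    let sealedBox = try AES.GCM.seal(plaintext, using: key)
    return sealedBox.combined!
}
```

A production design would reach for a deliberately slow derivation function like PBKDF2 rather than HKDF to resist brute-force guessing, but the point stands either way: the secret lives with the user, not the vendor.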

James Comey, the director of the FBI, has made it very clear that he believes this change negatively impacts law enforcement:

"The notion that people have devices... that with court orders, based on a showing of probable cause in a case involving kidnapping or child exploitation or terrorism, we could never open that phone? My sense is that we've gone too far when we've gone there."[ref]FBI director: iPhones shields pedophiles from cops[/ref]

Comey is taking the same stance my friend took: assuming nefarious intent on the part of users from the outset rather than focusing on the use cases and customers for whom a product is built in the first place. Face the facts: city parks can be used for drug trafficking. Cars can be catalysts for crime sprees. Craigslist can be used to orchestrate murder.

This doesn't mean we close our parks. We keep selling cars. For some reason, people still buy and sell on Craigslist.

Will some use their iPhones while committing illegal activity? Yes. Does this mean we should give law enforcement a backdoor into every single phone sold in order to curtail that activity? Absolutely not.

Two Reasons

First and foremost, citizens in America are to be treated as innocent until proven guilty in a court of law. In order to search a person or their property, law enforcement must appeal to the courts for a warrant specifying the places to be searched and the persons or things to be seized.

Apple's change in encryption policy doesn't impede this. In fact, it just ensures that representatives of the law actually follow the law while gathering evidence. All of those pesky rules we put in place to prevent cops from knocking down random doors, seizing random cellphones, or eavesdropping on calls and emails to friends are there for a reason - to protect us from unreasonable search. Do they get in the way of police work? Not if the police are any good at their jobs.

Second - and the aforementioned article quoting Comey hit the nail on the head here - opening phones (or any devices) to a backdoor for police exposes additional avenues for hacks and exploits.

"You can't have it both ways," said David Oscar Markus, a Miami defense attorney with expertise in police searches and seizures. "If there's a backdoor, it can be exploited."

You have a right to conduct business online. You have a right to consume and publish media online or locally on your own devices. You also have a right to control who has access to this information. All Apple has done is stand up, say they agree with these statements, and take steps to ensure those rights are grounded in reality.

Can this new policy be abused? Absolutely. Should its potential for abuse keep it out of the market? No.