One of my favorite movie quotes is from Jurassic Park. While discussing the new advances in genetic technology, Ian Malcolm exclaimed:
Yeah, yeah, but your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should.
I never thought that line would apply to my own work, but someone posed nearly the same question to me this weekend.
I love learning new things. Lately I've been spending much of my free time studying cryptography. Numbers are fun to me, and being able to apply advanced math to computer science helps make my job that much more interesting.
Also, cryptography involves a great deal of data security. In a world still reeling from recent revelations about data security, understanding the implications of various cryptographic systems is huge! In the past few months, I've even brainstormed a few interesting applications of my own.
One project in particular deals with anonymous, encrypted data storage in the cloud. It would allow you as a consumer to put whatever you want on a remote server, anonymously, and with complete security. No one would know what data you'd pushed up. No one would even know who you are. It's the kind of world I dream about.
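To make the model concrete, here is a minimal sketch of the idea, not the actual system: the client encrypts everything locally, and the server stores only an opaque blob under a random identifier, with no accounts and no metadata. The one-time pad here is purely illustrative (a real system would use an authenticated cipher such as AES-GCM); the function and variable names are my own hypothetical ones.

```python
import os
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad for illustration only: XOR the plaintext with a
    random key of equal length. The key never leaves the client."""
    key = os.urandom(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR again with the same key to recover the plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The "server" is a dumb blob store: no logins, no names, no metadata.
server = {}

def upload(ciphertext: bytes) -> str:
    blob_id = secrets.token_hex(16)  # random ID reveals nothing about the user
    server[blob_id] = ciphertext
    return blob_id

# The client keeps the key and blob ID; the server holds only opaque bytes.
key, ct = encrypt(b"mission-critical business data")
blob_id = upload(ct)
assert decrypt(key, server[blob_id]) == b"mission-critical business data"
```

The point of the sketch is what the server *cannot* do: it can neither read the data nor tie the blob to a person, which is exactly the property the rest of this post wrestles with.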
But the question remains: though I can build this system, should I?
An Ethical Quandary
As I explained the inner workings of this system to a friend, they immediately jumped past my intended use, secure storage of mission-critical business data, to media piracy. Would this system allow video pirates to store whatever data they want in the cloud, securely and anonymously, and to share that data with anyone they wish?
Our discussion moved quickly beyond movie pirates to even more dangerous criminals. Protected by the same veil of anonymity as the average consumer, anyone could store anything without fear of repercussions. Stolen movies. Stolen credit card data. Espionage.
When I first devised this idea, I was thrilled that its potential applications were seemingly limitless. Now, I'm terrified.
To Release, or Not ...
Where do we, as software developers, draw the line? Do we build software with the common good in mind and ignore the potential consequences of releasing such tools to the world? Do we hold back potentially groundbreaking achievements because they could fall into the wrong hands?
I don't even know how to begin to answer that question.
One potential comparison is the US legal system: an adversarial system built on the assumption that people are, basically, good. No innocent man or woman should ever be treated as guilty without just cause and overwhelming evidence. We'd rather a hundred guilty people go free than see a single innocent person locked behind bars.
Many would argue, though, that this is a broken system.
Where would you, personally, draw the line? To which ideal do you owe allegiance? That advancements for the greater good should always be released? Or that advancements with potentially devastating consequences should be withheld?
Personally, I'm on the fence and I'm hoping someone much smarter than me can offer a compelling argument one way or the other.