There has been a lot of activity these past few weeks concerning data encryption, law enforcement, and what smartphone manufacturers should and should not be including in their hardware and software with respect to encryption. The states of New York and California are both proposing laws that would prohibit smartphone manufacturers from selling phones in their states that cannot be decrypted by the manufacturer at the request of law enforcement. In other words, these two states would require that a "back door" be built into smartphones for law enforcement.
Before I get into why these laws are an extremely bad idea for the general public, I want to make sure everyone understands the basics of data encryption. Tom's Guide has a well-written short explanation of data encryption that I encourage you to take a few minutes to read. At the heart of the matter is both the hardware encryption being built into smartphones like the iPhone and the transmission encryption that is part of secure messaging services like Apple's iMessage and Apple's iCloud email (both of which use encryption to send and receive data). Both types of encryption require "keys" to encrypt and decrypt the data: a private key and a public key. Wikipedia has a write-up on how these two keys work with respect to data encryption. The public key is, well, something you give out publicly so that someone else can send you an encrypted message. Within Apple's Mail and iMessage applications, this public key is handled in the background without you even having to know these keys exist. The private key exists only on your personal device, and it is what iMessage or Mail needs to decrypt any incoming message so that you can read it. Again, this is all done by the applications on your device without you having to worry about it. That is, until lawmakers started trying to introduce legislation to change the way this works.
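To make the key-pair idea concrete, here is a toy sketch in Python using the classic textbook RSA numbers. The primes are tiny and this is purely illustrative; real systems like iMessage use far larger keys and vetted cryptographic libraries, not hand-rolled code like this.

```python
# Toy RSA key pair -- textbook-sized numbers, for illustration only.
p, q = 61, 53                # two (very small) secret primes
n = p * q                    # 3233: the public modulus
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent: (e, n) is the public key
d = pow(e, -1, phi)          # 2753: private exponent, (d, n) is the private key

def encrypt(message, public_key):
    """Anyone holding the public key can encrypt a message."""
    e, n = public_key
    return pow(message, e, n)

def decrypt(ciphertext, private_key):
    """Only the holder of the private key can decrypt it."""
    d, n = private_key
    return pow(ciphertext, d, n)

ciphertext = encrypt(65, (e, n))         # 2790: looks nothing like 65
assert decrypt(ciphertext, (d, n)) == 65 # private key recovers the message
```

The important point is the asymmetry: the public key (e, n) can be handed out freely, but only the private exponent d, which never leaves your device, can turn the ciphertext back into the original message.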
The private key I just described exists only on your personal device and nowhere else. That means a company like Apple, whose servers all of these encrypted messages pass through, cannot read any of them. Why not? Because Apple doesn't have the key. Without the private key there is no way to recover the information in the message (it just shows up as scrambled garbage). That is the whole premise behind encrypted communications...privacy and data security. Until lawmakers step in, that is. The laws being introduced in New York and California, if passed, would make it illegal for a company like Apple to sell smartphones in those states unless Apple had a way to decrypt, at the request of law enforcement, both the data on the device and the messages going through Apple's servers. That would mean Apple would have to store, and have access to, your private encryption keys (which right now exist only on your personal device).
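That "scrambled garbage" claim is easy to demonstrate with a toy textbook RSA example (again, illustrative numbers only, not a secure key size): decrypting with anything other than the matching private key simply does not recover the message.

```python
# Toy textbook RSA parameters -- illustration only, not a secure key size.
n = 3233          # public modulus (61 * 53)
e, d = 17, 2753   # matching public and private exponents

ciphertext = pow(65, e, n)             # encrypt the message 65 with the public key

assert pow(ciphertext, d, n) == 65     # the matching private key recovers it
assert pow(ciphertext, 1234, n) != 65  # any other key yields unrelated garbage
```

Without the one correct private exponent, the ciphertext stays meaningless, which is exactly why a law compelling Apple to keep a copy of that key is the only way to give law enforcement the access these bills demand.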
So this doesn't seem so bad, right? These lawmakers' hearts are in the right place, after all. They just want to be able to stop terrorists and child pornographers, and these encryption technologies are closing a loophole that law enforcement has been using for many years to catch these criminals. I'm all for catching criminals. But what about when criminals want to come after your data? Shouldn't we, as citizens, be allowed to protect our personal data? Think about all of the sensitive information on your smartphone right now. Most smart cybercriminals only need a few pieces of information from your device to start wreaking all kinds of havoc on your life, like stealing your identity. How many of you have received a new credit card in the mail because your card was "compromised" and required a new number? This happens to us at least two or three times a year, because either the card issuer or a vendor you used the card with was unable to protect your credit card information. Your credit card number is a single piece of digital information, and yet it seems like it is always getting compromised. If these lawmakers get their way, all of the information on your smartphone will soon be just as vulnerable as the credit cards you are constantly having to replace. These proposed laws would require Apple to store all of our private keys on their servers, so that if law enforcement wanted access to our phones as part of a criminal investigation, they could subpoena Apple, get your private key, and gain access to all encrypted communications going through Apple's servers and everything on your phone (if they have it in their possession). If you aren't a criminal and don't plan on breaking any laws anytime soon, why should this bother you?
Let's set aside, for now, the argument that you shouldn't trust law enforcement. To keep it simple, we will just assume that law enforcement will always use this "power" for good and never go after someone by mistake. The other problem with forcing Apple to store your private keys is that those keys become vulnerable. Apple must protect them from criminals. Think about it: lawmakers have now painted a big sign on Apple, letting criminals know that the keys to millions of digital lives all reside in one place. Think of it as the biggest and most important bank on the planet. Apple doesn't want that responsibility. It is first and foremost a hardware company and only gets into services to support the hardware it designs and sells. This law would require Apple and other companies to put a lot of effort and money into protecting everyone's private keys. Another way to look at this is the deadbolt on your front door. Imagine you went into Home Depot to buy a new deadbolt for your house, and printed on the package of every single lock you could buy was the following message:
"Be aware: by law we must build in the capability for law enforcement to open this lock. This lock is designed to work with a special key that only law enforcement has access to."
Would you feel good about using that lock to secure your house? Everything you own is protected by this lock you are about to purchase, and right there on the package it says that someone else already has a key to it.
Smartphones have become a staple of our society. They really do contain our digital lives, and as such they should be able to protect the data we store on them. These laws would open a hole in the security currently built into these devices, a hole that can and will be used by criminals to gain access to our data. The laws wouldn't make it illegal for you personally to take an iPhone and add your own encryption software to it, but let's be honest...the average person is not going to bother, or even have the technical know-how, to do so. What these laws are really doing is telling the average citizen that in order to make the lives of law enforcement a little easier, we have to give up our data security. That is a pretty tough pill to swallow. Rather than punishing criminals, law enforcement would be punishing the general population across the board. The people writing these state laws don't understand technology. If they did, they would realize that the criminals and terrorists they so badly want to catch are already using other methods to encrypt their communications. These state laws would really only hurt law-abiding citizens and do very little to deter criminals or aid in their arrest.
As I was finishing up this post, news broke that California Congressman Ted Lieu has drafted a federal bill that would prohibit states from outlawing smartphone data encryption on a state-by-state basis. This bill would essentially protect individuals who want to buy a smartphone with full data encryption built in (and, of course, protect the companies that sell these smartphones). This is just the beginning, but it is refreshing to see a bill, even in draft form, that is backed by common sense and a little bit of IT know-how (Lieu is one of four Congressmen with a computer science degree). More to come...