It’s not the technology that’s the problem, it’s protecting the secret
U.S. Attorney General William Barr recently added his voice to the chorus of governments, law enforcement agencies, and security agencies calling for—just shy of outright demanding—backdoors into devices and encrypted chat apps. The rationale is always the same: strong encryption protects bad people and allows them to continue doing bad things to the rest of us. If law enforcement and security agencies could just peek into conversations now and then—with warrants, of course—their job would be tremendously easier and all would be right with the world.
While allowing law enforcement into secure chat apps seems like a good thing on the surface—and technically there are many ways to achieve this goal—the end result would be to critically weaken the technologies we use to protect our banking, shopping, and online communications. Creating a backdoor isn’t the challenge; creating a backdoor that doesn’t weaken or jeopardize security is what’s impossible. Once a backdoor is put in—even with the best of intentions—it’s only a matter of time before it’s exploited by the very people it’s supposed to protect us from in the first place.
This is frightening. Trump's AG Barr is insisting all encryption must have backdoors. As we have seen w massive, highly destructive leaks from NSA, if there is a backdoor the bad guys will most likely get it too. Then, they'll have access to the entire global financial system. https://t.co/vsVtqBvQ5Q — James Fawcette (@TheFawcette) July 24, 2019
No one is arguing that criminals should be allowed to work unfettered by the law. In all the reading I’ve done over the years about this issue, from Apple refusing to unlock the San Bernardino gunman’s phone, to Australia’s law passed last year, to the most recent comments by A.G. Barr, no one in the technology community is arguing for the rights of criminals or terrorists. No one. What we are arguing for are the rights of the rest of us.
It’s not creating a door that’s hard, it’s keeping it closed that matters
This post isn’t going to debate which of the myriad “solutions” offered by law enforcement or government are better or worse. From hidden secret keys on devices to ghost protocols to encryption master keys—all of them are flawed from the outset. All options to give law enforcement access, to prevent criminals from “going dark”, hinge on a single flawed assumption: no one will find the backdoor and exploit it against the innocent. That’s it. That’s the bottom line. And that’s the problem. Once you put a backdoor in place, it’s only a matter of when, not if, the people who aren’t supposed to have access to the backdoor get it—or at least learn of its existence.
Cracking open the backdoor will, without question, take at least two forms: hackers trying to exploit it by finding weaknesses in the protections of the door itself, or someone on the inside selling or abusing it. Hackers already spend thousands and thousands of hours trying to break into systems that are supposed to be impenetrable. We have malware attacks on Windows and macOS. We have rogue apps on smartphones. We have continuous attacks on servers that hold our data. Exploiting a “secret backdoor” in apps or devices will jump to number one on their to-do lists.
Hackers are very successful at breaking into secure systems. Hacks like Equifax and Desjardins have devastating consequences for millions of people. And some hacks, like the Ashley Madison hack a few years ago or the recent Capital One hack, are inside jobs: someone who was trusted with the keys to the kingdom betrayed that trust. These are systems with people actively watching for and thwarting attacks from all sides, and they still get hacked.
Now imagine if there was a secret key that could unlock any iPhone on the planet. Unlock and see everything on the phone that is supposed to be encrypted, because we’ve been assured the data is protected. Even if this secret key is only for law enforcement, how long do you think it will take for the key’s existence to be discovered? Years? Months? Weeks? Days? Hours? If a key with this much power is out there, you can bet as soon as it becomes known, it will be targeted by exactly the people the key is supposed to protect us from.
Find that key, even if it works only a small percentage of the time, and you have a license to print (steal) money. A master key is too big a target, and too much money is at stake, for criminals to pass up. You can bet criminal gangs will pull out all the stops to find the key.
Once found and exploited, encryption backdoors will undermine the trust and faith people have in online banking and shopping. Even with hacks at banks and other sites, people still shop and bank online. People still trust the system because none of the hacks have broken the encryption that is the foundation of protecting our data. Keeping that encryption foundation solid is one of the most important reasons we use 521-bit ECC as the core of Sky ECC.
What gets broken into are the other systems—and often those systems weren’t being protected well enough in the first place. A development server stored passwords in plain text. A database was encrypted, but the decryption key was left unprotected on the same server (essentially locking your door from the inside, but leaving the key in the lock outside).
There hasn’t been a hack in the modern online era that successfully cracked the underlying encryption. Password crackers don’t break encryption; they just keep guessing until they get the right answer. We know what you get when you run “abcdefg” through any of the common password-hashing algorithms, so you just compare a list of leaked password hashes against lists of words and common passwords until things match up.
If you use randomly generated passwords, the password won’t be on the list to check against (especially if the password is long enough). If encryption were truly broken, it wouldn’t matter what you used as a password, it could be decrypted. This isn’t the case today.
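The “keep guessing” approach described above can be sketched in a few lines. This is a toy illustration, not how production systems store passwords (real systems add per-user salts and deliberately slow hashes like bcrypt or Argon2), but the principle is the same: the attacker never breaks the hash function, they just hash candidates and compare.

```python
import hashlib
import secrets

# Hypothetical attacker's wordlist of common passwords.
wordlist = ["123456", "password", "abcdefg", "letmein", "qwerty"]

def sha256_hex(password):
    """Hash a candidate password the same way the (toy) system did."""
    return hashlib.sha256(password.encode()).hexdigest()

def crack(leaked_hash, candidates):
    # No encryption is "broken" here: each candidate is hashed and
    # compared against the leaked value until one matches, or none do.
    for guess in candidates:
        if sha256_hex(guess) == leaked_hash:
            return guess
    return None

# A password that appears on the list falls immediately...
weak_hash = sha256_hex("abcdefg")
print(crack(weak_hash, wordlist))    # abcdefg

# ...but a long random password simply isn't on any list to check.
strong_hash = sha256_hex(secrets.token_urlsafe(24))
print(crack(strong_hash, wordlist))  # None
```

This is exactly why a long, randomly generated password resists these attacks: the guessing loop only ever finds values that appear on its candidate lists.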
What’s worse, though, is that by its nature the existence of a backdoor won’t become public knowledge—even after it’s been compromised. Technology companies would be sworn to secrecy about its existence. Law enforcement would deny its existence. Which means when the backdoor is found and exploited, the public won’t learn about the breach until long after the horse has left the barn.
Do you think anyone would admit their secret backdoor used to read messages and decrypt phones fell into the wrong hands, much less exists?
Not on your life.
What would happen is this: criminals would suddenly learn how to circumvent the backdoor and protect their communications again, most likely by developing their own apps, returning to “going dark”. At the same time, there would be an uptick in mysterious hacks and attacks that shouldn’t have been possible: passwords and information found and publicized that were demonstrably communicated securely. People would put two and two together, revealing not only the existence of the backdoor, but its abuse, leaking, or discovery by hackers.
Then we’re back to square one and hooped in the process.
What’s good for the goose is good for the gander
Interestingly, in all the law enforcement requests for backdoors into our communications tools, governments still want their own secrets protected. The argument of “we have secrets we need to protect…” doesn’t hold water when the secrets guarded by governments can be as atrocious as those hidden by criminals. Governments aren’t suggesting a citizen oversight committee hold a secret key to decrypt their information. That wouldn’t be secure. What if those keys “fell into the wrong hands”?
Governments can’t expect people to accept sacrificing privacy, freedom of speech, and security while staunchly defending their right to the same. Everyone needs private, secure communications tools. Everyone needs secure banking, secure shopping, secure…online access.
The good of the many outweighs the needs of the few—or the one.
In the end, the entire discussion comes down to the Vulcan saying: “The good of the many outweighs the needs of the few.” We know bad people do bad things with technology. Bank robbers use cars to escape the police, but there aren’t universal kill switches in cars. Our society, our commerce, and our connected lives depend on secure encryption free of backdoors. We believe the cost to everyone, to society as a whole, is too great to put backdoors into apps or devices—especially not into Sky ECC.
That’s the bottom line. It’s a line in the sand for us. We won’t put a backdoor into Sky ECC. When asked, and given all the proper legal paperwork, we comply with requests from law enforcement. That doesn’t mean we will (or even can) decrypt messages. Sky ECC is built to have only the barest amount of information stored on the servers.
We only have a list of ECC IDs—without any personally identifiable information—and who they have in their contact list. Messages aren’t kept. Metadata isn’t kept (it’s encrypted in transit too). Access logs aren’t kept. We believe your privacy and communications deserve the very best, the most private tool available. We believe that tool is Sky ECC.
There is no end in sight to the continuous pressure law enforcement and governments put on companies like Facebook, Apple, Google, and us to “just find a way for us to see what criminals and terrorists are saying”. And we will resist those efforts, not just because we believe there is no such thing as a safe backdoor but, more importantly, because we fundamentally believe everyone has a right to secure communication for privacy and security.
That’s our stand. No backdoors. Our privacy is too important to risk.