An In-depth Look At Why We Believe Your Privacy Matters

SKY ECC is designed to protect your privacy and security

You know the line. You’ve heard it countless times: if you have nothing to hide, you have nothing to worry about. And we all know that as soon as someone makes that statement, it’s time to start worrying.

That statement, and all its variations, is always, always, the excuse: for increased surveillance, for why the government should have a backdoor into encryption algorithms, for why our privacy needs to be violated for the “greater good”, for why handing over your social media handles at border crossings will protect us from terrorists and criminals.

And we know it’s bullshit.

Today more than ever we need to work to protect our privacy. As more apps are shown to leak and betray privacy—even unintentionally—we need solutions to both connect us and protect our privacy. At Sky Global, we believe everyone has a fundamental human right to privacy. We believe people should be able to communicate and connect with colleagues and contacts privately. This is why Sky Global was founded. It’s why SKY ECC was created.

We’ve come to accept ever more invasive encroachments on our privacy, and sometimes we have no one to blame but ourselves. But it doesn’t have to be this way. We don’t have to accept eroding privacy rights. We can use technology to both protect ourselves and help change things for the better.

The death of privacy by a thousand likes

Some of the blame for the state of online privacy falls squarely on us. Social media changed how we see and interact with the world. There has been a lot of good from the social web, but a lot of bad too. As we started to share more and more, we became numb to the fact that everything we share becomes public, forever. We forget how much information we give away. We forget that all those quizzes we take online give advertisers reams of information about us. We share memes listing places we’ve lived, countries we’ve visited, foods we like and don’t like, giving up the telltale bits of information that, ironically, banking sites use to confirm our identity. We unwittingly help phishers and scammers craft better and better ways to hack and steal from us.

There was a joke going around a few years ago that if the CIA and FBI wanted people to voluntarily give up all sorts of private information (sexual orientation, political beliefs, friends, colleagues, hobbies), Facebook would be the perfect tool. It’s a joke, but there’s truth in it: that’s exactly what we’ve done. We’ve made it very, very easy for anyone, from marketers to police to governments, to build a very accurate picture of us from our social media footprint alone. It’s possible to determine someone’s sexual orientation from their public social media information, even when they never state it, explicitly or implicitly.

As the EFF puts it in its amicus brief challenging the requirement that U.S. visa applicants disclose their social media profiles:

Social media profiles paint alarmingly detailed pictures of their users’ personal lives. By monitoring applicants’ social media profiles, the government can obtain information that it otherwise would not have access to through the visa application process. For example, visa applicants are not required to disclose their political views. However, applicants might choose to post their beliefs on their social media profiles. Those seeking to conceal such information might still be exposed by comments and tags made by other users. And due to the complex interactions of social media networks, studies have shown that personal information about users such as sexual orientation can reliably be inferred even when the user doesn’t expressly share that information. Although consular officers might be instructed to ignore this information, it is not unreasonable to fear that it might influence their decisions anyway. EFF to Court: Social Media Users Have Privacy and Free Speech Interests in Their Public Information | Electronic Frontier Foundation

If our public social media profiles paint that accurate a picture of us, imagine what our private messages and documents would say about us.

Technology doesn’t always get it right

Stemming from the worldwide protests against racism and police brutality, facial recognition has become a hot topic. There is a real fear that FRT (face recognition technology) will be used to find and silence critics of the police and authorities. As public concern grew, companies got out of the sector entirely, trying to stay on the right side of history and to avoid backing a lame technological horse. There are so many issues with FRT, racial bias being one of the most troubling, that right now is not a good time to help police or other agencies identify us from photo and video surveillance.

We started to see a backlash against facial recognition technologies during the democracy protests in Hong Kong, but more recently the case of a Detroit man wrongly arrested because facial recognition identified him as the suspect in a robbery has galvanized people against FRT:

The American Civil Liberties Union filed the complaint ( PDF ) Wednesday on behalf of Robert Williams, a Michigan man who was arrested in January based on a false positive generated by facial recognition software. “At every step, DPD’s conduct has been improper,” the complaint alleges. “It unthinkingly relied on flawed and racist facial recognition technology without taking reasonable measures to verify the information being provided” as part of a “shoddy and incomplete investigation.” Police arrested wrong man based on facial recognition fail, ACLU says | Ars Technica

Issues like this, and the problems with bias in face recognition against people of color, led many cities to ban the technology completely:

Boston will become the tenth city in the United States to ban government use of face recognition technology. Last year, the state of California passed a three-year moratorium on the use of FRT on police body-worn and hand-held cameras. Victory! Boston Bans Government Use of Face Surveillance | Electronic Frontier Foundation

FRT is just one of many great-in-theory-frightening-in-practice technologies we need to be wary of. Technology is often an imperfect solution to complex problems, but there are technologies that are essential to maintaining our privacy and security, and chief among them is strong encryption.

Strong encryption, the protection you use all day, every day, on every device you own, is the backbone of the internet age. Without strong encryption, online banking, online shopping, cloud storage, even remote working, would not be safe. If data isn’t encrypted it can be stolen. Period. Full stop.

And protecting strong encryption is at the heart of protecting your privacy and security online.

Strong encryption is essential

Every time you shop online or through an app, strong encryption is protecting you. Every padlock you see in your browser’s address bar is strong encryption at work. Every private message you send through any of the free end-to-end encrypted (E2EE) chat apps is backed by strong encryption. Strong encryption makes sure the information you send to someone can’t be read by anyone else. Your credit card information goes securely to Amazon, and strong encryption protects it on its way to the credit card processors. When you apply for a bank loan, pay a bill, or deposit a cheque, strong encryption makes sure your banking information doesn’t fall into the wrong hands. Many people only think of strong encryption in connection with chat apps, but that’s just a small fraction of what it’s used for. Strong encryption is fundamental to how we use the internet.
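The core idea behind all of these uses is the same, and it can be seen in a toy sketch. This is a one-time pad, not the TLS or AES machinery real systems use, and the message and key in it are invented for illustration: the ciphertext is unreadable without the key, and trivially readable with it.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the matching key byte;
    # the same operation both encrypts and decrypts
    return bytes(d ^ k for d, k in zip(data, key))

message = b"pay invoice 42 from account A"
key = secrets.token_bytes(len(message))  # random key, used exactly once

ciphertext = xor_cipher(message, key)
print(ciphertext)                        # gibberish without the key
assert xor_cipher(ciphertext, key) == message  # key holder recovers it
```

Real protocols replace the one-time pad with ciphers like AES and solve the much harder problem of getting the key to the right person and no one else, but the promise is the same: capture the ciphertext all you like, without the key it tells you nothing.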

Strong encryption is encryption that can’t practically be broken by brute force. You can’t load a string of encrypted text into a computer and have it spit out the real message—at least not for a few millennia. Strong encryption isn’t about keeping people from capturing the information you send; it’s about keeping them from reading it. Encrypted text is a mish-mash of letters, numbers, and symbols, and without the key it’s practically impossible to decrypt.
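“A few millennia” actually understates it. A back-of-the-envelope calculation for a 128-bit key, assuming a hypothetical attacker who can test a trillion keys per second, shows why brute force is off the table:

```python
keyspace = 2 ** 128            # number of possible 128-bit keys (as in AES-128)
guesses_per_second = 10 ** 12  # a generous trillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365

# expected time to search half the keyspace, in years
years = keyspace / 2 / guesses_per_second / seconds_per_year
print(f"{years:.2e} years")    # on the order of 10**18 years
```

For comparison, the universe is roughly 1.4 × 10¹⁰ years old, so even this absurdly fast attacker would need around a hundred million times the age of the universe.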

Strong encryption has become so commonplace—like how your phone is protected with strong encryption as soon as you set a passcode—we forget it’s there. Like electricity, strong encryption powers our lives. Which also means when it comes under attack, we don’t see it for the grave threat it truly is.

Backdoors aren’t the answer

Last year, when strong encryption was in the political spotlight, I wrote about our stance on the need for strong encryption and why encryption backdoors aren’t possible. That stance has not changed, and it will not change, even as the U.S. Senate debates not one but two measures that threaten strong encryption and our privacy.

On the two Senate bills, the EFF says this:

But if EARN IT attempts to avoid acknowledging the elephant in the room, the Lawful Access to Encrypted Data Act puts it at the center of a three-ring circus. The new bill doesn’t bother with commissions or best practices. Instead, it would give the Justice Department the ability to require that manufacturers of encrypted devices and operating systems, communications providers, and many others must have the ability to decrypt data upon request. In other words, a backdoor.

Worse yet, the bill requires companies to figure out for themselves how to comply with a decryption directive. Their only grounds to resist is to show it would be “technically impossible.” While that might seem like a concession to the long-standing expert consensus that technologists simply can’t build a “lawful access” mechanism that only the government can use, the bill’s sponsors are nowhere near that reasonable. As a hearing led by Senator Graham last December demonstrated, many legislators and law enforcement officials believe that even though any backdoor could be exploited by bad actors and put hundreds of millions of ordinary users at risk, that doesn’t mean it’s “technically impossible.” In fact, even if decryption would be “impossible” because the system is designed to be secure against everyone except the user who holds the key —as with full-disk encryption schemes designed by Apple and Google—that’s likely not a defense. Instead, the government can require the system to be redesigned. The Senate’s New Anti-Encryption Bill Is Even Worse Than EARN IT, and That’s Saying Something | Electronic Frontier Foundation

Lock on broken hasp.

Police and lawmakers argue that strong encryption lets criminals go dark: police can’t surveil them, wiretap them, or use any of the other tools already at their disposal. But that’s not really the case. The FBI recently arrested a man accused of making billions hacking companies, and it was his own social media posts that confirmed his identity and located him:

In the affidavit, federal officials detailed how his social media accounts provided a treasure trove of information to confirm his identity. His Instagram, for example, had an email and phone number saved for account security purposes. Federal officials got that information and linked that email and phone number to financial transactions and transfers with people the FBI believed were his co-conspirators.

“The email account … also contained emails with attachments relating to wire transfers in large dollar values,” the affidavit said. His Apple and Snapchat records also provided information that helped investigators confirm his identity, address and communications with other suspects. Even his Instagram birthday celebration photos provided key information. One post displayed a birthday cake topped with a Fendi logo and a miniature image of him surrounded by tiny shopping bags. Investigators used that post to verify his date of birth on a previous US visa application. Ray Hushpuppi is accused of cyber crimes in two continents – CNN

Governments and police already have plenty of information about you, and what they don’t have can be found with good police work and existing tools. Even the former chief of MI5 is against cracking down on encrypted chat apps:

Acknowledging that use of encryption had hampered security agencies’ efforts to access the content of communications between extremists, Evans added: “I’m not personally one of those who thinks we should weaken encryption because I think there is a parallel issue, which is cybersecurity more broadly.

“While understandably there is a very acute concern about counter-terrorism, it is not the only threat that we face. The way in which cyberspace is being used by criminals and by governments is a potential threat to the UK’s interests more widely.

“It’s very important that we should be seen and be a country in which people can operate securely – that’s important for our commercial interests as well as our security interests, so encryption in that context is very positive.” Ex-MI5 chief warns against crackdown on encrypted messaging apps | Technology | The Guardian

Let’s say, for the sake of argument, companies are forced to create backdoors for law enforcement. Suddenly technology companies are standing on the edge of a very slippery slope, put in the position of becoming moral arbiters of requests to access and decrypt information. From this 2018 post:

To American lawmakers, it may seem logical to require that access should be provided to U.S. law enforcement in line with U.S. law, including, at a minimum, the receipt of a valid warrant. However, digital communications are international, as are the tech firms at the center of this debate. If Apple, for example, had the technical capability to circumvent encryption on their devices, and they had a policy of facilitating access for U.S. law enforcement when presented with a warrant, they would face tremendous pressure to provide equivalent services to other governments, and, in some cases, like China, even legal obligation to do so.

If a tech firm introduced a backdoor into its systems, it would therefore have two options: it could facilitate access to all governments equally, which would mean complicity in a wide range of human rights abuses, or they could commit to evaluating all requests for access on their merits and potential human rights impact. In the latter case, besides being manifestly unqualified to perform this role, such a stand would be very difficult for tech firms to maintain. The mere capability to facilitate backdoor access would subject companies to tremendous pressure, and a failure to comply would have high stakes: Chinese law has no upper limit on the fines they can charge for non-compliance with government demands for access. A non-compliant company could also risk losing access to that market entirely, or even seeing employees jailed or harmed. Given these alternatives, it is not surprising that tech firms have thus far sought to avoid these results by maintaining a technical inability to provide access to their users’ secure communications. The real costs of backdoors would be born by ordinary people, and global tech firms who will have to shoulder the financial and moral cost of supporting repressive governments.

Why An Encryption Backdoor for Just the “Good Guys” Won’t Work – Just Security

Even without this issue, we hold, as do most experts, that once a backdoor is created, it’s only a matter of time before it’s exploited by the very people it’s supposed to protect us from. And where would we draw the line? Technologies like SSL, TLS, and AES are used for far more than messaging. They secure transactions and connections between banks and between government agencies, and they protect infrastructure like power and water.

What would the cost be when someone figures out that the backdoor put in to let law enforcement decrypt messages on demand can also be used to compromise the North American power grid?

It’s just not worth the risk.

Weakened encryption means the end of privacy

If a message can be decrypted by someone else—police, government, a company—it’s not a private message any more. It’s a postcard with a piece of wax paper taped over the message. You can’t read the message at a glance, but remove the wax paper and everything is laid bare.

There have been innumerable cases of staff at technology companies abusing their access to customer data. Examples from Google, Snap, and Twitter are just the recent ones. Now imagine if companies had the power to read the encrypted messages stored on their servers. How long before that was abused? How long before employees or law enforcement were compromised by organized crime or foreign governments?

Lorenz SZ42 cipher machine – used to generate encrypted traffic in WWII.
Photo by Alex Motoc on Unsplash

Why should anybody be worried about security online? The vast majority of us are not criminals or terrorists.
Privacy is a fundamental human right, and it is so for a reason. Everybody needs a way to communicate confidentially online, just like everyone sometimes feels the need to whisper with friends and family. Private communication is also much needed in any democracy to enable a free and open political discussion without fear of being judged for one’s opinion. This is crucial to make a democracy work.

Hanna Bozakov, Head of Marketing at Tutanota, via HackerNoon

Not very long.

Yes, people will say there will be strict policies in place. There will be controls. Legal checks and balances will prevent abuse. We know all the policies in the world won’t stop people determined to get around them. Policies were in place at Google, Snap, and Twitter.

Policies don’t do enough. The only thing that keeps encrypted messages private is if they can’t be decrypted by other people.

We stand for privacy

At Sky Global, all of us, from our employees to our reseller partners, are united in a single belief: your privacy matters. Your privacy is worth protecting, and we have the tools to help you do that. We respect our customers’ privacy. We don’t ask about your business and don’t want to know your business. We don’t need to know who you are.

We believe privacy is a human right. We believe that to have a free society, privacy and security must be protected.

Privacy is a human right. To have a free society, privacy and security must be protected. We cannot have encryption backdoors in any system—our privacy is too important to risk the cost.

We hope you will stand with us in protecting online privacy, strong encryption, and the freedom to live without government surveillance of our daily lives. If you’d like to read how we’ve put these beliefs into practice in SKY ECC, see our post on why we believe you have a right to privacy.


Protect Your Privacy with SKY ECC

Secure devices

Private, encrypted mobile data network

Encrypted network communications

Brute-force protections


Strongest encryption of any secure communications app