Category: security
The Death of Trust
The tl;dr: Assume that anything you do online is being recorded by the government.
I had a conversation this past week with one of my students who was interested in some of the operational aspects of anonymity; he wanted to know to what extent either Tor or a VPN or both would protect his identity against varying levels of potential adversaries, from coworkers to nation-states. I think that we here in the USA forget that in many parts of the world, speech, especially dissident speech, can be extremely dangerous.
A recurring theme of this conversation was the notion of trust. For example, when we talked about how VPNs work and how they might be used to secure communications like IM or email, it came down to the level of trust that you have in the VPN provider. What if that provider is logging everything that you do across the VPN? Is the VPN provider susceptible to a Five-Eyes warrant to turn over those logs, or to covert monitoring? How do you know that the VPN provider isn't really a government agency?
You don’t.
On January 21st 2017, literally millions of people united in marches across the country protesting against an administration that they see as a threat to their freedoms. Those protests were organized and promoted on sites and services such as Twitter, Facebook, and Google without, I’m guessing, much thought about who else might be collecting and collating this information. We willingly expose enormous amounts of information about ourselves, our thoughts, and our actions on these sites every single day. Can we trust them?
We can’t.
Edward Snowden showed us how deeply entrenched US intelligence agencies are in these sites, collecting, storing, and indexing nearly every message that flows through them. A body of secret law, interpreted by a court that meets in secret, ensures that these agencies can collect nearly anything that they ask for.
We have to assume that all of the email, texts, phone calls, and posts relating to today’s protests have been collected.
Do we care? On some level I suppose we don't. We use these services, the Facebooks, the Twitters, the GMails, because they are convenient and efficient at reaching large numbers of people very quickly. For a large portion of our population, the 'internet' is Facebook. We post and tweet and like, not realizing that these posts and tweets and likes are used to create profiles of us, primarily for marketing purposes, but also for analysis by our government. I'm not saying that the NSA has a folder labeled 'Perry Donham' with all my posts and tweets collated in real time, but I am saying that the data is there if an analyst wants to sort through it.
A photo from today's march in Washington really struck me. In it, an elderly Japanese woman holds a sign that reads 'Locked Up by US Prez 1942-1946 Never Again!' There are US citizens still alive who were put into detention camps by the US government during the Second World War. George Takei, a US citizen who played Sulu on the iconic series Star Trek, was imprisoned by the US government from the age of five until the age of eight. The reason? He was of Japanese descent.
We are entering unknown political territory, with an administration guided by the far right that will wield enormous technical surveillance assets. We literally don't know what is going to happen next. It's time to think carefully about what we say and do, and whom we associate with, online: in email, posts, tweets, texts, and phone calls. We know that this data is being collected now by our government. We don't know what the Trump administration will choose to do with it.
My advice is simply this: Every time you post or tweet or like, assume that it is being collected, analyzed, stored, and can be potentially used against you.
Worse, we've become dependent on 'the cloud' and how easy it is to store our information on services such as Dropbox, Google Docs, and Azure. Think about this. Do you know the management team at Dropbox? The developers? The people running the Dropbox data center? Their network provider? You do not. The only reason that we trust Dropbox with our files is that 'they' said that 'they' could be trusted with them.
You might as well drive over to your local Greyhound terminal and hand an envelope with your personal files in it to a random person sitting on a bench. You know that person just as well as you do Dropbox.
I've been thinking a lot about trust, about how false it is on the internet, and about how little thought we give it. In the next few posts I'll look at how the idea of trust has broken down and at how we can leverage personal trust in securing our communications and information.
I still think WhatsApp has a security problem
Last week The Guardian ran a story that claimed a backdoor was embedded in Facebook's WhatsApp messaging service. Bloggers went nuts, as we do when it looks like there's some nefarious code lurking in a popular application, and of course Facebook is everybody's favorite target. I tweeted my disdain for WhatsApp moments after reading the article, pointing out that when it comes to secure communication, closed-source code just doesn't cut it.
Today Joseph Bonneau and Erica Portnoy over at EFF posted a very good analysis of what WhatsApp is actually doing in this case. It turns out that the purported back door is really a design decision by the WhatsApp team; they are choosing reliability over security. The quick explanation is that if a WhatsApp user changes his or her encryption key, the app will, behind the scenes, re-encrypt a pending message with the new key in order to make sure it is delivered. The intent is to not drop any messages.
Unfortunately, by choosing reliability (no dropped messages), WhatsApp has opened up a fairly large hole through which a malicious third party could spoof a key change and retrieve messages intended for someone else.
EFF's article does a very good job of explaining the risk, but I think it fails to drive home the point that this behavior makes WhatsApp completely unusable for anyone who is depending on secrecy. You won't know that your communication has been compromised until it's already happened.
Signal, whose protocol WhatsApp's encryption is built on, makes the opposite, more secure choice: it drops the pending message when a key change is detected.
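To make the design difference concrete, here is a minimal sketch, in Python, of the two delivery policies when a recipient's key changes while a message is still pending. It is purely illustrative; the function, the KeyChangedError exception, and the toy encrypt() placeholder are my own inventions, not anything taken from the WhatsApp or Signal codebases.

```python
class KeyChangedError(Exception):
    """Raised when a pending message is blocked by a key change."""

def encrypt(message: str, key: str) -> str:
    # Stand-in for real end-to-end encryption; here we just tag the message.
    return f"enc[{key}]:{message}"

def deliver_pending(message: str, old_key: str, current_key: str, policy: str) -> str:
    """Decide what happens to an undelivered message after the recipient's key changes."""
    if current_key == old_key:
        return encrypt(message, current_key)   # normal case: no key change
    if policy == "reliability-first":          # the WhatsApp-style choice
        # Silently re-encrypt to whatever key is now advertised for the recipient.
        # If that key change was spoofed, a third party receives the message.
        return encrypt(message, current_key)
    # The Signal-style choice: refuse delivery and make the sender verify
    # the new key before resending.
    raise KeyChangedError("recipient key changed; message not delivered")
```

The only difference between the two branches is whether a key change is treated as routine or as a reason to stop, and that difference is exactly the hole described above.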
Casual users of WhatsApp won't care one way or another about this. However, Facebook is promoting the security of WhatsApp and implying that it is as strong as Signal when in fact it isn't. To me this is worse than having no security at all...in that case you at least know exactly what you are getting. It says to me that Facebook's management team doesn't really care about security in WhatsApp and is just using end-to-end encryption as a marketing tool.
Signal has its own problems, but it is the most trustworthy internet-connected messaging app in popular use right now. I only hope that Facebook's decision to choose reliability over security doesn't get someone hurt.
What I’m Using for Privacy: Cloud
This post is part of a series on technologies that I’m currently using for privacy, and my reasons for them. You can see the entire list in the first post.
tl;dr: I don't trust anyone with my data except myself, and neither should you.
If you aren't paying for it, you are the product
I think that trust is the single most important commodity on the internet, and the one that is least thought about. In the past four or five years the number of online file storage services (collectively 'the cloud') has gone from zero to more than I can name. All of them have the same business model: "Trust us with your data."
Of course, that's not the pitch. The pitch is, "Wouldn't you like to have access to your files from any device?"
A large majority of my students use Google Docs for cloud storage. It's free, easy to use, and well integrated into a lot of third-party tools. Google is a household name, and most people trust them implicitly. However, as I point out to my students, if they had bothered to read the terms of service when they signed up, they would know that they gave Google permission to scan, index, compile, profile, and otherwise read through the documents that are stored on the Google cloud.
There's nothing nefarious about this; Google is basically an ad agency, and well over half of their revenue comes from advertising targeted using the profile they build of each user by combining search history, emails, and the contents of our documents on their cloud. You agreed to this when you signed up for the service. It's why you start seeing ads for vacations when you send your mom an email about an upcoming trip.
But isn't my data encrypted?
Yes and no. Most cloud services will encrypt the transmission of your file from your computer to theirs; however, once the file is at rest on their servers, it might or might not be encrypted, depending on the company. In most cases, if the file is encrypted, it is with the cloud service's key, not yours. That means that if the key is compromised, or if a law-enforcement or spy agency wants to see what's in the file, the cloud service can decrypt your file and turn it over. Demands such as National Security Letters come with a gag order, so you will not be told when an agency has requested to see your files.
Some services are better than others about this; Apple says that files are encrypted in transit and at rest on their iCloud servers. However, it's my understanding that the files are currently encrypted with Apple's keys, which are subject to FISA warrants. I believe that Apple is working on a solution in which they have no knowledge of the encryption key.
You should assume that any file you store on someone else's server can be read by someone else.
Given that assumption, if you choose to use a commercial cloud service, the very least you should do is encrypt your files locally and only store the encrypted versions on the cloud.
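As a rough sketch of what 'encrypt locally first' can look like, here is a minimal example using the Fernet interface from the Python cryptography package. The file names and the synced-folder path are placeholders I made up, and I'm assuming the key itself is kept somewhere that never touches the cloud; this illustrates the idea rather than being a polished tool.

```python
from pathlib import Path
from cryptography.fernet import Fernet

# Generate a key once and keep it OFF the cloud (local disk, password manager, etc.).
key = Fernet.generate_key()
Path("cloud.key").write_bytes(key)

# Encrypt the file locally, then put only the ciphertext in the synced folder.
fernet = Fernet(key)
plaintext = Path("taxes-2016.pdf").read_bytes()
Path("Dropbox/taxes-2016.pdf.enc").write_bytes(fernet.encrypt(plaintext))

# Later, on any machine that has cloud.key, recover the original file.
ciphertext = Path("Dropbox/taxes-2016.pdf.enc").read_bytes()
Path("taxes-2016-restored.pdf").write_bytes(fernet.decrypt(ciphertext))
```

The provider only ever sees the .enc file, so whether their at-rest encryption or their keys can be compromised no longer matters for that file.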
And...they're gone
Another trust issue that isn't brought up much is whether or not the company you are using now to store your files will still be around in a few years. Odds are that Microsoft and Google and Apple will be in business (though we've seen large companies fail before), but what about Dropbox? Box? Evernote? When you store files on any company's servers, you are trusting that they will still be in business in the future.
My personal solution
I don't trust anyone with my data except myself. I do, though, want the convenience of cloud storage. My solution was to build my own personal cloud using Seafile, an open-source cloud server, running on my own Linux-based RAID storage system. My files are under my control, on a machine that I built, using software that I inspected, and encrypted with my own secure keys. The Seafile client runs on any platform, and so my files are always in sync no matter which device (desktop, phone, tablet) I pick up.
The network itself is as secure as I can manage, and I use several automated tools to monitor and manage security, especially around the cloud system.
I will admit that this isn't a system that your grandmother could put together; however, it isn't as difficult as you might think. The pieces that you need (Linux server, firewall, RAID array) have become very easy for someone with just a little technical knowledge to set up. There's a Docker container for Seafile, and I expect to see a Bitnami kit for it soon; both are essentially one-button deployments.
Using my own cloud service solves all of my trust issues. If I don't trust myself, I have bigger problems than someone reading through my files!
What about 'personal' clouds?
Several manufacturers sell personal cloud appliances, such as those from Western Digital. They all work pretty much the same way: your files are stored locally on the cloud appliance and are available on your network to any device. My advice is to avoid appliances that have just one storage drive (a single point of failure) or that store files in a proprietary format (which locks your data into that vendor).
If you want to access your files anywhere other than your home network, there's a problem: your home network's internet address isn't fixed or easily reachable from outside. The way that most home cloud appliances solve this is by having you set up an account on their server, through which you can access your personal cloud. If you're on the road, you open the Western Digital cloud app, log on to their server, and through that account gain access to your files.
Well, here's the trust problem again. You are now allowing a third party to keep track of your cloud server and possibly to stream your files through their network. Do you trust them? Worse, these appliances run closed-source, proprietary software and usually come out of the box with automatic updates enabled. If some three-letter agency wanted access to your files, they'd just push an update to your machine with a back door installed. And that's assuming there isn't one already there...we don't get to see the source code, so there's no way to prove there isn't.
I would store my non-critical files on this kind of personal server but would assume that anything stored on it was compromised.
Paranoia, big destroyah
Assuming that third parties have access to your files in the cloud, and that anything stored there is compromised, might seem like paranoia, but frankly this is how files should be treated. It's your data, and no one else should have any access to it by default. We certainly have the technical capability to set up private cloud storage, but there apparently isn't a huge market demand for it, or we'd see more companies step forward.
There are a few, though, offering this level of service. Sync, a Canadian firm, looks promising. They seem to embrace zero-knowledge storage, which means that you hold the encryption keys and they are not able to access your files in any way. They also seem not to store metadata about your files. Other services such as SpiderOak claim the same (in SpiderOak's case, only if you use the desktop client exclusively and do not share files with others).
I say 'seem to' and 'claim to' because the commercial providers of zero-knowledge storage are closed-source...the only real evidence we have to back up their claims is that they say it is so. I would not trust these companies with any sensitive files, but I might use them for trivial data. I trust Seafile because I've personally examined the source code and compiled it on my own machines.
Bottom line
I can't discount the convenience of storing data in the cloud. It's become such a significant part of my own habits that I don't even notice it any more...I take it for granted that I can walk up to any of my devices and everything I'm working on is just there, always. It would be a major adjustment for me to go back to pre-cloud work habits.
I have the advantage of having the technical skills and enough healthy skepticism to do all of this myself in a highly secure way. I understand that the average user doesn't, and that this shouldn't prevent them from embracing and using the cloud in their own lives.
To those I offer this advice: Be deliberate about what you store on commercial cloud services and appliances. Understand and act on the knowledge that once a file leaves your possession you lose control of it. Assume that it is being looked at. Use this knowledge to make an informed decision about what you will and will not store in the cloud.
Rethinking PGP encryption
Filippo Valsorda recently wrote an article on Ars Technica titled "I'm Throwing in the Towel on PGP, and I Work in Security" that really made me think. Filippo is the real deal when it comes to PGP; few have his bona fides in the security arena, and when he talks, people should listen.
The basic message of the article is the same one that we've been hearing for two decades: PGP is hard to use. I've been a proponent since 1994 or so, when I first downloaded PGP. I contributed to Phil Zimmermann's defense fund (and have the T-shirt somewhere in my attic to prove it). As an educator I've discussed PGP and how it works with nearly every class I've taught in the past 20 years. I push it really hard.
And yet, like Filippo, I receive two, maybe three encrypted emails each year, often because I initiated the encrypted conversation. Clearly there's an issue here.
Most stock email clients don't support PGP. Mail on macOS doesn't. I'm pretty sure that Outlook doesn't. I use Thunderbird because it does support PGP via a plugin. I really don't get this...email should be encrypted by default in a simple, transparent way by every major email client. Key generation should happen behind the scenes so that the user doesn't even have to think about it.
We might not ever get there.
And so, after 20 years of trying to convince everyone I meet that they should be using encryption, I, like Filippo, might be done.
However, there is a use case that I think works, one that I will use myself and educate others about. For several years I've digitally signed every email that I send using PGP, and I think that practice points toward the right way to use PGP. Here's the approach, which is similar to what Filippo is thinking:
- I will continue to use PGP signatures on all of my email. This provides nonrepudiation to me. I will use my standard, well-known key pair to sign messages.
- When I need to move an email conversation into encryption, I'll generate a new key pair just for that conversation. The key will be confirmed either via my well-known key pair or via a second channel (Signal IM or similar). The conversation-specific keys will be revoked once the conversation is done.
- I will start to include secure messaging à la Signal in my discussions of privacy.
Nonrepudiation is really a benefit to me rather than to anyone receiving my messages, and I don't see any reason not to use my published keys for this.
Secure messaging apps like Signal are, I think, more natural than bolting PGP onto email, and they are easier for non-technical users to understand. Further, the lack of forward secrecy in PGP (and its presence in Signal) should make us think twice about encrypting conversation after conversation with the same keys rather than using a new set of keys for each one.
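For what it's worth, here is a rough sketch of that per-conversation workflow using the python-gnupg wrapper around GnuPG. Every specific value in it (names, email address, passphrases, key IDs, fingerprints) is hypothetical, and revocation handling is omitted; it shows the shape of the idea, not a finished tool.

```python
import gnupg

gpg = gnupg.GPG(gnupghome="/home/perry/.gnupg")          # hypothetical GnuPG home directory

# 1. Generate a fresh key pair used only for this one conversation.
params = gpg.gen_key_input(
    name_real="Perry (conversation 2017-01-22)",          # hypothetical label
    name_email="perry+conv20170122@example.com",          # hypothetical address
    key_type="RSA",
    key_length=3072,
    passphrase="conversation-passphrase",
)
conversation_key = gpg.gen_key(params)

# 2. Attest to the new key with the well-known long-term key so the other party
#    can verify it, or confirm the fingerprint over a second channel like Signal.
attestation = gpg.sign(
    f"Conversation key fingerprint: {conversation_key.fingerprint}",
    keyid="LONG-TERM-KEY-ID",                              # hypothetical key id
    passphrase="long-term-passphrase",
)

# 3. Encrypt messages in this conversation to the other party's conversation key.
encrypted = gpg.encrypt(
    "Draft statement attached.",
    recipients=["THEIR-CONVERSATION-KEY-FINGERPRINT"],     # hypothetical fingerprint
    always_trust=True,                                     # skip web-of-trust checks in this sketch
)

# 4. When the conversation is over, revoke and delete the conversation key
#    (generating and publishing the revocation certificate is left out here).
```

The long-term key never encrypts anything; it only vouches for short-lived keys, which is a crude way of recovering some of the forward-secrecy benefit that Signal provides natively.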
I think this approach will do for the time being.
[Update: Neal Walfield posted his response to Filippo's article; the comments are a good read on the problems we're facing with PGP.]