No, technology is not going to destroy your privacy in the future

Welcome to Privacy Club. You cannot find Privacy Club on the internet.

Look for stencils on the sidewalks to find this week's key to the encrypted directions — as usual, we've stashed those directions steganographically inside a photo posted in a 4chan forum devoted to politicians having sex with donkeys.

You cannot bring your mobile to Privacy Club. While you are at Privacy Club, we ask you to find a reasonable thing for your phone to be doing so that it appears you have gone to a place other than this meeting. Leave it on your desk at work; send it out to lunch around the corner. You must walk to Privacy Club or take public transit. When you do, put on a hat and avoid heavy CCTV zones. If you normally wear makeup, don't. If you normally wear business clothes, dress like a punk. If you normally dress like a punk, put on a tie.

If anybody finds out who we are or that we've met, you've just killed this cell of Privacy Club. See you there.

In just a few years, I can easily imagine people forming a Privacy Club with rules just like this. Maybe it would be for fun, or maybe it would be a matter of life and death. Either way, it would be one possible response to a world where the rules of privacy have been completely rewritten by surveillance technology and the law.

You've heard the arguments already. On PBS News Hour, on NPR, and in shiny books published by serious-minded New York publishers, we keep hearing this refrain: Social media and CCTV have stolen our private lives, and we'll never get them back.

There are two reasonable responses to this assertion: 1) Who cares if they have? and 2) No they haven't. Both turn out to be true. Let's figure out why.

The "Transparency" Fallacy

Why should we worry that our private lives are on display online and through the lenses of security cameras? For decades now, science fiction author David Brin has been arguing that technology is pushing us towards a "transparent society" where we lose our privacy but gain "transparent" systems of authority to make up for it.

A transparent authority is any powerful institution, usually a government or a corporation, whose rules and machinations are revealed to the public routinely as a result of investigative reporting, leaks, and self-disclosure. Brin's idea is that in a society with less privacy, we lose the deadly secrecy that once allowed powerful people to victimize the less powerful. This is essentially his argument in the nonfiction book The Transparent Society, and his ideas have changed little since then — in a recent essay for Silicon Valley Metro, he restated this idea more forcefully than ever.

Ethicist Peter Singer makes a similar point in his recent Harper's magazine essay "Visible Man: Ethics in a World Without Secrets." Both Singer and Brin suggest that the technologies of surveillance, from CCTV to Facebook, will shine light on the powerful and powerless alike. They both hail WikiLeaks as the supreme example of how transparency works in a democratic society. Sure, the Feds can snoop on your Facebook friends, but activists are snooping on the Feds, too. Why should you worry that Google is gathering tons of data on your personal email, spending habits, friendship networks, and physical location? Pranksters like LulzSec will leak every Google employee's data to you eventually, too.

Singer goes so far as to say, essentially, that data-gathering doesn't create authoritarian regimes — undemocratic people do. This is basically the crux of his argument, and Brin agrees. That's why they're so sanguine about the loss of privacy. And that's where they're wrong.

Pretty much every authoritarian regime has used data gathering techniques to shore up its power, going all the way back to Imperial Rome with its census, which was used to track taxpayers, slaves, and people who lived in areas Rome conquered, like Gaul. After all, how do you maintain a good imperial system without tracking that kind of data? With the help of census-takers, Rome could maintain an historically unprecedented grip on its citizens, even the far-flung ones.

Authoritarian governments from Hitler's Germany to Slobodan Milošević's Yugoslavia have left enormous paper trails behind which reveal how surveillance and data retention were key to their genocidal projects (human rights data analyst Patrick Ball has written extensively about this). Certainly there are examples of benevolent data-gathering, like population genetics studies in medicine where personal information is removed. But by and large, powerful political institutions on the cusp of going darkside usually start in with the surveillance-based data-gathering. One could easily argue that surveillance and data-mining on a massive scale are symptoms of authoritarian regimes.

Privacy Is Unevenly Distributed

The other problem is that transparent societies are more often societies where privacy is unevenly distributed. Most people have very little privacy, but the rich and powerful can pull the curtains to hide whatever secrets they don't want revealed. Already, we've seen how the political regime in Egypt shut down transparency technologies like the internet when it wanted to hide what it was doing from the world. On a smaller scale, police working for San Francisco's metro system, BART, illegally shut down cell phone access in their stations on a day when they suspected people might be gathering there to protest a BART police shooting.

To tackle this same issue more mundanely, consider this: Can you find out what corporations, law enforcement agencies, and politicians are doing with the same ease that you can find out what random strangers are doing by using Facebook, Foursquare, and Google? No? Then privacy in your society is distributed unevenly.

Groups like WikiLeaks and LulzSec, which do rip back the curtains on secrets hidden by the powerful, are exceptions that prove the rule. Yes, they manage to wrest documents and private information from the hands of secretive organizations. But they are not the norm. We probably won't ever have enough WikiLeaks-style organizations to balance out all the personal information we lose control over every day from our images being grabbed by hidden cameras alone.

And the ultimate irony is that even security experts doubt that sacrificing our privacy helps prevent crime. Indeed, as computer security expert Bruce Schneier has explained, the base rates involved mean that data mining for terrorists buries any real plot under an avalanche of false positives. So it's likely we're not gaining any security benefits for our loss of privacy, either.
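Schneier's point is really just base-rate arithmetic, and you can sketch it in a few lines. The numbers below are illustrative assumptions of mine, not his figures, and the "99% accurate" classifier is wildly generous to the data miners:

```python
# Base-rate sketch: why data mining for terrorists drowns in false positives.
# All figures are illustrative assumptions, deliberately generous to surveillance.

population = 300_000_000      # people under surveillance
actual_plotters = 1_000       # assume an implausibly high number of real plotters
accuracy = 0.99               # assume a 99%-accurate classifier, also implausible

# True positives: real plotters correctly flagged.
true_positives = actual_plotters * accuracy

# False positives: innocent people incorrectly flagged.
false_positives = (population - actual_plotters) * (1 - accuracy)

print(f"True positives:  {true_positives:,.0f}")
print(f"False positives: {false_positives:,.0f}")
print(f"Flagged people per real plotter: "
      f"{false_positives / true_positives:,.0f}")
```

Even under these absurdly favorable assumptions, roughly three million innocent people get flagged for every thousand real plotters, so nearly every lead investigators chase is a dead end.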

But is this situation actually that bad? What's the big deal if everybody on the internet knows you go to a certain restaurant every day for lunch, or can figure out who your boyfriend is this week?

Privacy vs. Secrecy

One of the problems with the debate over privacy is that people tend to confuse the concept of "private" with the concept of "secret." There are definitely similarities between the two, but only secrets have the power to harm others. Private information is usually made up of details you'd prefer others not to know, and can vary quite widely from person to person. I personally might not care if you can see pictures of me making out with my girlfriend or boyfriend online; but somebody else would consider those pictures very private. Indeed, people who work with children might even classify makeout photos as secrets. That's because teachers have lost their jobs after parents discovered "racy" pictures of them online. An example like that demonstrates how easy it is to slide from privacy, a personal choice, to secrecy, a concept with graver implications.

The point is that privacy varies from situation to situation, and from group to group. There are stories I share with my friends that I would like to keep private from the general public. I might not suffer any specific harm if strangers find out that I'm dieting, for example, but I'd feel uncomfortable if everybody knew. The situation with a secret is very different. Governments keep secrets not just because it's more comfy, but because revealing them might get people killed — or might reveal the government was so corrupt that its citizens would riot and stage a coup. A secret is a piece of information that could get you fired, or prevent you from getting life-saving health insurance.

That's why we haven't achieved a fully "transparent" society just because we know that President Obama likes to read comic books and Steve Jobs might dash off a personal email to us late at night. We might be getting small peeks at these powerful people's private lives, but their secrets — the information that could truly affect our lives, and theirs — remain cloaked. But the average citizen's secrets are being compromised. Our physical locations are broadcast to both private companies and government via CCTV and our mobile phones; and, with just a quick subpoena, law enforcement can get access to a lot of our hidden information on social networks. Our credit card data is routinely sent over the internet insecurely; our health information is kept in easily compromised databases in hospitals.

None of this is likely to change any time soon. You need to give up on the idea that nobody will see information that's merely private, hidden from view for comfort. People are going to see your drunk pictures. Once in a while, a CCTV camera will catch you naked in the changing rooms. And sometimes, we'll get to see a politician's cock on Twitter. Yes, it's awkward, but most people will be unharmed.

By the same token, we need to cling fiercely to our secrets. This means actually figuring out what's absolutely secret and keeping it from prying eyes. Your drunk pictures are not the same as your health care records. But how do we give up on privacy and still keep secrets? Just figuring out the privacy settings on Facebook can take all day, to say nothing of all the databases full of credit card information that you don't control.

This question becomes even more complicated when you consider that some secrets, such as deadly ones held by authoritarian regimes, need to be unveiled to prevent harm coming to the public.

How to Start Your Own Privacy Club

Let's return to the thought experiment I began with: the Privacy Club that forms in our near future to cope with a world where maintaining privacy is so hard that it's nearly impossible to have lunch with your friends without somebody (or some machine) creating a record of it. What happens to privacy and secrets in such a world?

First of all, though I said earlier that you can just kiss your comfy privacy goodbye, that's not entirely going to be the case. You won't be able to have privacy like your grandparents did in their 1950s homes with the curtains drawn and no CCTVs on Main Street. Instead, you'll stop thinking about privacy as something you get in a particular space, like a home or bathroom; privacy will become temporal and temporary. You might use cell phone dampeners to create a mobile blackout zone, or throw a party in the dark so that every time somebody takes a picture they have to use a very obvious flash (and therefore they won't).

So privacy will be something you do for a period of time, not in a specific place.

The same goes for privacy in the online world. To zoom around anonymously on the internet, you'll want to create a temporary, throwaway identity — something you use only once or twice that can't be traced back to your Facebook account and your Google+ circles. You won't aim to leave no record behind (because who can do that?) but to leave an unverifiable one. Plausible deniability will become your online privacy. Was that person who spouted off about your employer in an online forum you? Who knows? That person didn't have your name, didn't log on from your computer, and never showed up again. Same goes for the person who looked at all that foot fetish porn.

Deniable, temporary identities are part of a privacy strategy that social media analyst danah boyd calls "social steganography." You hide your true self in a blur of half-selves who express ideas that it might be awkward and uncomfortable to show everybody. Another social steganography strategy is to express yourself using references or in-group jokes that only a certain group of friends will understand. Again, the issue is deniability; your veiled comments can be interpreted in a number of different ways that can't be pinned down. Basically, we're moving from hiding in the dark (in an isolated place) to hiding in the light (creating social noise that hides your private actions and thoughts).
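The flyer at the top of this piece imagined stashing Privacy Club's directions steganographically inside a photo, and boyd's "social steganography" borrows its name from that same technique. Here's a toy sketch of the classic digital version, least-significant-bit embedding; it operates on a plain byte buffer standing in for image pixels (a real tool would decode an actual image file first), and the function names are mine:

```python
# Toy least-significant-bit (LSB) steganography over a raw byte buffer.
# A bytearray stands in for decoded image pixel data; flipping only the
# lowest bit of each byte leaves the "image" visually unchanged.

def hide(carrier: bytearray, message: bytes) -> bytearray:
    """Embed message bits, LSB-first, into the lowest bit of each carrier byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    out = bytearray(carrier)          # leave the original carrier untouched
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return out

def reveal(carrier: bytearray, length: int) -> bytes:
    """Recover `length` bytes of hidden message from the carrier's low bits."""
    message = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (carrier[b * 8 + i] & 1) << i
        message.append(byte)
    return bytes(message)

pixels = bytearray(range(256)) * 2    # stand-in "photo" data, 512 bytes
stego = hide(pixels, b"privacy club")
print(reveal(stego, len(b"privacy club")))
```

Each carrier byte changes by at most one, which is exactly why steganography (social or digital) works: the noise of everyday data hides the signal in plain sight.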

Privacy isn't gone. It just looks a lot different than it once did. At the same time, we're also going to have to get used to the idea that loss of privacy is as temporary as privacy itself. You may lose it for a little while, but you'll find a place to hide again.

Secrecy is another matter. In many ways, our struggle in the twenty-first century is getting used to the idea that we should only let go of our privacy if we can get some guarantee that our secrecy will remain intact. But as long as pundits and governments continue to blur the lines between what's private and what's secret, we'll be trying to solve the wrong problem. Our goal should not be a private world, or a transparent society. It should be a public sphere where everyone, especially the vulnerable, can keep a secret.

I write my editor's column while sitting inside a temporary Faraday cage. You can read past columns here.

Top illustration from album art on AC/DC's Dirty Deeds Done Dirt Cheap