Amnesty International’s Tanya O’Carroll on privacy & the ‘nothing to hide, nothing to fear’ argument

By Ajay Sandhu 

I recently interviewed Tanya O’Carroll, a Technology and Human Rights advisor at Amnesty International, to discuss government surveillance and its impact. I framed our discussion around the most common response researchers studying surveillance receive from the public: the “nothing to hide” argument. The nothing to hide argument alleges that government surveillance programs serve a security purpose and should not be opposed by innocent people. This blog outlines O’Carroll’s thoughts about the nothing to hide argument and its flaws, the importance of privacy rights, and the ‘encryption mentality’ that she thinks should replace the nothing to hide argument.

Explaining the Nothing to Hide Perspective

The tone of O’Carroll’s work changed radically in June 2013, when Edward Snowden revealed the extent of the US and UK governments’ surveillance programs. “I remember it literally almost like a war room straight afterwards,” O’Carroll recalled. “It was time to massively beef up our work, to ask ‘what are we doing to tackle technology as a human rights issue.’” Since then, O’Carroll’s work has attempted to raise awareness about the “dark side” of technology by examining how digital and online technologies in particular are used to conduct bulk surveillance.

Unfortunately, O’Carroll faces a significant barrier: the public have dismissed arguments exposing the harms of bulk surveillance, stating that they have “nothing to hide” and therefore “nothing to fear” from government monitoring. “That’s the first thing that most people say to us,” explained O’Carroll. “All of the comments over and over again are saying ‘yeah, okay, fine, it might be an abstract violation of this abstract right called the right to privacy, but, if I’m not doing anything wrong, why do I really care who’s looking at my completely boring and mundane text messages and emails?’”

O’Carroll speculates that the nothing to hide perspective is born of a lack of information about how surveillance data is used to control us. Most of us do not realise, she explained, that the employment opportunities we receive, the insurance prices we pay, and the treatment we receive from police are increasingly the result of decisions made by algorithmic analyses of surveillance data. Most of us also do not realise, O’Carroll added, that these decisions can facilitate discriminatory hiring practices, exploitative pricing, and invasive police monitoring (see Weapons of Math Destruction by Cathy O’Neil). Even the political information we receive can be determined by algorithms which personalise the political news that appears on the websites we visit. The result “…doesn’t look like propaganda,” O’Carroll clarified, “it doesn’t look like a big Democrat or Republican or Brexit campaign poster on the side of the street. It looks like a post on your Facebook feed that is directly appealing to your tastes and interests.” Despite their seemingly harmless appearance, O’Carroll continued, these Facebook posts can have a significant influence on our political leanings and voting behaviour by creating “filter bubbles” that reinforce pre-existing biases and political polarisation.

Unfortunately, O’Carroll admitted, few consider such issues. There is a lot of “terrorism theatre,” she explained, which tells the public that regulations limiting bulk surveillance can undermine their safety and security. As a result, the public can be quite passive in the face of government surveillance programs. This may also be a consequence of “…a failing of some of us digital and privacy advocates,” O’Carroll added, “we’ve been so stuck in the sort of abstract or theoretical debate about privacy that we’ve failed to communicate its importance to people.”

The Importance of Privacy

Furthermore, when considering the value of privacy, O’Carroll added, it is important to remember the circumstances of those who face a disproportionate amount of surveillance. “The big eye in the sky is not aimed equally at everyone,” she explained. “I say this to my friends and my family every time the debate comes up. I defend [privacy rights] not just for myself. I defend [privacy rights] because there are other individuals who are unfairly treated as suspicious by the state.” The value of privacy, then, can be found in how it serves those who are the most disadvantaged among us, those most likely to be targeted by intrusive surveillance. To argue that surveillance is harmless and should be tolerated is a privileged position which ignores the experiences of the disadvantaged. “I think it is the time to put a battle cry out for privacy again,” O’Carroll concluded, “[…] it is the time for us to really stand up for the right to privacy.”

Encryption Mentalities

Standing up for the right to privacy can involve changing how we vote, joining pro-privacy protests, and/or writing to our local political representatives. However, it need not be so formal. Standing up for privacy rights can also involve changing our everyday behaviour by obstructing government surveillance. According to O’Carroll, this means developing a new mentality to replace the nothing to hide perspective.

To illustrate this, O’Carroll reflected on the mentality, which she later called an “encryption mentality,” that she has developed since the Snowden revelations. She started by offering an analogy about how our attitudes towards seatbelts and safety have developed over time. “We’ve evolved to understand that if you walk out in front of a moving large piece of metal, also known as a bus, it is bad. So, you don’t want to do that. We didn’t necessarily evolve the mentality that when we are in a car, a seatbelt helps protect us. It’s a more abstract idea, because you can sit in a car and feel quite comfortable not wearing a seatbelt. We have to hammer it into our head, combined with law, that we have a better chance of surviving if we are wearing a seatbelt.” Over time, O’Carroll explained, “people develop a feeling that when you get into a car, if you don’t wear your seatbelt, you feel exposed.”

Similarly, O’Carroll continued, she has developed a feeling of being exposed when she does not encrypt her emails or texts. “I have reached a point where I feel like I’m not wearing my seatbelt when I’m not encrypting things. I think I probably use end-to-end encryption for 20 to 30 percent of all of my communications now. It’s second nature for me to encrypt.” This is the mentality that she thinks should replace the nothing to hide argument. “I think that is where we need to get to as a society, so it becomes second nature, like wearing a helmet on a bike, or a seatbelt in your car. You don’t have to do it all the time, but you start to want to and you feel safer when you do.”
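
For readers wondering what “encrypting things” looks like in practice, the sketch below uses the Python library PyNaCl to show the basic shape of end-to-end encryption: each person holds a private key, shares only a public key, and a message encrypted for a particular recipient can be read by no one else. The library choice and the message are illustrative assumptions on my part, not the specific tools O’Carroll described.

    # A minimal sketch of end-to-end encryption with PyNaCl (pip install pynacl).
    # Illustrative example only; not the particular tools discussed in the interview.
    from nacl.public import PrivateKey, Box

    # Each person generates a key pair; private keys never leave their device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts a message that only Bob's private key can open.
    alice_box = Box(alice_key, bob_key.public_key)
    ciphertext = alice_box.encrypt(b"Lunch at noon?")

    # Bob decrypts using his private key and Alice's public key.
    bob_box = Box(bob_key, alice_key.public_key)
    assert bob_box.decrypt(ciphertext) == b"Lunch at noon?"

Messaging apps with end-to-end encryption, such as Signal, perform this key exchange automatically, which is what lets encryption become “second nature” without users ever handling keys themselves.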

To help hammer home her message about the harms of surveillance and the importance of encryption, O’Carroll is working on a project which challenges governments’ claims that surveillance is not a human rights concern. “We are set to show that there is harm and that it is not just that the right to privacy is violated…it can also be discrimination. Not everybody is equal in the eyes of surveillance, and it disproportionately impacts certain communities. So, we are looking specifically at the impacts of mass surveillance programs in the police and security sectors and the impacts that those powers […] on already over-policed communities.”

You can find out more about Tanya O’Carroll’s work about the human rights concerns raised by surveillance and big data by following her on Twitter.


Disclaimer: The views expressed herein are those of the author(s) alone.
