By Josh Woodard, mSTAR Technical Advisor
Next year is the 20th anniversary of Time Magazine’s declaration of the “death of privacy”.
Of course, privacy never died, but it has been under continuous pressure from nearly every direction, particularly in the digital realm. Our every action is in essence under constant surveillance from many of the digital platforms that we use every day. The insights digital platforms glean from our personal data and habits serve as a core component of their revenue generation. We are literally trading our privacy in exchange for using their services. This will only get worse as the Internet of Things expands to include ‘always on’ devices that passively listen to conversations, from next-generation personal assistants to household appliances like Samsung’s recently released smart refrigerators.
We are literally trading our privacy in exchange for using their services.
While one can make the case that we voluntarily sign up for these services and their terms, the reality is that not all companies are adequately protecting our data. In 2015 alone, almost 500 million identities were exposed by corporate data breaches. A study by PwC found that only 37 percent of companies have a cyber incident response plan. Nor does this account for the technology companies that have actively permitted governments to access their clients’ personal data, including Yahoo, which created a backdoor to its email servers for the NSA.
It’s Not Just the Surveillance Economy
Government intrusion into internet privacy is not unique to the United States. Freedom House’s Freedom on the Net 2015 report found that internet freedom declined for the fifth year in a row, and that “governments in 14 of 65 countries passed new laws to increase surveillance over the past year.” China has even tested out a social credit system that collects personal data and assigns people “scores” as citizens. While that may sound benign, just imagine how that data can impact those in society deemed to have a lower social credit score, such as minorities or those with unpopular political opinions. Moreover, cybercrime laws in a number of countries are being used to criminalize certain types of free expression.
One study found that the two most popular passwords are ‘123456’ and ‘password.’
As digital users, we’re certainly not helping ourselves much either. One study found that the two most popular passwords are ‘123456’ and ‘password.’ More than half of people in a recent experiment conducted by German researchers clicked on a link from an unknown sender. It should come as no surprise, then, that millions of people—including my mother, twice!—are victims of cyberscams annually. Our worst impulses also sometimes manifest themselves in online vigilantism and public shaming that threaten the anonymity of average citizens. One such example is the Saudi student in the United States whose photo was spread virally on social media in the hours and days after the Boston Marathon bombing by people claiming he was a key suspect. The student, it turns out, had nothing to do with the attacks.
Development Actors and Privacy Naiveté
Given this backdrop, one would think that development organizations, which often work with vulnerable populations and are supposed to be serving their interests, would place a higher premium on protecting individual privacy. The reality, however, is that as development organizations we are often completely oblivious to privacy issues. Privacy controls, like informed consent, are frequently treated as a box-checking exercise. Development organizations consider encrypting stored personal data an extra step, and often share that data with others fairly cavalierly. What’s worse is that we too often merely skim the terms of service of the tools we use to collect people’s information. For example, an NGO in Papua New Guinea once shared with me how they had paid a data collection firm to conduct mobile surveys of their program participants. This NGO did not realize that the terms of the agreement entitled that firm to keep those participants’ contact information and sell it to third parties to conduct their own surveys.
All it Takes is a Little Effort
While all of this may seem like doom and gloom, protecting people’s privacy is actually not that hard; it just takes effort. Development organizations can start by putting greater emphasis on training their staff on how to handle data privately and securely, along with promoting concepts such as individual sovereignty over personal information. Some fixes are easy, like improving informed consent and data management. The Responsible Data Handbook is a great resource for much of that. And for more complicated issues, like secure data storage and communications, other useful resources, such as Security-in-a-Box, exist.
Protecting people’s privacy is actually not that hard…
Development practitioners should also stay abreast of changes that could impact privacy. In addition to following news reports on technology and privacy, organizations such as Freedom House, Privacy International, Reporters Without Borders, and local netizen groups are all helpful sources of information. The Responsible Data listserv is also a place to learn from and share with fellow development practitioners.
For those who work at the policy level, the EU has been leading the way in developing protections for its citizens’ privacy and control over their personal data. There is much that can be learned from its work. For the practitioners among us, look for digital platforms and service providers that emphasize security, privacy, and transparency. It is also worth keeping an eye on shared ownership platforms, which, while not necessarily more secure or private, are at least potentially more accountable to their users (who are also generally their owners) than privately held companies.
Finally, it is crucial for those of us who care about privacy and individual sovereignty to make our voices heard. Share with colleagues why they should take these issues more seriously and the relatively simple adjustments they can make to do so. Express your concerns to digital platform providers and local governments to help ensure that these issues are on their radar. Education is the first step towards creating a culture of smart privacy protection in development, and I’d encourage each of you to start taking that step today.
The following blog post was adapted from a presentation given by the author at USAID’s Next Generation Technologies for Empowering People event in Bangkok, Thailand on November 14-15. It is not meant to be comprehensive in its analysis of this complex topic, but rather to be a starting point for conversation.
Josh Woodard serves as a technical advisor for the mSTAR project, where he oversees technical quality and provides technical direction to several activities in Asia focused on digital development, including digital financial services. He also led mSTAR’s efforts to organize and facilitate the Data for Resilience Summit.