There is considerable divergence of opinion about the relationship between privacy rights and security concerns, and opinion polls reflect this divided sentiment. A 2016 Pew Research Center survey found that while 56% of respondents want more to be done to keep the country safe, 52% remain seriously concerned about the scope of surveillance programs that may intrude upon their privacy, notably the monitoring of internet search habits, email messages, and social media interactions.
Such privacy concerns extend beyond surveillance programs to corporate practices relating to the protection of personally identifiable information. The same Pew Research Center survey found that over 50% of respondents are concerned about the security of their personal data given the frequency and magnitude of data breaches in recent years.
There seems to be a sense of capitulation that, in this digital age, privacy rights are destined to erode: 91% of Americans feel they have lost control over the collection, use, and disposition of their personal information.
These sentiments raise a fundamental question: what ought to be the relationship between privacy and security? At one end of the spectrum are those who argue that in the digital age, unless someone decides to live off the grid completely, “privacy no longer can mean anonymity”. On this view, the role of government and businesses is to institute proper measures to safeguard privacy rights.
Opponents of this view argue that in such a scenario, citizens are “expected to give up control of privacy to others, who – presumably – get to decide how much of it you deserve. That's what loss of liberty looks like.”
Yet another perspective holds that there is an inevitable trade-off between privacy and security: privacy rights are implicitly diminished whenever security is taken into consideration, based on the premise that “you must surrender a little privacy if you want more security.”
Such a tension between privacy and security came to prominence when the Department of Justice sought to compel Apple to create a backdoor allowing the FBI to bypass iPhone encryption and access information that might uncover the activities of the two terrorists who killed 14 people in San Bernardino. Apple, in its filing, put forth a passionate defense of privacy rights on both technical and legal grounds.
Apple’s position was reinforced by the Electronic Frontier Foundation, which argued that “It would be great if we could make a backdoor that only the FBI could walk through. But that doesn’t exist. And literally every single mathematician, cryptographer, and computer scientist who’s looked at it has agreed.”
Perhaps one of the most thought-provoking positions is the one posited by David S. Kris in his paper Digital Divergence. His main thesis is that, with advances in digital network technology, privacy rights are harder to protect and security imperatives more difficult to enforce, as “digital network technology creates more private data of which less is relevant to security. All other things being equal, more private data is bad for privacy, but more irrelevant data—data pertaining to innocent persons—is bad for security because of the haystack effect.”
These arguments notwithstanding, privacy and security may be approached as a continuum. Privacy rights cannot exist without security. In fact, privacy principles, such as those found in modern privacy legislation like the GDPR, embed the confidentiality, integrity, and availability of personally identifiable information as essential requirements for ensuring privacy compliance.
Organizations should explore privacy-enhancing technologies that strike a balance between privacy and security. Privacy by Design “seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through a dated, zero-sum approach, where unnecessary trade-offs are made. Privacy by Design avoids the pretense of false dichotomies, such as privacy vs. security, demonstrating that it is possible to have both.”
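To make the complementary relationship concrete, consider pseudonymization, one privacy-enhancing technique encouraged by the GDPR. The sketch below is an illustrative assumption rather than a method drawn from any of the sources cited above: it uses a keyed hash (HMAC) to replace a direct identifier with a pseudonym, so a data set remains useful for analysis while a security control, the secret key, protects the individual's identity. The record fields and key handling shown are hypothetical.

```python
import hmac
import hashlib


def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Return a keyed, one-way pseudonym for a direct identifier.

    An HMAC rather than a plain hash is used so the mapping cannot be
    reproduced or reversed without the secret key, which can be stored
    and rotated separately from the data set.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


# Hypothetical record containing personally identifiable information.
record = {"email": "jane.doe@example.com", "purchase_total": 42.50}

# In practice the key would come from a secrets manager, not source code.
key = b"replace-with-a-managed-secret"

# The shared record keeps its analytic value (the purchase data) while the
# direct identifier is replaced by a pseudonym protected by the key.
shared_record = {
    "email_pseudonym": pseudonymize(record["email"], key),
    "purchase_total": record["purchase_total"],
}
print(shared_record)
```

The point of the example is that the same cryptographic safeguard that secures the data is what delivers the privacy protection, illustrating the positive-sum rather than zero-sum relationship that Privacy by Design advocates.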