Is no privacy the best AI privacy policy?
by James Brusseau, Director, Data Ethics Site, Pace University, New York
“You have zero privacy anyway,” Scott McNealy, CEO of Sun Microsystems, scolded a concerned journalist in 1999, “Get over it.” He was wrong then. And if he said it again today, he would still be wrong. Tomorrow, though, is not so clear.
Most AI policy discussions about privacy concern strategies for its preservation, along with carve-outs for the relaxation of safeguards. So we have policies banning facial recognition, and contact tracers that strip privacy in the name of fighting a pandemic.
No doubt these are the most important and currently pressing privacy issues, but they also hide the deeper and decisive question as technology increasingly pries into our lives: Should we be willing to accept full exposure, life without privacy?
No answer will be proposed here, but to ask the question seriously, we first need to get a sense of what the human experience would be if privacy were gone. That is the subject of this article, which begins about a decade before Apple computers went mainstream: in the 1970s, with a philosophical thought experiment authored by Robert Nozick.
What he imagined was a floating tank of warm water, a sensory deprivation chamber where electrodes wrapped around our heads feed our synapses a tantalizing experience indistinguishable from lived reality. You could choose your experiences beforehand – like choosing a book from the library – and then go into the tank.
Presumably, you could choose as many experiences as you would like, and a single narrative could go on for years, even decades. Nozick called this an Experience Machine, and the reason he formulated it was to ask a difficult hypothetical question: Would you permanently trade your outside life for an existence floating inside the tank?
If the answer is yes, every moment would be as thrilling or heroic or opulent as you could possibly wish. But, it would all only happen in your mind. It is a hard call. What is at stake, though, is easy to see: mental gratification versus personal freedom. You get prefabricated episodes guaranteed to feel good, exciting, or luxurious, at least in your mind. However, you lose the ability to create experiences and form a unique identity for yourself out in the unpredictable world.
A post-privacy reality offers an analogous choice: pleasure for freedom. It promises satisfactions, but requires relinquishing control over our own destinies. First, what does post-privacy mean? Private companies can not only purchase information about our age, location, recent credit card purchases and webpage visits, but also access all the text messages, telephone calls and pixels exchanged with every friend and romance.
There would be full access to medical records and work histories, and not just the topline statistics, but the history of every moment of every day as captured by video and audio surveillance. More could be added – all this is just the beginning of full exposure – but it is enough to begin sketching the pleasure or freedom dilemma.
As an initial and crude example, there are Netflix movies. By combining personal information about the viewer with predictive analytics, a film or episode is selected to begin rolling even before the previous one ends. The transition is almost seamless. Now, if Netflix knows everything about you – if there is no privacy – then it is probably going to be a good choice, one that holds your attention and wraps you in contentment as you keep watching.
It is also true, though, that your autonomy is being limited. This is because you get what you want before making any choices: on one level, you don’t choose another movie from a list, and then, above that, you don’t even choose whether to watch a movie at all, because it’s already playing. You are trapped in front of the screen. It’s a good trap, because the confinement is built from your own data and entertainment, but it remains confinement.
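To make that mechanism concrete, here is a minimal sketch of the autoplay loop in Python. Everything in it – the profile fields, the catalog, the scoring heuristic – is hypothetical and vastly simplified; real recommenders use learned models over far richer behavioral data. The point is only the shape of the pipeline: data in, prediction out, and no moment at which the viewer is asked.

```python
# A minimal, purely illustrative sketch of autoplay-style selection.
# The profile, catalog, and scoring heuristic are all hypothetical.

viewer_profile = {
    "finished": {"political thriller", "dark comedy"},  # genres watched to the end
    "abandoned": {"reality tv"},                        # genres clicked away from
}

catalog = [
    {"title": "Title A", "genre": "political thriller"},
    {"title": "Title B", "genre": "reality tv"},
    {"title": "Title C", "genre": "dark comedy"},
]

def score(item: dict, profile: dict) -> float:
    """Crude affinity score: reward finished genres, punish abandoned ones."""
    s = 0.0
    if item["genre"] in profile["finished"]:
        s += 1.0
    if item["genre"] in profile["abandoned"]:
        s -= 1.0
    return s

def next_title(profile: dict, catalog: list) -> dict:
    """Pick the next episode before the viewer ever asks for one."""
    return max(catalog, key=lambda item: score(item, profile))

# The 'choice' is made for the viewer: playback can begin as the
# previous credits roll, with no menu ever shown.
print(next_title(viewer_profile, catalog)["title"])  # -> Title A
```

Swap in a learned model and total behavioral data and the loop is the same: the selection happens before any question of choosing can arise.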
One series that has been selected by the mechanisms of surveillance capitalism for many of the people reading this sentence is Black Mirror. There is an episode depicting a couple in a restaurant getting served their dishes just before asking to see the menu, and in a big data future of embraced transparency, that anticipatory provisioning should not be disconcerting but expected. Stronger, it is a central reason for relinquishing privacy and exposing ourselves to the uses and satisfactions of AI. It is good to get what you want before you want it.
Generalizing, it will not only be movie selections and dinner choices. All our wants will be answered so immediately that we won’t even have time to understand why they are right, when we started wanting them, or even what it was that we wanted in the first place.
If the big data experience machine is functioning as it should – if it is instantly transforming complete personal information and predictive analytics into total consumer and user satisfactions – then we are not choosing anymore and, far more significantly, we’re not choosing to not choose. When we always already have what we want, no question about making a selection can even arise.
For that reason, one of the twisted curiosities about life after privacy is that the only way we realize we want something is this: we already have it. More, if we do feel the urge for a slice of pizza, or to binge Seinfeld, or to ignite a romantic fling, what that really means is: we don’t actually want it. We can’t, since the core idea of the AI service economy is that it knows us transparently and responds to our urges so perfectly that they’re answered without even a moment of suffering an unfulfilled craving.
It would be interesting to know whether something can be truly pleasurable if we get it before realizing a hunger for it, but no matter the answer, the drowning of personal desire in convenience and comfort is a significant temptation. On the other hand, there is no avoiding the cost: we no longer have the personal freedom to make anything of ourselves.
Active self-determination is gone, since there is no room for experimenting with new possibilities, or for struggling with what’s worth pursuing and having. There are only the tranquil satisfactions that initially and powerfully recommended that we expose ourselves to big data reality and predictive analytics. Total exposure – reality without privacy – means a life so perfectly satisfying that we cannot do anything with it.
The unconditional surrender of our personal information is a big data Stockholm syndrome. Viewed from outside, users must seem enamored of the information sets and algorithms that control their experiences: those whose personal freedom has been arrested are actually grateful to their captors for the pleasures their minds experience.
But, from within the experience the very idea of being captive does not make sense since it is impossible to encounter – or even conceptualize – any kind of restraint. If we always get everything we want – movies, dinners, jobs, lovers, everything – before we even know we want them, how could we feel anything but total liberation?