Artificial intelligence in media, entertainment, and advertising promises euphoric convenience: perfect satisfaction for consumers who reveal their personal information. There’s a connection between that exchange and a famous philosophical thought experiment from the 1970s.
Robert Nozick proposed his Experience Machine thought experiment about a decade before Apple computers went mainstream. What Nozick imagined was a floating tank of warm water, a sensory deprivation chamber where electrodes wrapped around our heads feed our synapses a tantalizing experience indistinguishable from lived reality. Would you trade, the thought experiment asked, your outside life for existence inside the tank, one that would be as thrilling or heroic or luxurious as you wish, but only in your mind? It’s a hard call.
What’s at stake, though, is easy to see. It’s hedonistic happiness versus personal freedom: you get prefabricated episodes guaranteed to feel good, while giving up the possibility of creating experiences and an identity for yourself out in the unpredictable world.
A post-privacy reality offers an analogous choice. It promises satisfactions but implies relinquishing control over our own destinies. As a crude example, Netflix intersects with predictive analytics to begin rolling the next film before the previous one ends. If Netflix knows everything about you, then it’s probably going to be a good choice. But your autonomy is truncated because you get what you want before making any choices at all: on one level, you don’t choose another movie from a list, and above that, you don’t even choose whether to watch a movie at all because it’s already playing.
One series that has been selected by the mechanisms of surveillance capitalism for many of the people reading this sentence is Black Mirror. There’s an episode depicting a couple in a restaurant getting served their dishes before asking to see the menu, and in a big data future of embraced transparency, that anticipatory provisioning shouldn’t be disconcerting but expected. More than that, it’s a central reason for relinquishing privacy and exposing ourselves to the uses and satisfactions of the surveillance capitalists. And it won’t only be movie selections and dinner choices. All our wants will be answered so immediately that we won’t even have time to understand why they are right, when we started wanting them, or to ask what it was that we wanted in the first place. If the big data experience machine is functioning as it should, instantly transforming complete personal information into total consumer and user satisfaction, then we’re not choosing anymore and, far more significantly, we’re not choosing to not choose.
For that reason, one of the twisted curiosities about life after privacy is that the only way we realize something is what we want is that we already have it. More, if we do feel the urge for a slice of pizza, or to binge Seinfeld, or to spark a romantic fling, what that really means is: we don’t actually want it. We can’t, since the core idea of the big data service economy is that it knows us transparently and responds to our desires so perfectly that they’re answered without even a moment of suffering an unfulfilled craving.
It would be interesting to know whether something truly satisfies if we get it before realizing a hunger for it, but whatever the answer, the drowning of personal desire in convenience and comfort is a significant temptation. It is also one that implies a counterintuitive exchange. We get personal authenticity, a life incarnating who we are, yet we have no personal freedom to determine who that someone is. There can’t be freedom, since there’s no room for experimenting with new possibilities, or for struggling with what’s worth pursuing and having. There’s only room for the tranquil satisfactions that initially recommended that we expose ourselves to big data reality and the comforts of finely tuned predictive analytics.
Personal transparency means a life so perfectly ours – so authentic – that we can’t do anything with it.
Viewed from outside, the unconditional surrender of our personal information is a big data Stockholm syndrome. Users seem enamored of the information sets and algorithms that control their experiences: those whose personal freedom has been arrested are actually grateful to their captors. But, from within the experience the very idea of being captive doesn’t make sense since it’s impossible to encounter – or even conceptualize – any kind of restraint. If we always get everything we want – movies, dinners, jobs, lovers, everything – before we even know we want them, how could we feel anything but total liberation?