Shadow text and self-surveillance
Some ten years ago, I had an experience which will be familiar to most people, and I think about it often. It concerns the Zach Braff film Garden State, although the film itself (thankfully) is unimportant. But I’ll come to that in a moment.
Essays begin by talking about an event in the past because all life-writing begins in this way: with a piece of time brought forth by the power of memory and the written word to make sense of the present moment. These snapshots contain an embryo of the future which lies in wait, or at least, that’s the idea. We trace backwards, re-folding time to accommodate our best guess.
These events might resemble points fixed on a graph which foretell action to come only after a subsequent collection of data. We move forward, blindly, adjusting our beliefs to make sense of things in retrospect. To the psychoanalyst, these discrete moments possess a sort of genetic structure, holding the key to subsequent behaviours, reactions and fixations. They are dredged back up from the murk of what we only half recall; only, increasingly, it is not memory alone but digitally-enabled anamnesis which comes to our aid. This is the self-surveillance which captures our history online, holding it in place on a ‘Cloud’ which is no such thing: the opposite of immaterial, a vast hardware data centre pulling in megawatts of electricity to furnish our looking back. This is what I want to talk about.
My anecdote concerns something which I believe we lack a term for. Closely associated with ideas such as nostalgia and déjà vu, it has not yet percolated down through our collective self-understanding into a single term which gives us a quick, easily understood shorthand. (Perhaps other languages have coined it already and I write in ignorance.)
And so, Garden State. Forgive me. A crap film which I liked well enough at fifteen, when my brother invited me along to the cinema. Some years later, I was chatting to a new friend at university when the film came up. I remarked that the soundtrack, at least, was quite good. She offered similar sentiments. We talked about the songs we both liked on it. We were both a little drunk. Then, a flash of recognition. I realised we’d had the exact same conversation a few months before. Almost down to the individual sentences, our exchange was an exact replica of the first. It dawned on me that we’d been set on a loop, like an RPG video-game conversation in which each character stands and waits to deliver a cued response. Life suddenly took on a mechanical or pneumatic atmosphere and I felt depressed for a while. (At that age, frankly, this happened a lot.)
A few months ago, a friend and I were exchanging WhatsApp messages when something similar happened. I was gripped by an unsettling sense that we were retreading old ground, exchanging the same thoughts. Not talking so much as filling in the gaps on a pre-existing form.
This time around, however, I had the entire history of our conversations for the last five years within my grasp. I could, with ease, go back and compare: Ctrl-F through years and years of exchanged platitudes, questions and opinions. I was afraid to, and so didn’t. I couldn’t bear to know if my fears were close to the mark, that our personalities are less unpredictable and our experiences less randomised than we like to imagine.
I crave a term for the moment of self-consciously discovering that we are prone to repeat ourselves given the right stimuli, like two perfectly equal spheres in a Newton’s cradle. A self-encounter with our ‘digital exhaust’ (to borrow a now out-of-favour term for our online data) which tells us more about who we are than we might like to know. Isn’t that cowardice? you might ask. To shy away from the truth of who you are? Aren’t the apps and tools which allow self-surveillance of our diet, exercise regime or leisure habits really there to help us – in the same way a mirror keeps us from looking unkempt and down-at-heel?
I’m probably ill-equipped to answer such questions, but that won’t keep me from trying.
To see yourself is never natural, although our present age is striving to make it so. We film one another covertly to try and catch our friends and family acting unselfconsciously. Watching the footage back is the closest you might come to seeing yourself as others see you, but this is an impossibility. Our reflections are always othered by our own self-awareness. We bring to our own image such a mess of ego complications that we can never really see ourselves fully. The field of psychology has spent over a century trying to allow people the luxury of understanding themselves better, as something of a consolation.
So who, then, faces us in our data trails? Is it our true self, or a figure like the doppelganger or the fetch? Our walking double who mirrors back the things we might not like to see? A former Google employee, the data scientist Seth Stephens-Davidowitz, has written an entire book (Everybody Lies, 2017) dedicated to a fairly common-sense observation, where self-examination is concerned: that people misreport information about themselves to present a better self-image. Our ego is so domineering that we insist on trying to look better than we are at every opportunity, but not in the search bar. Our true desires and instincts are known best by the people we should probably trust the least, the ‘Surveillance Capitalists’ whom Shoshana Zuboff takes to task in her much-praised 2019 book.
The question of whether data acts as a mirror to our true self is a pertinent one. The first point to make is a simple one. Whilst the action of light on a reflective surface can be said to be impartial, nothing could be further from the truth in the wholesale harvesting and processing of our personal data. The very act of collecting it is imbued with intent, designed for predictive models to help guide markets and advertising; Zuboff identifies this as a natural consequence of the ‘behavioral surplus’ which characterises online business. The more that is known about us, the more we can be manipulated through ‘choice architecture’ and well-timed nudges. The version of us that exists in what Zuboff calls the ‘shadow text’ of behind-the-scenes data aggregation is perhaps most like a doppelganger, only more pathetic: more like a clone without vital organs, kept alive in a tank, angel-investor equity in its veins in place of blood.
The other point to make, with regard to data, mirrors and prediction, is what a mess of self-understanding we’ve been landed in, courtesy of the gradual creep of surveillance into our everyday lives. Data scientists pride themselves on getting better and better at creating the tools and machine learning programmes which capture our choices and decisions in order to predict the ones to come. Generally speaking, people are unmoved by this idea. When the product is free, you are the product, we say, either blithely or with a hint of darkling resignation. However, as Zuboff has made clear, this Information Age truism is lacking in truth. ‘We are not “the product” of free service Tech behemoths,’ she points out, ‘we are the objects from which the raw predictive materials of their business models are extracted.’
Our existence, to the giants of tech, is a husk thrown away once the nutrient core is extracted. A pleasingly awful idea, but one which is hard to deny. We feed ourselves into the computer and they predict what we will do next. It’s a reversal of the old forms of divination, the ancient technocratic specialisms which were much in demand a few millennia ago. Back before the birth of Jesus Christ, the haruspex was to the ancient world what a full-stack developer is to the biggest blue-chip companies trading today. They had offal and charts; we have Linux.
Scholars can’t be sure why the ancients looked to the entrails of animals, specifically, to divine the future, but it’s generally understood to have grown out of the initial practice of scapegoating and sacrifice. First they would kill the goat to pacify the gods; then they would inspect its liver to try and understand what had so aroused the gods’ ire to begin with. Today, we do the same thing via different means. The utopian tech visionary encourages us to scapegoat our past selves. The before-and-after photo is fast becoming the sine qua non of health and beauty marketing and social media narcissism. We are told to despise our former selves, to shun them in favour of the tech-empowered becoming that will soon arrive. Once the nasty business is done, we hand our bodies over to a third party who divines what it was we did, how we got there and why.
Big Tech hates the past, and not just our personal history: all that came before should be sacrificed in the name of forward momentum, fixes and the latest patch. As Shoshana Zuboff has meticulously pointed out, the maximally efficient future which Tech utopians dream of is descended in part from the ideas of B.F. Skinner and his much-feared magnum opus of behavioural psychology, Beyond Freedom and Dignity (1971). Individuality, moral agency and self-determination were vain ideas, in Skinner’s view, a relic of the Enlightenment which could be put to one side as a quasi-superstition now that hard science could be relied on to guide us. (Harvard graduate students who worked alongside Skinner in the mid-70s, Zuboff remarks, jokingly referred to the book as ‘Toward Slavery and Humiliation’.)
It’s worth noting that we’ve been here before, to some extent. In the late nineteenth century, philosophers of science were dismayed by a growing confidence that all the natural laws of the universe were making themselves known in the laboratory. Suzanne Guerlac, in her book Thinking in Time: An Introduction to Henri Bergson (2006), writes convincingly of the parallels between then and now, and of the horror with which writers such as Bergson treated these claims:
‘We can measure things, count them and make predictions about them because they are governed by logical and natural laws – the law of causality for example and the law of the conservation of matter. If, however, we extend scientific modes of thinking to ourselves, Bergson insisted, we would become like things. If we try to measure and count our feelings, to explain and predict our motives and actions, we will be transformed into automatons – without freedom, without beauty, without passion, and without dreams. We will become mere phantoms of ourselves.’
We know now that the self-satisfaction of scientific positivism had distinctly unwelcome consequences. The Great War and the rise of the Third Reich remain a testament to how unspeakably awful things may become when logic and calculation are allowed to turn human life into a question of instrumentalization. By the 1930s, quantum mechanics and similar complications came to spoil the ‘solving’ of nature which had seemed inevitable to scientists at the end of the nineteenth century. Perhaps we await a similar shift in understanding today. To hear Silicon Valley CEOs talk about humanity and our existence, it’s quite obvious that they don’t even begin to understand it. Many of us devoutly wish, even without knowing it entirely, that we could sidestep any future calamities before the dawning of Surveillance Capitalism runs us off a similarly grotesque existential cliff edge.
Writers and thinkers such as Henri Bergson took refuge in old ideas in trying to formulate quite mystical arguments in support of autonomy and free will (that time is an energy, and can’t properly be understood after it’s recorded, to vulgarize his ideas). Jason Josephson Storm has written an entire book – The Myth of Disenchantment (2017) – to demonstrate that ‘serious’ thinkers have not shunned magic and mysticism, but instead embraced it whenever necessary. The resurgence of interest in astrology may make some groan, but it demonstrates a desire to reject the sort of cynical data worship that B.F. Skinner believed would cure us of the superstition of selfhood.
I crave a term for the moment of anxiety that awaits us in the recognition of our repetitive self-surveillance because language is one of the great refuges into which humanity coils itself when new threats emerge. To name the thing, as Zuboff also points out, is to tame it. To see the horror of how we are instrumentalizing ourselves, we need an apt terminology. Siri and Alexa and Cortana cannot replace the human ear and brain because listening is about more than collecting information. The psychoanalyst listens to us, intimately, in the hope that our willing, lengthy divulgences will make patterns known to us. Our one-sided conversation serves the therapeutic function of helping us see how we repeat mistakes, get locked into loops of morbidity and go round and round. Surveillance Capitalists desire the same one-sided conversation because they wish to keep us locked in those same loops: submerged in an unconscious holding pattern where our impulses go unchecked and, if anything, are actively encouraged, handed to the highest bidder to boost sales and turn our mental anguish into PayPal gold.
Naming the experience. I can’t think of anything good to call it, so let’s call it ‘unrecognition’ until something better comes along. This person who plays out a life in the Shadow Text of digital exhaust is supposed to be you, but they couldn’t possibly be recognised in any meaningful sense, for the simple reason that they are built from A/B decisions and the segmentation of data points, like a hyper-Cubist image, or a CGI figure composed of a complex lattice of lines and dots. They are your double, seen through the eyes of a data scientist possessing no deeper understanding of human life than statistical probabilities and unenlightened info-accumulation.
Towards the end of The Age of Surveillance Capitalism, Shoshana Zuboff spends time discussing the idea of ‘the Hive’ which social media creates for its users. She describes it as a digital connectedness which leaves us wired into an online community at all times, forever looking to it for reassurance or merely the comfort of being heard. She fears the damaging effect such wiring-in will have on today’s adolescents and what it will do to their personal development and moral individuation. Zuboff speaks admiringly of the philosopher Gaston Bachelard, whose book The Poetics of Space (1958) explored the role that our homes play in helping us to think, dream and feel at peace. The place of sanctuary, Zuboff argues, is crucial for our wellbeing: we need more of such spaces, free from the surveillance of Big Tech and the herding motivations of social media, in order to be fully ourselves in the world.
I believe something similar in the case of self-surveillance. We need a place of refuge from the doppelganger who dwells in the online Shadow Text. If the reality of our desires is best known to Tech companies, then the creation of that doppelganger, and knowledge of its existence, is only likely to do us harm. In his book on Lacan – Looking Awry (1991) – Slavoj Žižek talks about ‘the real of our desire’ in Lacanian terms, something we can only encounter in dreams, when our deepest and darkest wishes come true:
‘our common everyday reality, the reality of the social universe in which we assume our usual roles of kind-hearted, decent people, turns out to be an illusion that rests on a certain “repression,” on overlooking the real of our desire. This social reality is then nothing but a fragile symbolic cobweb that can at any moment be torn aside by an intrusion of the real.’
Perhaps it is cowardly to wish to hide from what we are, to disavow our own potential monstrosity, our selfishness, neediness and petty fixations. But hiding from such things, arguably, allows civilisation to continue, maintaining social cohesion for a while longer. If a dark future awaits us, in all senses, then we don’t need the pain of accumulating unseen shadows, harvested by self-styled visionaries who wish to ‘solve’ humanity.
Tom Duggins is a loose conglomeration of words and ideas. He has written for The Guardian, Little White Lies, Vice UK and The Quietus amongst others. You can find him on Twitter @duggins_tom.