It’s increasingly hard to write about the future, mostly due to its plural, consistently amorphous nature. The best we can do is to come up with a well-informed guess or speculative invention. Hypothetical scenarios can help us plan for events that could happen if the conditions are right, but there’s a strong argument for not speculating too far or adopting too radical a political imagination – sometimes extreme dystopias and utopias don’t help address the now.
Future Politics falls dead centre in terms of the possible world it describes. There are no flying cars in Susskind’s wranglings, and the book is all the better for it. There’s plenty of subtle, pervasive advocacy of blockchain – which I won’t go into for fear of publicly hand-wringing about this contemporary emperor’s new clothes – but, in general, the text reads as deeply rational and measured, offering careful advice on how to view emerging technological anxieties. In confronting the power that tech firms are rapidly accumulating, for instance, Susskind calls for two kinds of regulation: structural (“ensuring the technologies of power don’t become too concentrated in the hands of a small number of firms and individuals”) and transparency-based, compelling tech firms to prioritise legibility and simplicity in how data and algorithms are used. Problems with technological consent and access are clearly broken down (“necessary consent is not really consent at all”) and our inability to remove ourselves from abusive systems is made clear: “I didn’t choose to be the subject of surveillance[…] I’m given no choice but to engage with [surveillance] technology.”
Recent years have seen publishers bring out a number of books on technology and politics – from James Bridle’s New Dark Age: Technology and the End of the Future (Verso, 2018), and Nick Srnicek and Alex Williams’s Inventing the Future: Postcapitalism and a World Without Work (Verso, 2015), to Adam Greenfield’s Radical Technologies: The Design of Everyday Life (Verso, 2017) and Jaron Lanier’s Ten Arguments For Deleting Your Social Media Accounts Right Now (Bodley Head, 2018). It’s hard not to see a common denominator here, and it’s not just Verso Books. But Susskind’s work is something of an anomaly amongst this profusion. It doesn’t make any particularly bold or inflammatory statements, such as Lanier’s claim that “We’re all lab animals now”, and it doesn’t put forward a revolutionary perspective that promises to unbutton all we have ever known about capitalism, politics and society (“fully automated luxury communism” anyone?). What Susskind offers instead is good common sense. I hesitate to say common sense as if it’s something ultimately objective, because that presupposes that there’s a constancy or continuity in knowledge around these matters – which there isn’t, otherwise there wouldn’t be so many books. Reading Future Politics is like going on a really good, but not spectacular, date. You have decent conversation, your companion is pleasant and has good table manners (or politics, in Susskind’s case), but you don’t quite get that spark. You’d highly recommend them to a friend, however.
The way Susskind unfolds arguments comes from a clear understanding of the law and political theory, as well as a familiarity with the myriad ideas around emerging technologies that are starting to haunt our everyday encounters with these structures. Susskind leads the reader through often complex issues, such as prejudice and bias in technology, emphasising, for example, the role that “data-based injustice” has to play, where “no matter how smart an algorithm is, if it is fed a partial or misleading view of the world it will treat unjustly those who have been hidden from view or presented in an unfair light.” Previous claims of data having an inherent neutrality are refuted (something Susskind calls “the neutrality fallacy”) and the author is also explicit and fair in directing attention to those affected by particular shifts in power between technology producers and state entities, both positively (the tech giants and their shareholders) and negatively (those from marginalised groups). Each idea, concept or theory is rationally explored, and laid out with patience and care.
So why does this feel so unremarkable? Perhaps the issue is that we have convinced ourselves of the need for radical alternatives to our contemporary political reality – of positions to place ourselves behind and directions to aspire to. But we also need sound, solid advice that is orientated towards dealing with what is immediately in front of us. What is so uninspiring about practical advice when it’s perhaps what we most need?
The law is complex and open to interpretation and exploitation; the same is even more true of political theory. Susskind’s clarity in explaining these concepts is astounding, but his book nevertheless lacks an acknowledgement that humans don’t do particularly well with the rational. From the socio-political (examples include flat-earth conspiracy theorists and the internet manhunt of Sunil Tripathi, falsely accused of being the Boston Marathon bomber) to the personal (knowing you shouldn’t give that guy a second chance), being sensible is a matter of opinion. I have my own rationality threshold that differs from everyone else’s, and how I feel about something may depend on a number of factors – from major news events, grief or heartbreak, through to whether I’ve had a coffee or not. It’s safe to say that rationality carries a heavy degree of nuance and subjectivity.
Our decision-making is often driven by something messier. Frequently, we are guided more by emotion and perception than by logical conclusion. There’s not necessarily anything wrong with this – in its most positive incarnation, it is what gives us the conviction to think beyond what is currently possible – but in its worst manifestations this form of motivation stokes the violent discourse that has led to the recent resurgence of xenophobia and aggravated nationalism around the world. Logically, for instance, migration is not a nationwide cause for alarm in the UK. As reported by the Office for National Statistics, in the year 2017-2018 net long-term international migration to the UK rose to 273,000, less than 0.5 per cent of the total population. But to those who feel that their livelihoods or nationhood are at risk, there’s nonetheless cause to find trouble in expert (and therefore elite) opinion. As Susskind draws out, perception-control is a key form of exerting power over people – “to control what they know, what they think, and what they are prepared to say about the world”. The presentation of facts and figures – of data – is arguably one of the most contested battlegrounds of the last 50 years, with data visualisation showing how the same information can often be presented in conflicting ways. For a resonant example, see how infographics designer Simon Scarr and data analyst Andy Cotgreave were able to use the same facts and figures to produce infographics respectively titled ‘Iraq’s Bloody Toll’ and ‘Iraq: Deaths on the Decline’.
Data itself, as Susskind and many others who study the subject will attest, has always been an important means to control, judge and influence. As Lisa Gitelman and Virginia Jackson write in their introduction to “Raw Data” is an Oxymoron, “Data need to be imagined as data to exist and function as such.” Essentially, we decide what data is, and we make a subject of it. Data is not a priori knowledge and it does not exist as an ultimate, objective truth. The conclusions we find in data are entirely contextual and have to work within a system that has assigned truths that may conflict with our own. Susskind puts it this way: “Anyone who has ever dealt with the tax authorities, the educational system or any other complex bureaucracy knows that the truth hardly matters. What’s written on your form is far more important.”
Data is subjective and experiential; although intangible, it is certainly embodied. Sun-ha Hong, in his 2015 essay ‘Presence or the Sense of Being-There and Being-With in the New Media Society’, writes: “I am told my personal data is being exploited, but I do not quite ‘feel’ it.” In our dealings with information, Hong puts forward the concept of a trace-body – a “phenomenological connection” between the visceral and the digital, “composed entirely of data”, which can also be seen as the “data double” or “data doppelgänger”. In French, it is the “ombre numérique”, your digital shadow. The trace-body has social and economic value, although it is not clear what that value is until it is systematically exploited. But a trace-body is not felt, as Hong continues: “the life and usage of the trace-body remains severed from our affective processes.”
In spite of this, it may now be important to think about bodies with and alongside technology a little differently. Our sense of embodiment has shifted in the digital age and is (mostly) detached from the immediately corporeal: we are not slapped or pinched by data; rather, it is transformed into something tied to our lived experience of the world, in the places where bodies can or cannot go due to a misalignment of information. In 2012, the statistician Andrew Pole told New York Times journalist Charles Duhigg how his work at the department store Target had extended to using data to determine whether customers were pregnant: “We knew that if we could identify them in their second trimester, there’s a good chance we could capture them for years,” said Pole. “As soon as we get them buying diapers from us, they’re going to start buying everything else too.” The brand’s subsequent use of targeted advertising and coupons for baby products, however, ended up inadvertently revealing the pregnancy of a teenage girl to her father, severely limiting her autonomy. Her purchases had become subject to the opaque recommendation algorithms to which Target submits its customers’ data. Similarly, people of colour, whose faces are not “read” with the same accuracy by facial recognition software as white people’s – because many of the training databases the technology relies on (along with the teams who programme it) are not very diverse – still feel the effects of data when facial recognition is used to identify supposed offenders. During the 2017 Notting Hill Carnival, for instance, one person was incorrectly detained as a result of the technology – an incident where misidentification led to real, physical invasion. We still feel horror when information about us is abused, or denies us what we think we deserve – from insurance inequalities resulting from where you live, through to access to social services and support.
As Susskind mentions throughout Future Politics, a version of you has already been written into code: “our actions, utterances, movements, relationships, emotions and beliefs will leave a permanent or semi-permanent digital mark.” But what the book does not explore to any great degree is the individual, or even collective, emotional response to the way data affects us, which drives so much of our political vitriol. I was recently teaching a group of students who told me about the feeling of “shoeburyness” (as coined by the writer Douglas Adams): the uncomfortable feeling of sitting on a chair (or toilet seat) previously warmed by someone else. I like to think of that feeling when thinking about our lives with data and our deeply textured, irrational (although frequently self-represented as rational) responses to its misappropriation. That discomfort and sense of abstracted presence may be key to understanding our reactions to these changes – a necessary response to a perpetual condition of uncertainty and not knowing what will come next.
Each misperceived data point – read in error or applied irresponsibly – has a person attached to it. Technological efficiency, when naively applied or not properly thought through, can have serious consequences. As Safiya Umoja Noble writes in her brilliant Algorithms of Oppression: “The implications of such marginalisation are profound. The insights about sexist and racist biases[…] are important because information organizations, from libraries to schools and universities to governmental agencies, are increasingly reliant on or being displaced by a variety of web-based ‘tools’ as if there are no political, social, or economic consequences of doing so.” Susskind’s work is useful in understanding what theoretical frameworks we might consider to help prepare for exponential change in technology and politics, with the introduction of concepts such as data democracy, under which some decisions are taken “on the basis of data rather than votes”. But while he does clearly outline who will be most affected by such changes, we still need a more embodied view. Rationality is difficult to accept when rationales have not been kind to you. One of the most interesting recent arguments against making technologies such as facial recognition less prejudiced is that they are so often used as a means of exercising power against people of colour. There is a form of safety in illegibility within the current biased technologies, the thinking goes, which protects communities from more efficient violence being enacted upon them. Why facilitate the stop and search of black teenagers when the police and other state forces have shown distinct, observable racial bias? To be seen, in this instance, is not to be recognised, but to become a better target.
Future Politics gives recommendations rather than manifestos, and is clear in underlining that it will take a long time before we see real change. It refreshingly veers away from the kind of alarmist rhetoric (think Lanier’s lab-rat comparison) that often stymies nuanced understanding of complex matters. But Susskind’s sobriety and restraint in the face of chaos are perhaps why the book failed to resonate with me. A turn towards emotional and embodied reactions, consequences, and the possibility of changing entrenched power dynamics could give us a greater steer on what to prioritise when thinking about the future. The personal is political, as the feminist Carol Hanisch once wrote, and this will not change as technology becomes ever more embedded in our daily lives. Whoever holds power will decide what constitutes a person to be governed, and what constitutes a body to be digitised depends very much on where you are standing.