How the Internet Turned Sour: Jon Rafman and the Closing of the Digital Frontier

This essay was first published by IM1776 on 17th August 2021

A tumble-drier is dragged out into someone’s garden and filled with something heavy — a brick perhaps. After setting it spinning, a figure in a camouflage jacket and protective face visor retreats from the camera frame. Immediately the machine begins to shudder violently, and soon disintegrates as parts fly off onto the surrounding lawn. 

This is the opening shot of Mainsqueeze, a 2014 video collage by the Canadian artist Jon Rafman. What comes after is no less unsettling: a young woman holds a small shellfish, stroking it affectionately, before placing it on the ground and crushing it slowly under her heel; an amateur bodybuilder, muscles straining grotesquely, splits a watermelon between his thighs. 

Rafman, concerned about the social and existential impact of technology on contemporary life, discovered these and many other strange performances while obsessively trawling the subaltern corners of the internet — communities of trolls, pranksters and fetishists. The artist’s aim, however, isn’t to ridicule these characters as freaks: on the contrary, he maintains: “The more marginal, the more ephemeral the culture is, the more fleeting the object is… the more it can actually reflect and reveal ‘culture at large.’” What looks at first like a glimpse into the perverse fringes is really meant to be a portrait of online culture in general: a fragmented world of niche identities and uneasy escapism, where humor and pleasure carry undercurrents of aggression and despair. With such an abundance of stimulation, it’s difficult to say where satisfaction ends and enslavement begins.

Even as we joke about the pathologies of online life, we often lose sight of the depressing arc the internet revolution has followed during the past decade. It’s impossible to know exactly what lies behind the playful tone of Twitter and the carefree images of Instagram, but judging by the personal stories we hear, there’s no shortage of addiction (to social media, porn, smartphones), identity crisis, and anxiety about being judged or exposed. It seems much of our online existence is now characterized by the same sense of hyper-alert boredom, claustrophobia and social estrangement that Rafman found at the margins of the internet years ago.

Indeed, the destructive impulses of Rafman’s trolls seem almost quaint by comparison to the shaming and malicious gossip we take for granted on social media. And whereas a plurality of outlooks and personalities was once the glory of the internet, today every conceivable subject, from art and sports to haircuts, food, and knitting, is reified as a divisive issue within a vast political metanarrative.

In somewhat of an ironic twist, last year Rafman himself was dropped or suspended by numerous galleries following accusations of inappropriate sexual behavior, leveled through the anonymous Instagram account Surviving the Artworld (which publishes allegations of abusive behavior in the art industry). The accusers say they felt taken advantage of by the artist; Rafman insists that there was a misunderstanding. It’s always hard to know what to make of such cases, but that social media now serves as a mechanism for this kind of summary justice seems symptomatic of the social disintegration portrayed in works like Mainsqueeze.

Even if these accusations mark the end of Rafman’s career, his efforts to document online culture now seem more valuable than ever. His art gives us a way of thinking about the internet and its discontents that goes beyond manipulative social media algorithms, ideological debasement or the culture wars. The artist’s work shows the evolution of the virtual realm above all as a new chapter of human experience, seeking to represent the structures of feeling that made this world so enticing and, ultimately, troubled.

The first video by Rafman I came across reminded me of Swift’s Gulliver’s Travels. Begun in 2008, the visionary Kool-Aid Man in Second Life consists of a series of tours through the virtual world platform Second Life, where users have designed a phantasmagorical array of settings in which their avatars can lead, as the name suggests, another life. In the video, our guide is Rafman’s own avatar, the famous Kool-Aid advertising mascot (a jug of red liquid with a weird rictus grin) — a protagonist that reminds us we’ve entered an era where, as Rafman puts it, “different symbols float around equally and free from the weight of history.” For the entire duration, Kool-Aid Man wanders around aimlessly in a surreal, artificial universe, sauntering in magical forests and across empty plains, through run-down cityscapes and futuristic metropolises, placidly observing nightclub dance floors, ancient temples, and the endless stages where the denizens of Second Life perform their sexual fantasies.

Kool-Aid Man in Second Life is best viewed against the backdrop of the great migration onto the internet which started in the mid-2000s, facilitated by emerging tech giants like Amazon, Google and Facebook. For the great majority of people, this was when the internet ceased being merely a toolbox for particular tasks and became part of everyday life (the art world jargon for this was ‘post-internet’). The artwork can be seen as a celebration of the curiosity, fun, and boundless sense of possibility that accompanied this transition. Humanity was stepping en masse out of the limits of physical space, and what it found was both trivial and sublime: a kitsch world of selfies and cute animals as well as effortless new forms of association and access to knowledge. The euphoric smile of Kool-Aid Man speaks to the birth of online mass culture as an innocent adventure.

Similar themes appear also in Rafman’s more famous (and ongoing) early work The Nine Eyes of Google Street View, in which the artist collects peculiar images captured by Google Maps’ vehicles. Scenes include a magnificent stag bounding down a coastal highway, a clown stepping into a minibus, a lone woman breastfeeding her child in a desolate landscape of dilapidated buildings. As in Rafman’s treatment of Second Life, such eclectic scenes are juxtaposed to portray the internet as an emotional voyage of discovery, marked by novel combinations of empathy and detachment, sincerity and irony, humour and desire. But in hindsight, no less striking than the spirit of wonder in these works are the ways they seem to anticipate the unravelling of online culture. 

If there’s something ominous about the ornate dream palaces of Second Life, it comes from our intuition that the stimulation and belonging offered by this virtual community are also a measure of alienation. The internet gives us relations with people and things that have the detached simplicity of a game, which only become more appealing as we find niches offering social participation and identity. But inevitably, these ersatz lives become a form of compulsive retreat from the difficulties of the wider world and a source of personal and social tension. Rafman’s Second Life is a vivid metaphor for how virtual experience tempts us with the prospect of a weightless existence, one that can’t possibly be realised and must, ultimately, lead to resentment.

Equally prescient was Rafman’s emphasis on the breakdown of meaning, as words, images, and symbols of all kinds become unmoored from any stable context. Today, all ‘content’ presents itself much like the serendipitous scenes in The Nine Eyes of Google Street View – an arbitrary jumble of trivial and profound, comic and tragic, impressions stripped of semantic coherence and flattened into passing flickers of stimulation. Symbols are no longer held firm in their meaning by clearly defined contexts where we might expect to find them, but can be endlessly mixed and refashioned in the course of online communication. This has been a great source of creativity, most obviously in the form of memes, but it has also produced neurosis. Today’s widespread sensitivity to the alleged violence concealed in language and representation, and the resulting desire to police expression, seems to reflect deep anxiety about a world where nothing has fixed significance.

These more ominous trends dominate the next phase of Rafman’s work, where we find pieces like Mainsqueeze. Here Rafman plunges us into the sordid underworld of the internet, a carnival of adolescent rebellion and perverse obsessions. A sequence of images showing a group of people passed-out drunk, one with the word “LOSER” scrawled on his forehead, captures the overall tone. In contrast to Rafman’s Second Life, where the diversity of the virtual realm could be encompassed by a single explorer, we now find insular and inaccessible communities, apparently basking in an angry sense of estrangement from the mainstream of culture. Their various transgressive gestures — swastikas, illicit porn, garish make-up — seem tinted with desperation, as though they’re more about finding boundaries than breaking them.

This portrayal of troll culture has some unsettling resonances with the boredom and anxiety of internet life today. According to Rafman himself, however, the wider relevance of these outcasts concerns their inability to confront the forces shaping their frustrated existence. Trapped in a numbing cycle of distraction, their subversive energy is channelled into escapist rituals rather than any kind of meaningful criticism of the society they seem to resent. Seen from this perspective, online life comes to resemble a form of unknowing servitude, a captive state unable to grasp the conditions of its own deprivation.

All of this points to the broader context which is always dimly present in Rafman’s work: the architecture of the virtual world itself, through which Silicon Valley facilitated the great migration onto the internet over the past fifteen-odd years. In this respect, Rafman’s documentation of Second Life becomes even more interesting, since that platform really belonged to the pre-social media cyberpunk era, making it a eulogy for the utopian ethos of the early internet, with its dreams of transcending the clutches of centralised authority. The power that would crush those dreams is represented, of course, by the Google Street View car — the outrider of big tech on its endless mission to capitalise on all the information it can gather.

But how does this looming corporate presence relate to the disintegration of online culture traced by Rafman? The artist’s comments about misdirected critical potential suggest one depressing possibility: the internet is a power structure which sustains itself through our distraction, addiction and alienation. We might think of Huxley’s Brave New World, but with shitposting and doom-scrolling instead of the pleasure-drug soma. Rafman’s most recent animation work, Disaster under the Sun, seems to underscore this dystopian picture. We are given a God’s-eye perspective over a featureless grey landscape, where crowds of faceless human forms attack and merge into one another, their activities as frantic and vicious as they are lacking any apparent purpose.

It’s certainly true that the internet giants have gained immense wealth and power while overseeing the profound social and political dislocations of the last decade. But it’s also true that there are limits to how far they can benefit from anarchy. This might explain why we are now seeing the emergence of something like a formal constitutional structure to govern the internet’s most popular platforms: Facebook, whose Oversight Board now even provides a court of appeal for its users, but also Twitter, Google, and now PayPal. The consolidation of centralized authority over the internet resembles the closing of a frontier, as a once-lawless space of discovery, chaos and potential is settled and brought under official control.

Rafman’s work allows us to grasp how this process of closure has also been a cultural and psychological one. We have seen how, in his art, the boundlessness of the virtual realm, and our freedom within it, are portrayed not just as a source of wonder but also of disorientation and insecurity. There have been plenty of indications that these feelings of flux have made people anxious to impose order, whether in the imagined form of conspiracy theories or by trying to enforce new norms and moral codes.

This isn’t to say that growing regulation will relax the tensions that have overtaken online culture. Given the divergence of identities and worldviews illustrated by Rafman’s depiction of the marginal internet, it seems highly unlikely that official authority can be impartial; drawing boundaries will involve taking sides and identifying who must be considered subversive. But all of this just emphasises that the revolutionary first chapter of internet life is drawing to a close. For better or worse, the particular spirit of discovery that marked the crossing of this frontier will never return.

Tooze and the Tragedy of the Left

Adam Tooze is one of the most impressive public intellectuals of our time. No other writer has the Columbia historian’s skill for laying bare the political, economic and financial sinews that tie together the modern world.

Tooze’s new book, Shutdown: How Covid Shook the World’s Economy, provides everything his readers have come to expect: a densely woven, relentlessly analytical narrative that uncovers the inner workings of a great crisis – in this case, the global crisis sparked by the Covid pandemic in 2020.

But Shutdown provides something else, too. It shows with unusual clarity that, for all his dry detachment and attention to detail, Tooze’s view of history is rooted in a deep sense of tragedy.

Towards the end of the book, Tooze reflects on the escalating “polycrisis” of the 21st century – overlapping political, economic and environmental conflagrations:

In an earlier period of history this sort of diagnosis might have been coupled with a forecast of revolution. If anything is unrealistic today, that prediction surely is. Indeed, radical reform is a stretch. The year 2020 was not a moment of victory for the left. The chief countervailing force to the escalation of global tension in political, economic, and ecological realms is therefore crisis management on an ever-larger scale, crisis-driven and ad hoc. … It is the choice between the third- and fourth-best options.

This seems at first typical of Tooze’s hard-nosed realism. He has long presented readers with a world shaped by “crisis management on an ever-larger scale.” Most of his work focuses on what, in Shutdown, he calls “functional elites” – small networks of technocratic professionals wielding enormous levers of power, whether in the Chinese Communist Party or among the bureaucrats and bankers of the global financial system.

These authorities, Tooze emphasises, are unable or unwilling to reform the dynamics of “heedless global growth” which keep plunging the world into crisis. But their ability to act in moments of extreme danger – the ability of the US Federal Reserve, for instance, to calm financial markets by buying assets at a rate of $1 million per second, as it did in March last year – is increasingly our last line of defence against catastrophe. The success or failure of these crisis managers is the difference between our third- and fourth-best options.

But when Tooze notes that radical change would have been thinkable “in an earlier period of history,” it is not without pathos. It calls to mind a historical moment that looms large in Tooze’s work. 

That moment is the market revolution of the 1980s, the birth of neoliberalism. For Tooze, this did not just bring about an economic order based on privatisation, the free movement of goods and capital, the destruction of organised labour and the dramatic growth of finance.

More fundamentally, neoliberalism was about what Tooze calls “depoliticisation.” As the west’s governing elites were overtaken by dogmas about market efficiency, the threat of inflation and the dangers of government borrowing, they hard-wired these principles into the framework of globalisation. Consequently, an entire spectrum of possibilities concerning how wealth and power might be distributed was closed off to democratic politics.

And so the inequalities created by the neoliberal order became, as Tony Blair said of globalisation, as inevitable as the seasons. Or in Thatcher’s more famous formulation, There Is No Alternative.

Tooze’s view of the present exists in the shadow of this earlier failure; it is haunted by what might have been. As he bitterly observes in Shutdown, it might appear that governments have suddenly discovered the joys of limitless spending, but this is only because the political forces that once made them nervous about doing so – most notably, a labour movement driving inflation through wage demands – have long since been “eviscerated.”

But it seems to me that Tooze’s tragic worldview reveals a trap facing the left today. It raises the question: what does it mean to accept, or merely to suspect, that radical change is off the table? 

We glimpse an answer of sorts when Tooze writes about how 2020 vindicated his own political movement, the environmentalist left. The pandemic, he claims, showed that huge state intervention against climate change and inequality is not just necessary, but possible. With all the talk of “Building Back Better” and “Green Deals,” centrist governments appear to be getting the message. Even Wall Street is “learning to love green capitalism.”

Of course, as per the tragic formula, Tooze does not imagine this development will be as transformative as advertised. A green revolution from the centre will likely be directed towards a conservative goal: “Everything must change so that everything remains the same.” The climate agenda, in other words, is being co-opted by a mutating neoliberalism. 

But if we follow the thrust of Tooze’s analysis, it’s difficult to avoid the conclusion that realistic progressives should embrace this third-best option. Given the implausibility of a genuine “antisystemic challenge” – and in light of the fragile systems of global capitalism, geopolitics and ecology which are now in play – it seems the best we can hope for is enlightened leadership by “functional elites.”

This may well be true. But I think the price of this bargain will be higher than Tooze acknowledges.

Whether it be climate, state investment, or piecemeal commitments to social justice, the guardians of the status quo have not accepted the left’s diagnosis simply because they realise change is now unavoidable. Rather, these policies are appealing because, with all their moral and existential urgency, they can provide fresh justification for the unaccountable power that will continue to be wielded by corporate, financial and bureaucratic interests. 

In other words, now that the free-market nostrums of neoliberalism 1.0 are truly shot, it is the left’s narratives of crisis that will offer a new basis for depoliticisation – another way of saying There Is No Alternative.

And therein lies the really perverse tragedy for a thinker like Tooze. If he believes the choice is survival on these terms or not at all, then he will have to agree.

The Fall of Zuma Threatens More Chaos for South Africa

This article was originally published by Unherd on 1st July 2021

It was a moment South Africans thought would never come. On Tuesday the Constitutional Court sentenced former president Jacob Zuma to 15 months in prison, after he refused to testify at an inquiry into corruption during his time in office.

When that inquiry reaches its conclusion, Zuma could face a much longer sentence — an amazing prospect. For now though, the simple willingness of the court to punish such blatant recalcitrance offers tantalising hope that the rule of law is not dead in South Africa.

The verdict was surprising given that Zuma still commands a significant power base in the ruling African National Congress party. The eye-watering levels of graft that marked his 2009-18 presidency mean there are plenty of ANC figures at every level of government who want the anti-corruption drive of his successor, Cyril Ramaphosa, to fail.

And therein lies the more ominous question posed by Tuesday’s ruling. Even if Zuma hands himself over to the authorities as instructed, he won’t do it quietly. So could this lead to an escalation of the already murderous internal politics of the ANC – an all-out civil war within the party that drags the nation into the abyss?

The Zuma presidency was a waking nightmare for those of us who prayed that, after its miraculously peaceful transition from apartheid to democracy, South Africa’s governing elite would resist the slide into gangsterism which has squandered the potential of so many African nations. This was always a danger with the ANC because, being the party of Mandela and the heroic anti-apartheid struggle, it was destined to rule virtually unopposed during the first decades of democracy.

Zuma’s infamous Nkandla homestead in KwaZulu-Natal, for which he fleeced the public purse to the tune of £14 million, offers a flavour of his regime’s conspicuous venality. More serious was his gutting of the criminal justice system, paving the way for the kind of corruption that would make a hardened kleptocrat blush. At the current inquiry, witnesses have lined up to detail how Zuma effectively handed control of much of the state to a notorious trio of shady businessmen known as the Gupta brothers. Apparently these cronies installed government ministers, siphoned money from state-owned companies and cashed in on lucrative contracts. Prosecutors claim as much as £50 billion was swindled from state coffers.

With the ANC having lost ground in recent elections, Ramaphosa’s campaign to clean up the party might be a sign of democratic pressures finally kicking in. More cynically, we might note that the president needs to purge Zuma’s faction to consolidate his own leadership. At any rate, Ramaphosa knows corruption has to be addressed if South Africa is to attract the investors it sorely needs. Youth unemployment stands at a grim 75%, while millions of its citizens have only the most rudimentary housing and sanitation. Its tax base continues to shrink as wealthier citizens flee appalling levels of violent crime.

By insisting that Zuma be subject to the law, the Constitutional Court’s latest ruling suggests a positive outcome to this saga is still possible. But it remains far from clear what direction the ANC’s internal struggle will take — and ultimately, it’s this struggle that will determine the country’s future.

Disaster Junkies

We live in an era where catastrophe looms large in the political imagination. On the one side, we find hellacious visions of climate crisis and ecological collapse; on the other, grim warnings of social disintegration through plummeting birth rates, mass immigration and crime. Popular culture’s vivid post-apocalyptic worlds, from Cormac McCarthy’s The Road to Margaret Atwood’s The Handmaid’s Tale, increasingly echo in political discourse – most memorably in Donald Trump’s 2016 inauguration speech on the theme of “American Carnage.” For more imaginative doom-mongers there are various technological dystopias to contemplate, whether AI run amok, a digital surveillance state, or simply the replacement of physical experience with virtual surrogates. Then in 2020, with the eruption of a global pandemic, catastrophe crossed from the silver screen to the news studio, as much of the world sat transfixed by a profusion of statistics, graphs and harrowing reports of sickness and death.

If you are anything like me, the role of catastrophe in politics and culture raises endless fascinating questions. How should we explain our visceral revulsion at fellow citizens dying en masse from an infectious disease, and our contrasting apathy to other forms of large-scale suffering and death? Why can we be terrified by climate change without necessarily feeling a commensurate urgency to do something about it? Why do certain political tribes obsess over certain disasters?

It was questions like these that led me to pick up Niall Ferguson’s new book, Doom: The Politics of Catastrophe. I did this somewhat nervously, it must be said. I found one of Ferguson’s previous books extremely boring, and tend to cringe at his use of intellectual gimmicks – like his idea that the past success of Western civilisation can be attributed to six “killer apps.” Then again, Ferguson’s contrarianism does occasionally produce an interesting perspective, such as his willingness to weigh the negative aspects of the British Empire against the positive, as historians do with most other empires. But as I say, it was really the subject of this latest book that drew me in.

I might as well say upfront that I found it very disappointing. This is going to be a bad review – though hopefully not a pointless one. The flaws of this book can, I think, point us towards a richer understanding of catastrophe than Ferguson himself offers.

Firstly, Doom is not really about “the politics of catastrophe” as I understand that phrase. A few promising questions posed in the introduction – “Why do some societies and states respond to catastrophe so much better than others? Why do some fall apart, most hold together, and a few emerge stronger? Why does politics sometimes cause catastrophe?” – are not addressed in any sustained way. What this book is really about is the difficulty of predicting and mitigating statistically irregular events which cause excess deaths. That sounds interesting enough, to be sure, but there’s just one fundamental problem: Ferguson never gets to grips with what actually makes such events catastrophic, leaving a rather large hole where the subject of the book should be. 

The alarm bells start ringing when Ferguson introduces the book as “a general history of catastrophe” and, in case we didn’t grasp how capacious that sounds, tells us it will include:

not just pandemics but all kinds of disasters, from the geological (earthquakes) to the geopolitical (wars), from the biological (pandemics) to the technological (nuclear accidents). Asteroid strikes, volcanic eruptions, extreme weather events, famines, catastrophic accidents, depressions, revolutions, wars, and genocides: all life – and much death – is here.

You may be asking if there is really much of a relationship, throughout all the ages of history, between asteroid strikes, nuclear accidents and revolutions – and I’d say this gets to a pretty basic problem with tackling a subject like this. Writing about catastrophe (or disaster – the two are used as synonyms) requires finding a way to coherently group together the extremely diverse phenomena that might fall into this category. It requires, in other words, developing an understanding of what catastrophe actually means, in a way that allows for useful parallels between its different manifestations.

Ferguson seems to acknowledge this when he rounds off his list by asking “For how else are we to see our disaster [i.e. Covid] – any disaster – in proper perspective?” Yet his concept of catastrophe turns out to be circular, inconsistent and inadequate. Whatever aspect of catastrophe Ferguson happens to be discussing in a particular chapter becomes, temporarily, his definition of catastrophe as such. When he is talking about mortality, mortality becomes definitive of catastrophe (“disaster, in the sense of excess mortality, can take diverse forms and yet pose similar challenges”). Likewise when he is showing how infrequent and therefore hard to predict catastrophes are (“the rare, large scale disasters that are the subject of this book”). In Ferguson’s chapter seeking similarities between smaller and larger disasters, he seems happy to simply accept whatever is viewed as a disaster in the popular memory: the Titanic, Chernobyl, the explosion of NASA’s space shuttle Challenger.

This is not nitpicking. I’m not expecting the metaphysical rigor of Immanuel Kant. I like an ambitious, wide-ranging discussion, even if that means sacrificing some depth. But attempting this without any real thesis, or even a firm conceptual framework, risks descending into a series of aimless and confusing digressions which don’t add up to anything. And that is more or less what happens in this book.

Consider Ferguson’s chapter on “The Psychology of Political Incompetence.” After a plodding and not especially relevant summary of Tolstoy’s concluding essay in War and Peace, Ferguson briefly introduces the idea that political leaders’ power is curtailed by the bureaucratic structures they inhabit. He then cuts to a discussion of the role of ideology in creating disastrous food shortages, by way of supporting Amartya Sen’s argument that democratic regimes respond better to famines than non-democratic ones. It’s not clear how this relates to the theme of bureaucracy and leadership, but this is one of the few sections where Ferguson is actually addressing something like “the politics of catastrophe;” and when he poses the interesting question of “why Sen’s theory does not apply to all forms of disaster” it feels like we are finally getting somewhere.

Alas, as tends to be the case in this book, Ferguson doesn’t answer the question, but embarks on a series of impromptu arguments against straw men. A winding discussion of British ineptness during the two World Wars brings him to the conclusion that “Democracy may insure a country against famine; it clearly does not insure against military disaster.” Who said that it does? Then Ferguson has suddenly returned to the issue of individual leadership, arguing that “it makes little sense” to hold Churchill solely responsible for the fall of Singapore to the Japanese in 1942. Again, who said we should? Ferguson then rounds off the chapter with an almost insultingly cursory discussion of “How Empires Fall,” cramming eight empires into less than five pages, to make the highly speculative argument that imperial collapse is as unpredictable as various other kinds of disaster.

Insofar as anything holds this book together, it is the thin sinews of statistical probability models and network science. These do furnish a few worthwhile insights. Many of the events Ferguson classes as disasters follow power-law distributions, which is to say they have no typical scale: events vastly larger than the average occur with small but non-negligible frequency. So big disasters are essentially impossible to predict. In many cases, this is because they emerge from complex systems – natural, economic and social – which can unexpectedly amplify small events into enormous ones. In hindsight, these often seem to have been entirely predictable, and the Cassandras who warned of them are vindicated. But a regime that listened to every Cassandra would incur significant political costs in preparing for disasters that usually won’t materialize.

I also liked Ferguson’s observation that the key factor determining the scale of a disaster, in terms of mortality, is “whether or not there is contagion – that is, some way of propagating the initial shock through the biological networks of life or the social networks of humanity.” But his other useful comments about networks come in a single paragraph, and can be quoted without much further explanation:

If Cassandras had higher centrality [in the network], they might be more often heeded. If erroneous doctrines [i.e. misinformation] spread virally through a large social network, effective mitigation of disaster becomes much harder. Finally… hierarchical structures such as states exist principally because, while inferior to distributed networks when it comes to innovation, they are superior when it comes to defence.

I’m not sure it was necessary to have slogged through an entire chapter on network science, recycled from Ferguson’s last book, The Square and the Tower, to understand these points.

But returning to my main criticism, statistical and network analysis doesn’t really allow for meaningful parallels between different kinds of catastrophe. This is already evident in the introduction, when Ferguson states that “disaster takes too many forms for us to process with conventional approaches to risk mitigation. No sooner have we focused our minds on the threat of Salafi jihad than we find ourselves in a financial crisis originating in subprime mortgages.” As this strange comment suggests, the implied perspective of the book is that of a single government agency tasked with predicting everything from financial crises and terrorist attacks to volcanic eruptions and genocides. But no such agency exists, of course, for the simple reason that when you zoom in from lines plotted on a graph, the illusion that these risks are similar dissolves into a range of totally different phenomena attached to various concrete situations. The problem is absurdly illustrated when, having cited a statistical analysis of 315 conflicts between 1820 and 1950, Ferguson declares that in terms of predictability, “wars do indeed resemble pandemics and earthquakes. We cannot know in advance when or where a specific event will strike, nor on what scale.” Which makes it sound like we simply have no way of knowing whether the next conflict is more likely to break out in Gaza or Switzerland.

In any case, there is something patently inadequate about measuring catastrophe in terms of mortality figures and QALYs (quality-adjusted life years), as though the only thing we have in common is a desire to live for as long as possible. Not once is the destruction of culture or ways of life mentioned in the book, despite the fact that throughout history these forms of loss have loomed large in people’s sense of catastrophe. Ferguson even mentions several times that the most prolific causes of mortality are often not recognised as catastrophes – but does not seem to grasp the corollary that catastrophe is about something more than large numbers of deaths. 

Indeed, maybe the best thing that can be said about Doom is that its shortcomings help us to realise what does need to be included in an understanding of catastrophe. Throughout the book, we see such missing dimensions flicker briefly into view. In his discussion of the flu pandemic of the late 1950s, Ferguson notes in passing that the Soviet launch of the Sputnik satellite in October 1957 “may help to explain why the memory of the Asian flu has faded” in the United States. This chimes with various other hints that this pandemic was not really perceived as a catastrophe. But why? And in what sense was it competing with the Cold War in the popular imagination? Likewise, Ferguson mentions that during the 1930s the lawyer Basil O’Connor used “the latest techniques in advertising and fundraising” to turn the “horrific but relatively rare disease” of polio into “the most feared affliction of the age.” This episode is briefly contrasted to the virtual silence of the American media and political class over AIDS during the 1980s.

In fact, unacknowledged catastrophes are an unacknowledged theme of the book. It re-emerges in several intriguing mentions of the opioid epidemic in the United States, with its associated “deaths of despair.” At the same time as there was “obsessive discussion” of global warming among the American elite, Ferguson points out, “the chance of dying from an overdose was two hundred times greater than the chance of being killed by a cataclysmic storm.” He also describes the opioid crisis as “the biggest disaster of the Obama presidency,” and suggests that although “the media assigned almost no blame to Obama” for it, “such social trends did much to explain Donald J. Trump’s success.” Finally, Ferguson notes that during the current Covid crisis, the relative importance of protecting the vulnerable from the disease versus maintaining economic activity became an active front in the American culture war. 

The obvious implication of all this is that, while Ferguson does not really engage with “the politics of catastrophe,” the concept and reality of catastrophe is inherently political. There isn’t really an objective measure of catastrophe: the concept implies judging the nature and consequences of an event to be tragic. Whether or not something meets this standard often depends on who it affects and whether it fits into the emotionally compelling narratives of the day. The AIDS and opioid epidemics initially went unrecognized because their victims were homosexuals and working-class people respectively. To take another example, the 1921 pogrom against the affluent African American community in Tulsa, Oklahoma, was for the longest time barely known about, let alone mourned (except of course by African Americans themselves); yet a hundred years later it is being widely recognised as an atrocity. Last week’s volcanic eruption in the Democratic Republic of Congo, which may have left 20,000 people homeless, would probably be acknowledged as catastrophic by a Westerner who happened to read about it in the news. But we are much more likely to be aware of, and emotionally invested in, the disastrous Israeli-Palestinian conflict of recent weeks.

Catastrophe, in other words, is inextricably bound up with popular perception and imagination. It is rooted in the emotions of fear, anger, sadness, horror and titillation with which certain events are experienced, remembered or anticipated. This is how we can make sense of apathy to the late-1950s flu pandemic: such hazards, as Ferguson mentions, were still considered a normal part of life rather than an exceptional danger, and people’s minds were focused on the potential escalation of the Cold War. Hence also the importance of the media in determining whether and how disasters become embedded in public discourse. While every culture has its religious and mythical visions of catastrophe (a few are mentioned in a typically fleeting discussion near the start of Doom), today Netflix and the news media have turned us into disaster junkies, giving form and content to our apocalyptic impulses. The Covid pandemic has been a fully mediated experience, an epic rollercoaster of the imagination, its personal and social significance shaped by a constant drumbeat of new information. It is because climate change cannot be made to fit this urgent tempo that it has instead been cast as a source of fatalism and dread, always looming on the horizon and inspiring millions with a sense of terrified helplessness.

Overlooking the central role of such cultural and political narratives probably meant that Ferguson’s Doom was doomed from the start. For one thing, this missing perspective immediately shows the problem with trying to compare catastrophes across all human history. Yes, there are fascinating patterns even at this scale, like the tendency of extreme ideological movements to emerge in the midst of disasters – whether the flagellant orders that sprang from the 14th century Black Death, or the spread of Bolshevism in the latter part of the First World War. But to really understand any catastrophe, we have to know what it meant to the people living through it, and this means looking at the particulars of culture, politics and religion which vary enormously between epochs. This, I would argue, is why Ferguson’s attempt to compare the Athenian plague of the late 5th century BC to the Black Death in medieval England feels rather superficial. 

And whatever the historical scope, statistics simply don’t get close to the imaginative essence of catastrophe. Whether or not a disaster actually happens is incidental to its significance in our lives; many go unnoticed, others transform culture through mere anticipation. Nor do we experience catastrophes as an aggregate of death-fearing individuals. We do so as social beings whose concerns are much more elaborate and interesting than mere life and death.

How the Celebs Rule Us

Who should we call the first “Instagram billionaire”? It’s a mark of the new Gilded Age we’ve entered that both women vying for that title belong to the same family, the illustrious Kardashian-Jenner clan. In 2019, it looked like Kylie Jenner had passed the ten-figure mark, only for Forbes to revise its estimates, declaring that Jenner had juiced her net worth with “white lies, omissions and outright fabrications.” (Her real wealth, the magazine thought, was a paltry $900 million). So, as of April this year, the accolade belongs to Jenner’s no less enterprising sister, Kim Kardashian West.

Social media has ushered in a new fusion of celebrity worship and celebrity entrepreneurship, giving rise to an elite class of “influencers” like Jenner and Kardashian West. Reality TV stars who were, in that wonderful phrase, “famous for being famous,” they now rely on their vast social media followings to market advertising space and fashion and beauty products. As such, they are closely entwined with another freshly minted elite, the tech oligarchs whose platforms are the crucial instruments of celebrity today. Word has it the good people at Instagram are all too happy to offer special treatment to the likes of the Kardashians, Justin Bieber, Taylor Swift and Lady Gaga – not to mention His Holiness the Supreme Pontiff of the Universal Church (that’s @franciscus to you and me). And there’s every reason for social media companies to accommodate their glamorous accomplices: in 2018, Jenner managed to wipe $1.3 billion off the market value of Snapchat with a single tweet questioning the platform’s popularity. 

It’s perfectly obvious, of course, what hides behind the embarrassingly thin figleaf of “influence,” and that is power. Not just financial power but social status, cultural clout and, on the tech companies’ side of the bargain, access to the eyeballs and data of huge audiences. The interesting question is where this power ultimately stems from. The form of capital being harvested is human attention; but how does the tech/influencer elite monopolise this attention? One well-known answer is through the addictive algorithms and user interfaces that turn us into slaves of our own brain chemistry; another invokes those dynamics of social rivalry, identified by the philosopher René Girard, whereby we look to others to tell us what we should want. 

But I think there’s a further factor here which needs to be explored, and it begins with the idea of charisma. In a recent piece for Tablet magazine, I argued that social media had given rise to a new kind of charismatic political leader, examples of which include Donald Trump, Jeremy Corbyn, Jordan Peterson and Greta Thunberg. My contention was that the charisma of these individuals, so evident in the intense devotion of their followers, does not stem from any innate quality of their personalities. Instead, charisma is assigned to them by online communities which, in the process of rallying around a leader, galvanise themselves into political movements.

Here I was drawing on the great German sociologist Max Weber, whose concept of “charismatic authority” describes how groups of people find coherence and structure by recognising certain individuals as special. And yet, the political leaders I discussed in the Tablet piece are far from the only examples showing the relevance of Weber’s ideas today. If anything, they are interlopers: accidental beneficiaries of a media system that is calibrated for a different type of charismatic figure, pursuing a different kind of power. I’m referring, of course, to the Kardashians, Biebers, and countless lesser “influencers” of this world. It is the twin elite of celebrities and tech giants, not the leaders of political movements, who have designed the template of charismatic authority in the social media age. 

When Weber talks about charismatic authority, he is talking about the emotional and ideological inspiration we find in other people. We are compelled to emulate or follow those individuals who issue us with a “calling” – a desire to lead our lives a certain way or aspire towards a certain ideal. To take an obvious example, think about the way members of a cult are often transfixed by a leader, dropping everything in their lives to enter his or her service; some of you will recall the scarlet-clad followers of the guru Bhagwan Shree Rajneesh in the 2018 Netflix documentary Wild Wild Country. Weber’s key observation is that this intensely subjective experience is always part of a wider social process: the “calling” of charisma, though it feels like an intimate connection with an exceptional person, is really the calling of our own urge to fit in, to grasp an identity, to find purpose and belonging. There’s a reason charismatic figures attract followers, plural. They are charismatic because they represent a social phenomenon we want to be a part of, or an aspiration our social context has made appealing. Whatever Rajneesh’s personal qualities, his cult was only possible thanks to the appeal of New Age philosophy and collectivist ways of life to a certain kind of disillusioned Westerner during the 1960s and ’70s. 

Today there’s no shortage of Rajneesh-like figures preaching homespun doctrines to enraptured audiences on YouTube. But in modern societies, charismatic authority really belongs to the domain of celebrity culture; the domain, that is, of the passionate, irrational, mass-scale worship of stars. Since the youth movements of the 1950s and 60s, when burgeoning media industries gave the baby-boomers icons like James Dean and The Beatles, the charismatic figures who inspire entire subcultures and generations have mostly come from cinema and television screens, from sports leagues, music videos and fashion magazines. Cast your mind back to your own teenage years – the time when our need for role models is most pressing – and recall where you and your chums turned for your wardrobe choices, haircuts and values. To the worlds of politics and business, perhaps? Not likely. We may not be so easily star-struck as adults, but I’d wager most of your transformative encounters with charisma still come, if not from Hollywood and Vogue, then from figures projected into your imagination via the media apparatus of mass culture. It’s no coincidence that when a politician does gain a following through personality and image, we borrow clichés from the entertainment industry, whether hailing Barack Obama’s “movie star charisma” or dubbing Yanis Varoufakis “Greece’s rock-star finance minister.”

Celebrity charisma relies on a peculiar suspension of disbelief. We can take profound inspiration from characters in films, and on some level we know that the stars presented to us in the media (or now presenting themselves through social media) are barely less fictional. They are personae designed to harness the binding force of charismatic authority – to embody movements and cultural trends that people want to be part of. In the context of the media and entertainment business, their role is essentially to commodify the uncommodifiable, to turn our search for meaning and identity into a source of profit. Indeed, the celebrity culture of recent decades grew from the bosom of huge media conglomerates, who found that the saturation of culture by new media technologies allowed them to turn a small number of stars into prodigious brands.

In the 1980s performers like Michael Jackson and Madonna, along with sports icons like Michael Jordan, joined Hollywood actors in a class of mega celebrities. By the ’90s, such ubiquitous figures were flanked by stars catering to all kinds of specific audiences: in the UK, for instance, lad culture had premiership footballers, popular feminism had Sex and the City, Britpoppers had the Gallagher brothers and grungers had Kurt Cobain. For their corporate handlers, high-profile celebrities ensured revenues from merchandise, management rights and advertising deals, as well as reliable consumer audiences that offset the risks of more speculative ventures.

Long before social media, in other words, celebrity culture had become a thoroughly commercialised form of charismatic authority. It still relied on the ability of stars to issue their followers with a “calling” – to embody popular ideals and galvanise movements – but these roles and relationships were reflected in various economic transactions. Most obviously, where a celebrity became a figurehead for a particular subculture, people might express their membership of that subculture by buying stuff the celebrity advertised. But no less important, in hindsight, was the commodification of celebrities’ private lives, as audiences were bonded to their stars through an endless stream of “just like us” paparazzi shots, advertising campaigns, exclusive interviews and documentaries, and so on. As show-business sought to maximise the value of star power, the personae of celebrities were increasingly constructed in the mould of “real” people with human, all-too-human lives.

Which brings us back to our influencer friends. For all its claims to have opened up arts and entertainment to the masses, social media really represents another step towards a celebrity culture dominated by an elite cluster of stars. Digital tech, as we know, has annihilated older business models in media-related industries. This has concentrated even more success in the hands of the few who can command attention and drive cultural trends – who can be “influencers” – through the commodification of their personal lives. And that, of course, is exactly what platforms like Instagram are designed for. A Bloomberg report describes how the Kardashians took over and ramped-up the trends of earlier decades:

Back in the 1990s, when the paparazzi were in their pomp, pictures of celebrities going about their daily lives… could fetch $15,000 a pop from tabloids and magazines… The publications would in turn sell advertising space alongside those images and rake in a hefty profit.

Thanks to social media, the Kardashians were able to cut out the middle man. Instagram let the family post images that they controlled and allowed them to essentially sell their own advertising space to brands… The upshot is that Kardashian West can make $1 million per sponsored post, while paparazzi now earn just $5 to $10 apiece for “Just Like Us” snaps.

Obviously, Instagram does not “let” the Kardashians do this out of the kindness of its heart: as platforms compete for users, it’s in their interests to accommodate the individuals who secure the largest audiences. In fact, through their efforts to identify and promote such celebrities, the social media companies are increasingly important in actually making them celebrities, effectively deciding who among the aspiring masses gets a shot at fame. Thus another report details how TikTok “assigned individual managers to thousands of stars to help with everything, whether tech support or college tuition,” while carefully coordinating with said stars to make their content go viral.

But recall, again, that the power of celebrities ultimately rests on their followers’ feeling that they’re part of something – that is the essence of their charisma. And it’s here that social media really has been revolutionary. It has allowed followers to become active communities, fused by constant communication with each other and with the stars themselves. Instagram posts revealing what some celeb had for breakfast fuel a vast web of interactions, through which their fans sustain a lively sense of group identity. Naturally, this being social media, the clearest sign of such bonding is the willingness of fans to group together like a swarm of hornets and attack anyone who criticises their idols. Hence the notorious aggression of the “Beliebers,” or fanatical Justin Bieber fans (apparently not even controllable by the pop star himself); and hence Instagram rewriting an algorithm to protect Taylor Swift from a wave of snake emojis launched by Kim Kardashian followers. This, surely, is the sinister meaning behind an e-commerce executive bragging to Forbes magazine about Kylie Jenner’s following, “No other influencer has ever gotten to the volume or had the rabid fans” that she does.

In other words, the celebrity/tech elite’s power is rooted in new forms of association and identification made possible by the internet. It’s worth taking a closer look at one act which has revealed this in an especially vivid way: the K-Pop boy band BTS (the name stands for Bangtan Sonyeondan, or Beyond the Scene in English). Preppy outfits and feline good looks notwithstanding, these guys are no lightweights. Never mind the chart-topping singles, the stadium concerts and the collaborations with Ed Sheeran; their success registers on a macroeconomic scale. According to 2018 estimates from the Hyundai Research Institute, BTS contributes $3.6 billion annually to the South Korean economy, and is responsible for around 7% of tourism to the country. No less impressive are the band’s figures for online consumption: it has racked up the most YouTube views in a 24-hour period, and an unprecedented 750,000 paying viewers for a live-streamed concert.

Those last stats are the most suggestive, because BTS’s popularity rests on a fanatical online community of followers, the “Adorable Representative M.C. for Youth” (ARMY), literally numbering in the tens of millions. In certain respects, the ARMY doesn’t resemble a fan club so much as an uncontacted tribe in the rainforest: it has its own aesthetics, norms and rituals centred around worship of BTS. All that’s missing, perhaps, is a cosmology, but the band’s management is working on that. It orchestrates something called the “Bangtan Universe”: an ongoing fictional metanarrative about BTS, unfolding across multiple forms of media, which essentially encourages the ARMY to inhabit its own alternate reality. 

Such is the ARMY’s commitment that its members take personal responsibility for BTS’s commercial success. They are obsessive about boosting the band’s chart performance, streaming new content as frequently and on as many devices as possible. The Wall Street Journal describes one fan’s devotion:

When [the BTS song] “Dynamite” launched, Michelle Tack, 47, a cosmetics stores manager from Chicopee, Massachusetts, requested a day off work to stream the music video on YouTube. “I streamed all day,” Tack says. She made sure to watch other clips on the platform in between her streaming so that her views would count toward the grand total of views. […]

“It feels like I’m part of this family that wants BTS to succeed, and we want to do everything we can do to help them,” says Tack. She says BTS has made her life “more fulfilled” and brought her closer to her two daughters, 12 and 14. 

The pay-off came last October, when the band’s management company, Big Hit Entertainment, went public, making one of the most successful debuts in the history of the South Korean stock market. And so the sense of belonging which captivated that retail manager from Massachusetts now underpins the value of financial assets traded by banks, insurance companies and investment funds. Needless to say, members of the ARMY were clamouring to buy the band’s shares too.

It is this paradigm of charismatic authority – the virtual community bound by devotion to a celebrity figurehead – which has been echoed in politics in recent years. Most conspicuously, Donald Trump’s political project shared many features with the new celebrity culture. The parallels between Trump and a figure like Kylie Jenner are obvious, from building a personal brand off the back of reality TV fame to exaggerating his wealth and recognising the innovative potential of social media. Meanwhile, the immersive fiction of the Bangtan Universe looks like a striking precedent for the wacky world of Deep State conspiracy theories inhabited by diehard Trump supporters, which spilled dramatically into view with the storming of the U.S. Capitol on January 6th.

As I argued in my Tablet essay – and as the chaos and inefficacy of the Trump presidency demonstrates – this social media-based form of charismatic politics is not very well suited to wielding formal power. In part, this is because the model is better suited to the kinds of power sought by celebrities: financial enrichment and cultural influence. The immersive character of online communities, which tend to develop their own private languages and preoccupations, carries no real downside for the celebrity: it just means more strongly identified fans. It is, however, a major liability in politics. The leaders elevated by such movements aren’t necessarily effective politicians to begin with, and they struggle to broaden their appeal due to the uncompromising agendas their supporters foist on them. We saw these problems not just with the Trump movement but also with the Jeremy Corbyn phenomenon in the UK, and, to an extent, with the younger college-educated liberals who influenced Bernie Sanders after 2016.

But this doesn’t mean online celebrity culture has had no political impact. Even if virtual communities aren’t much good at practical politics, they are extremely good at producing new narratives and norms, whether rightwing conspiracy theories in the QAnon mould, or the progressive ideas about gender and identity which Angela Nagle has aptly dubbed “Tumblr liberalism.” Celebrities are key to the process whereby such innovations are exported into the wider discourse as politically-charged memes. Thus Moya Lothian-McLean has described how influencers popularise feminist narratives – first taking ideas from academics and activists, then simplifying them for mass consumption and “regurgitat[ing] them via an aesthetically pleasing Instagram tile.” Once such memes reach a certain level of popularity, the really big celebrities will pick them up as part of their efforts to present a compelling personality to their followers (which is not to say, of course, that they don’t also believe in them). The line from Tumblr liberalism through Instagram feminism eventually arrives at the various celebrities who have revealed non-binary gender identities to their followers in recent years. Celebs also play an important role in legitimising grassroots political movements: last year BTS joined countless other famous figures in publicly giving money to Black Lives Matter, their $1 million donation being matched by their fans in little more than a day.

No celebrity can single-handedly move the needle of public opinion, but discourse is increasingly shaped by activists borrowing the tools of the influencer, and by influencers borrowing the language of the activist. Such charismatic figures are the most important nodes in the sprawling network of online communities that constitutes popular culture today; and through their attempts to foster an intimate connection with their followers, they provide a channel through which the political can be made to feel personal. This doesn’t quite amount to a “celebocracy,” but nor can we fully understand the nature of power today without acknowledging the authority of stars.

The Charismatic Politics of Social Media

This essay was originally published by Tablet Magazine on 21st April 2021.

In the wake of Donald Trump’s presidency, the tone of politics has become much quieter, and not just in the United States. It’s amazing how much room this man’s personality took up in the public conversation. But we should remember that what silenced Trump was not losing an election in November 2020. It was being kicked off social media after his supporters stormed the Capitol on Jan. 6.

The decision to take away Trump’s megaphone was the natural outcome of a phenomenon that emerged around 2015, when politics was transformed by a new type of charismatic leader, unique to our own era, who emerged from a culture increasingly centered around social media platforms like Facebook, Twitter, Instagram, and YouTube. But Trump is just one example, albeit a dramatic one. On the left there is also Sen. Bernie Sanders and Rep. Alexandria Ocasio-Cortez, as well as Jeremy Corbyn, the former leader of the Opposition in the United Kingdom. There is the teenage climate activist Greta Thunberg and the cult philosopher Jordan Peterson. These men and women “went viral,” their individual charisma spread by a new, decentralized media system, and they galvanized movements that defined themselves as fighting against the established order.

Some of these figures’ time in the limelight is already over. But others will take their place, because the forces that gave rise to them are still here. To understand their appeal, we only have to turn to the influential German sociologist of the early 20th century, Max Weber. It was Weber who popularized “charisma” as a political term. And it is Weber’s concept of charismatic leadership that seems more relevant now than ever before.

Born 157 years ago tomorrow, Weber lived at a time when Western societies, and Germany especially, were being transformed by industrialization at a frantic pace. The central aim of his work was to understand how modern societies evolved and functioned in contrast to those of the past. Hailed as a brilliant young intellectual, Weber suffered a nervous breakdown around the turn of the 20th century, and subsequently produced a gloomy account of the modern world that was to be his greatest legacy. In The Protestant Ethic and the Spirit of Capitalism, published in 1905, he argued that the foundation of modernity was an ultrarational approach to organizing our lives and institutions, especially in pursuit of profit—a culture he compared to an “iron cage.”

It is against this backdrop that we find Weber’s most famous ideas about charismatic leadership. There was, he observed, a weak point in the iron cage of rationality. The modern principle that the right to govern comes from the people created an opening for charismatic politicians to gain immense power by winning the adoration of the masses. In his influential 1919 lecture Politics as a Vocation, Weber suggested the best example of this was the 19th-century British politician William Gladstone. But after Weber’s death in 1920, his theory of charismatic leadership achieved new renown, as it seemed to predict the dictatorships of Mussolini, Hitler, and Stalin.

A century later, Weber’s vision of “dictatorship resting on the exploitation of mass emotionality” fits nicely into the current moment, and may even have fed the reflexive portrayal of Trump as some sort of proto-fascist ruler. But in fact, this understanding of political charisma as purely a tool of modern demagogues is a misreading of Weber’s ideas.

Weber believed that charismatic individuals shape the politics of every era. A charismatic leader, he wrote in the posthumously published Economy and Society, has “a certain quality of an individual personality, by virtue of which he is set apart from ordinary men and treated as endowed with supernatural, superhuman, or at least specifically exceptional powers or qualities.” For Weber, the crucial element is to understand that charisma has a social function. He didn’t see charisma merely as a character trait belonging solely to the leader. He saw the desire to follow charismatic individuals as a necessary ingredient that binds groups of people together. Hence, when he laid out the three forms of authority that organize all societies, he included “charismatic authority” alongside legal structures and tradition.

What’s more, this mutually binding power of charisma doesn’t only sustain societies, according to Weber—it also transforms them. He actually thought the purest example of charismatic authority came from religious movements led by prophets, of the kind that shaped the history of Judaism, Christianity, and Islam. Here Weber describes charisma as a “revolutionary force,” because of the way prophets unite their followers with a sense of confidence and conviction that can shatter existing structures of authority. Charisma is like a spark that ignites sweeping social and cultural change.

This is the Weberian insight that opens the door to understanding the charismatic leaders of our own time. To grasp what makes an individual charismatic, we shouldn’t just focus on their personality: We should look at the people who are brought together by their mutual recognition of a leader.

Today, the social basis for much political ideology and activism comes from online subcultures, where people develop common worldviews based on spontaneous and widely shared feelings, like the sense of being betrayed by corrupt elites. It is from these virtual communities that political movements emerge, often by discovering and adopting a charismatic figure that galvanizes them. Through the rapid circulation of video clips and social media posts, an individual can be turned into a leader almost overnight.

What is remarkable about this paradigm is how much the standard relationship between leaders and followers has been reversed: These new movements are not created by their leaders, even though the leaders may command tremendous devotion. The followers “choose” their leader. The movements exist first in an unrealized form, and conjure up leaders that allow them to fully manifest and mobilize themselves.

Weber spoke of charisma being “recognized,” emphasizing the way leaders inspire their followers with a sense of purpose or spiritual “calling.” People gravitate toward individuals who give them a language to express their shared feelings and an example to follow. But what matters most is that, through this collective recognition of a figurehead, the followers cement their own social bond.

When we look at the charismatic leaders who have emerged in recent years, we don’t in fact see authoritarian figures who control their movements and bend their followers to their own distinct political visions. What we see are leaders who rise suddenly and unexpectedly, and whose actual beliefs are less important than their ability to embody the emotions that unite their devotees. Today it is the leaders who are shaped by the attitudes of their movements rather than the other way around.

Thus, Trump’s followers were never all that interested in how effectively he turned campaign slogans into reality. What held the MAGA movement together was not the content of Trump’s rather inconsistent and half-hearted declarations about policy, but the irreverent drama of rebellion that he enacted through the political theater of his rallies and Twitter posts. His leadership gave birth to intense internet communities, where diehard supporters cooked up their own narratives about his struggle against the establishment.

The point isn’t that Trump had no real power over his followers, which of course he did. The point is that his power depended on—and was limited to—the role of culture war icon that his movement created for him. Trump was effective in this role because he had no apparent strategy apart from giving his audience what it wanted, whether photo-ops brandishing a Bible, or nods and winks at online conspiracy theories.

Likewise, Sanders and Corbyn were both old men who unexpectedly found themselves riding tidal waves of youthful support. But their sudden rise from relative obscurity led to some awkward moments when some of their more strongly held views did not align with the wishes of their followers. Sanders’ campaign for president changed significantly from 2016 to 2020, as the mass movement that chose him as its leader molded him into a champion of their immigration preferences, which he had previously opposed. Similarly, in his time as leader of the British Labour Party from 2015 to 2020, Corbyn had to abandon his lifelong opposition to the European Union because he was now leading a movement that cherished EU membership as one of its core values.

Finally, consider two cases from outside the realm of official politics. Greta Thunberg is treated as a modern saint who has inspired millions to march through the world’s cities demanding action against climate change. But Thunberg’s enormous presence in the environmental movement is not matched by a unique philosophy or any organizational power. She went viral on social media during her 2018 strike outside the Swedish parliament, and her fame now rests on being invited by political and religious leaders to shout at them on camera about how her generation has been betrayed. “I understand that people are impressed by this movement,” Thunberg told the Times in 2019, “and I am also very impressed with the young people, but I haven’t really done anything. I have just sat down.”

Then there’s Canadian psychologist Jordan Peterson. Thanks to a few viral videos about free speech in 2016 and a series of controversial media engagements thereafter, Peterson went from teaching Christological interpretations of Disney films to being hailed as the messiah of the anti-woke movement. Peterson has continually stressed that he’s interested in psychology, not politics, yet what followers find captivating are his filmed broadsides against social justice ideology, which have been viewed millions of times on YouTube.

All these figures have been imbued with a certain magical status, which deepens the shared identity of their followers. Movements have gathered around them as totems embodying a fight against injustice and a spirit of revolt. Consequently, they command strong emotional attachments, though their followers are only interested in them insofar as they stay within the limits of the movement they were chosen to lead. The power of their charisma depends, therefore, on conforming to parameters set by the imagination of their followers.

Obviously, individual personality is not irrelevant here. Charismatic figures are generally regarded as authentic, based on the perception that they are not trying to meet social expectations or simply advance their careers. Seen in this way, it makes sense that a generation accustomed to the shifting trends and constant self-promotion of social media would warm to old-timers like Sanders and Corbyn, who had been stoically banging the same drum for decades.

Interestingly, both Trump and Thunberg have often had their personalities pathologized by critics: Trump on account of his “narcissistic personality disorder,” Thunberg on account of her autism and single-minded commitment to her cause. But supporters see these same qualities as refreshingly direct. This kind of appeal is necessary for leaders who want to offer their followers the personal “calling” which Weber saw as key to charisma. No one is inspired to take on the establishment by people who look and sound like they belong to it.

Nonetheless, following Weber’s lead, we don’t need to think about charisma as something that’s simply inherent to these influential personalities. In the sudden explosion of hype surrounding certain figures on social media, we see how the conviction that an individual is special can be created through collective affirmation. This is the virtual equivalent of the electrifying rallies and demonstrations where followers have gathered to see figures like Trump, Corbyn, and Thunberg: The energy is focused on the leader, but it comes from the crowd.

So what does all this tell us about the future of the new charismatic movement politics? Weber insisted that to exercise real power, charismatic authority cannot keep relying on the spiritual calling of committed followers. It must establish its own structures of bureaucracy and tradition. According to Weber, this is how prophetic religious movements of the past created lasting regimes.

But the way that today’s charismatic leaders are chosen for their expressive qualities means they usually aren’t suited to consolidating power in this way. There is a remarkable contrast between the sweeping legislative program being enacted by the uncharismatic Biden presidency and Trump’s failure to deliver on most of his signature proposals.

This does not mean that the movements inspired by charismatic figures are irrelevant—far from it. They will continue to influence politics by reshaping the social and cultural context in which it unfolds. In fact, the potential for these movements is all the more dramatic because, as recent years have shown, they can appear almost out of thin air. We do not know who the next charismatic leaders will be until after they have been chosen.

Tradition with a capital T: Dylan at 80

It’s December 1963, and a roomful of liberal luminaries are gathered at New York’s Americana Hotel. They are here for the presentation of the Emergency Civil Liberties Committee’s prestigious Tom Paine Award, an accolade which, a year earlier, had been accepted by esteemed philosopher and anti-nuclear campaigner Bertrand Russell. If any in the audience have reservations about this year’s recipient, a 22-year-old folk singer called Bob Dylan, their skepticism will soon be vindicated. 

In what must rank as one of the most cack-handed acceptance speeches in history, an evidently drunk Dylan begins with a surreal digression about the attendees’ lack of hair, his way of saying that maybe it’s time they made room for some younger voices in politics. “You people should be at the beach,” he informs them, “just relaxing in the time you have to relax. It is not an old people’s world.” Not that it really matters anyway, since, as Dylan goes on to say, “There’s no black and white, left and right to me anymore; there’s only up and down… And I’m trying to go up without thinking of anything trivial such as politics.” Strange way to thank an organisation which barely survived the McCarthyite witch-hunts, but Dylan isn’t finished. To a mounting chorus of boos, he takes the opportunity to express sympathy for Lee Harvey Oswald, the assassin who had shot president John F. Kennedy less than a month earlier. “I have to be honest, I just have to be… I got to admit honestly that I, too, saw some of myself in him… Not to go that far and shoot…”

Stories like this one have a special status in the world of Bobology, or whatever we want to call the strange community-cum-industry of critics, fans and vinyl-collecting professors who have turned Dylan into a unique cultural phenomenon. The unacceptable acceptance speech at the Americana is among a handful of anecdotes that dramatize the most iconic time in his career – the mid-’60s period when Dylan rejected/ betrayed/ transcended (delete as you see fit) the folk movement and its social justice oriented vision of music. 

For the benefit of the uninitiated, Dylan made his name in the early ’60s as a politically engaged troubadour, writing protest anthems that became the soundtrack of the Civil Rights movement. He even performed as a warm-up act for Martin Luther King Jnr’s “I Have a Dream” speech at the 1963 March on Washington. Yet no sooner had Dylan been crowned “the conscience of a generation” than he started furiously trying to wriggle out of that role, most controversially through his embrace of rock music. In 1965, Dylan plugged in to play an electric set at the Newport Folk Festival (“the most written about performance in the history of rock,” writes biographer Clinton Heylin), leading to the wonderful though apocryphal story of folk stalwart Pete Seeger trying to cleave the sound cables with an axe. Another famous confrontation came at the Manchester Free Trade Hall in 1966, where angry folkies pelted Dylan with cries of “Judas!” (a moment whose magic really rests on Dylan’s response, as he turns around to his electric backing band and snarls “play it fuckin’ loud”). 

In the coming days, as the Bobologists celebrate their master’s 80th birthday, we’ll see how Dylan’s vast and elaborate legend remains anchored in this original sin of abandoning the folk community. I like the Tom Paine Award anecdote because it makes us recall that, for all his prodigious gifts, Dylan was little more than an adolescent when these events took place – a chaotic, moody, often petulant young man. What has come to define Dylan, in a sense, is a commonplace bout of youthful rebellion which has been elevated into a symbolic narrative about a transformative moment in cultural history. 

Still, we can hardly deny its power as a symbolic narrative. Numerous writers have claimed that Dylan’s rejection of folk marks a decisive turning point in the counterculture politics of the ’60s, separating the collective purpose and idealism of the first half of the decade, as demonstrated in the March on Washington, from the bad acid trips, violent radicalism and disillusionment of the second. Hadn’t Dylan, through some uncanny intuition, sensed this descent into chaos? How else can we explain the radically different mood of his post-folk albums? The uplifting “Come gather ’round people/ Wherever you roam” is replaced by the sneering “How does it feel/ to be on your own,” and the hopeful “The answer, my friend, is blowin’ in the wind” by the cynical “You don’t need a weatherman to know which way the wind blows.” Or was Dylan, in fact, responsible for unleashing the furies of the late ’60s? That last lyric, after all, provided the name for the militant activist cell The Weathermen.

More profound still, Dylan’s mid-’60s transformation seemed to expose a deep fault line in the liberal worldview, a tension between two conceptions of freedom and authenticity. The folk movement saw itself in fundamentally egalitarian and collectivist terms, as a community of values whose progressive vision of the future was rooted in the shared inheritance of the folk tradition. Folkies were thus especially hostile to the rising tide of mass culture and consumerism in America. And clearly, had Dylan merely succumbed to the cringeworthy teenybopper rock ’n’ roll which was then topping the charts, he could have been written off as a sell-out. But Dylan’s first three rock records – the “Electric Trilogy” of Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde – are quite simply his best albums, and probably some of the best albums in the history of popular music. They didn’t just signal a move towards a wider market of consumers; they practically invented rock music as a sophisticated and artistically credible form. And the key to this was a seductive vision of the artist as an individual set apart, an anarchic fount of creativity without earthly commitments, beholden only to the sublime visions of his own interior world.

It was Dylan’s lyrical innovations, above all, that carried this vision. His new mode of social criticism, as heard in “Gates of Eden” and “It’s Alright, Ma (I’m Only Bleeding),” was savage and indiscriminate, condemning all alike and refusing to offer any answers. Redemption came instead from the imaginative power of the words and images themselves – the artist’s transcendent “thought dreams,” his spontaneous “skippin’ reels of rhyme” – his ability to laugh, cry, love and express himself in the face of a bleak and inscrutable world.

Yes, to dance beneath the diamond sky with one hand waving free
Silhouetted by the sea, circled by the circus sands
With all memory and fate driven deep beneath the waves

Here is the fantasy of artistic individualism with which Dylan countered the idealism of folk music, raising a dilemma whose acuteness can still be felt in writing on the subject today. 

But for a certain kind of Dylan fan, to read so much into the break with folk is to miss the magician’s hand in the crafting of his own legend. Throughout his career, Dylan has shown a flair for mystifying his public image (some would say a flair for dishonesty). His original folksinger persona was precisely that – a persona he copied from his adolescent hero Woody Guthrie, from the pitch of his voice and his workman’s cap to the very idea of writing “topical” songs about social injustice. From his first arrival on the New York folk scene, Dylan intrigued the press with fabrications about his past, mostly involving running away from home, travelling with a circus and riding on freight trains. (He also managed to persuade one of his biographers, Robert Shelton, that he had spent time working as a prostitute, but the less said about that yarn the better). Likewise, Dylan’s subsequent persona as the poet of anarchy drew much of its effect from the drama of his split with the folk movement, and so it’s no surprise to find him fanning that drama, both at the time and long afterwards, with an array of facetious, hyperbolic and self-pitying comments about what he was doing.

When the press tried to tap into Dylan’s motivations, he tended to swat them away with claims to the effect that he was just “a song and dance man,” a kind of false modesty (always delivered in a tone of preening arrogance) that fed his reputation for irreverence. He told the folksinger Joan Baez, among others, that his interest in protest songs had always been cynical – “You know me. I knew people would buy that kind of shit, right? I was never into that stuff” – despite numerous confidants from Dylan’s folk days insisting he had been obsessed with social justice. Later, in his book Chronicles: Volume One, Dylan made the opposite claim, insisting both his folk and post-folk phases reflected the same authentic calling: “All I’d ever done was sing songs that were dead straight and expressed powerful new realities. … My destiny lay down the road with whatever life invited, had nothing to do with representing any kind of civilisation.” He then complained (and note that modesty again): “It seems like the world has always needed a scapegoat – someone to lead the charge against the Roman Empire.” Incidentally, the “autobiographical” Chronicles is a masterpiece of self-mythologizing, where, among other sleights of hand, Dylan cuts back and forth between different stages of his career, neatly evading the question of how and why his worldview evolved.

Nor, of course, was Dylan’s break with folk his last act of reinvention. The rock phase lasted scarcely two years, after which he pivoted towards country music, first with the austere John Wesley Harding and then with the bittersweet Nashville Skyline. In the mid-1970s, Dylan recast himself as a travelling minstrel, complete with face paint and flower-decked hat, on the Rolling Thunder Revue tour. At the end of that decade he emerged as a born-again Christian playing gospel music, and shortly afterwards as an infidel (releasing the album Infidels). In the ’90s he appeared, among other guises, as a blues revivalist, while his more recent gestures include a kitsch Christmas album and a homage to Frank Sinatra. If there’s one line that manages to echo through the six decades of Dylan’s career, it must be “strike another match, go start anew.”

This restless drive to wrong-foot his audience makes it tempting to see Dylan as a kind of prototype for the shape-shifting pop idol, anticipating the likes of David Bowie and Kate Bush, not to mention the countless fading stars who refresh their wardrobes and their political causes in a desperate clinging to relevance. Like so many readings of Dylan, this one inevitably doubles back, concertina-like, to the original break with folk. That episode can now be made to appear as the sudden rupture with tradition that gave birth to the postmodern celebrity, a paragon of mercurial autonomy whose image can be endlessly refashioned through the media.

But trying to fit Dylan into this template reveals precisely what is so distinctive about him. Alongside his capacity for inventing and reinventing himself as a cultural figure, there has always been a sincere and passionate devotion to the forms and traditions of the past. Each of the personae in Dylan’s long and winding musical innings – from folk troubadour to country singer to roadshow performer to bluesman to roots rocker to jazz crooner – has involved a deliberate engagement with some aspect of the American musical heritage, as well as with countless other cultural influences from the U.S. and beyond. This became most obvious from the ’90s onwards, with albums such as Good As I Been to You and World Gone Wrong, composed entirely of covers and traditional folk songs – not to mention “Love and Theft”, a title whose quotation marks point to a book by historian Eric Lott, the subject of which, in turn, is the folklore of the American South. But these later works just made explicit what he had been doing all along.

“What I was into was traditional stuff with a capital T,” writes Dylan about his younger self in Chronicles. The unreliability of that book has already been mentioned, but the phrase is a neat way of describing his approach to borrowing from history. Dylan’s personae are never “traditional” in the sense of adhering devoutly to a moribund form; nor would it be quite right to say that he makes older styles his own. Rather, he treats tradition as an invitation to performance and pastiche, as though standing by the costume cupboard of history and trying on a series of eye-catching but not-quite-convincing disguises, always with a nod and a wink. I remember hearing Nashville Skyline for the first time and being slightly bemused at what sounded like an entirely artless imitation of country music; I was doubly bemused to learn this album had been recorded and released in 1969, the year of Woodstock and a year when Dylan was actually living in Woodstock. But it soon occurred to me that this was Dylan’s way of swimming against the tide. He may have lit the fuse of the high ’60s, but by the time the explosion came he had already moved on, not forward but back, recognising where his unique contribution as a musician really lay: in an ongoing dance with the spirits of the past, part eulogy and part pantomime. I then realised this same dance was happening in his earlier folk period, and in any number of his later chapters.

“The madly complicated modern world was something I took little interest in” – Chronicles again – “What was swinging, topical and up to date for me was stuff like the Titanic sinking, the Galveston flood, John Henry driving steel, John Hardy shooting a man on the West Virginia line.” We know this is at least partly true, because this overtly mythologized, larger-than-life history, this traditional stuff with a capital T, is never far away in Dylan’s music. The Titanic, great floods, folk heroes and wild-west outlaws all appear in his catalogue, usually with a few deliberate twists to imbue them with a more biblical grandeur, and to remind us not to take our narrator too seriously. It’s even plausible that he really did take time out from beatnik life in Greenwich Village to study 19th century newspapers at the New York Public Library, not “so much interested in the issues as intrigued by the language and rhetoric of the times.” Dylan is nothing if not a ventriloquist, using his various musical dummies to recall the languages of bygone eras. 

And if we look more closely at the Electric Trilogy, the infamous reinvention that sealed Dylan’s betrayal of folk, we find that much of the innovation on those albums fits into a twelve-bar blues structure, while their rhythms recall the R&B that Dylan had performed as a teenager in Hibbing, Minnesota. Likewise, it’s often been noted that their lyrical style, based on chains of loosely associated or juxtaposed images, shows not just the influence of the Beats, but also French symbolist poet Arthur Rimbaud, German radical playwright Bertolt Brecht, and bluesman Robert Johnson. This is to say nothing of the content of the lyrics, which feature an endless stream of allusions to history, literature, religion and myth. Songs like “Tombstone Blues” make an absurd parody of their own intertextuality (“The ghost of Belle Starr she hands down her wits/ To Jezebel the nun she violently knits/ A bald wig for Jack the Ripper who sits/ At the head of the chamber of commerce”). For all its iconoclasm, Dylan’s novel contribution to songwriting in this phase was to bring contemporary America into dialogue with a wider universe of cultural riches. 

Now consider this. Could it be that even Dylan’s disposable approach to his own persona, far from heralding the arrival of the modern media star, is itself a tip of the hat to some older convention? The thought hadn’t occurred to me until I dipped into the latest round of Bobology marking Dylan’s 80th. There I found an intriguing lecture by the critic Greil Marcus about Dylan’s relationship to blues music (and it’s worth recalling that, by his own account, the young Dylan only arrived at folk music via the blues of Lead Belly and Odetta). “The blues,” says Marcus, “mandate that you present a story on the premise that it happened to you, so it has to be written [as] not autobiography but fiction.” He explains:

words first came from a common store of phrases, couplets, curses, blessings, jokes, greetings, and goodbyes that passed anonymously between blacks and whites after the Civil War. From that, the blues said, you craft a story, a philosophy lesson, that you present as your own: This happened to me. This is what I did. This is how it felt.

Is this where we find a synthesis of those two countervailing tendencies in Dylan’s career – on to the next character, back again to the “common store” of memories? Weaving a set of tropes into a fiction, which you then “present as your own,” certainly works as a description of how Dylan constructs his various artistic masks, not to mention many of his songs. It would be satisfying to imagine that this practice is itself a refashioned one – and as a way of understanding where Dylan is coming from, probably no less fictitious than all the others.

Terra damnata

A thousand regrets

The blossoms were ravenous, and wild.
They swallowed a streetlight and turned into a huge
glowing dandelion, snatching passers-by
in their intimate net of shadows. 

No one remembered how to approach 
such a vicious thing. Finally it fell into a mosaic
of shriveled tissue, gasping in the acrid glare.

All summer the wind was herding voices
behind its flat skies, and I was one of them – 
a voice telling you I was finally ready to leave. 

I was being called away, like an untethered balloon
destined to smile down on this empty 
corner of a chessboard. Then it was autumn
and still you nodded patiently. 

The cats abandoned their stares, their boxes
of sun snapped shut. Words dissolved into flat, wet 
steps, gongs against the dark drizzle.

How Napoleon made the British

In 1803, the poet and philosopher Samuel Taylor Coleridge wrote to a friend about his relish at the prospect of being invaded by Napoleon Bonaparte. “As to me, I think, the Invasion must be a Blessing,” he said, “For if we do not repel it, & cut them to pieces, we are a vile sunken race… And if we do act as Men, Christians, Englishmen – down goes the Corsican Miscreant, & Europe may have peace.”

This was during the great invasion scare, when Napoleon’s Army of England could on clear days be seen across the Channel from Kent. Coleridge’s fighting talk captured the rash of patriotism that had broken out in Britain. The largest popular mobilisation of the entire Hanoverian era was set in motion, as some 400,000 men from Inverness to Cornwall entered volunteer militia units. London’s playhouses were overtaken by anti-French songs and plays, notably Shakespeare’s Henry V. Caricaturists such as James Gillray took a break from mocking King George III and focused on patriotic propaganda, contrasting the sturdy beef-eating Englishman John Bull with a puny, effete Napoleon.

These years were an important moment in the evolution of Britain’s identity, one that resonated through the 19th century and far beyond. The mission identified by Coleridge – to endure some ordeal as a vindication of national character, preferably without help from anyone else, and maybe benefit wider humanity as a by-product – anticipates a British exceptionalism that loomed throughout the Victorian era, reaching its final apotheosis in the Churchillian “if necessary alone” patriotism of the Second World War. Coleridge’s friend William Wordsworth expressed the same sentiment in 1806, after Napoleon had smashed the Prussian army at Jena, leaving the United Kingdom his only remaining opponent. “We are left, or shall be left, alone;/ The last that dare to struggle with the Foe,” Wordsworth wrote, “’Tis well! From this day forward we shall know/ That in ourselves our safety must be sought;/ That by our own right hands it must be wrought.”

As we mark the bicentennial of Napoleon’s death on St Helena in 1821, attention has naturally been focused on his legacy in France. But we shouldn’t forget that in his various guises – conquering general, founder of states and institutions, cultural icon – Napoleon transformed every part of Europe, and Britain was no exception. Yet the apparent national pride of the invasion scare was very far from the whole story. If the experience of fighting Napoleon left the British in important ways more cohesive, confident and powerful, it was largely because the country had previously looked like it was about to fall apart. 

Throughout the 1790s, as the French Revolution followed the twists and turns that eventually brought Napoleon to power, Britain was a tinder box. Ten years before he boasted of confronting Napoleon as “Men, Christians, Englishmen,” Coleridge had burned the words “Liberty” and “Equality” into the lawns of Cambridge University. Like Wordsworth, and like countless other radicals and republicans, he had embraced the Revolution as the dawn of a glorious new age in which the corrupt and oppressive ancien régime, including the Anglican establishment of Britain, would be swept away.

And the tide of history seemed to be on the radicals’ side. The storming of the Bastille came less than a decade after Britain had lost its American colonies, while in George III the country had an unpopular king, prone to bouts of debilitating madness, whose scandalous sons appeared destined to drag the monarchy into disgrace. 

Support for the Revolution was strongest among Nonconformist Protestant sects – especially Unitarians, the so-called “rational Dissenters” – who formed the intellectual and commercial elite of cities such as Norwich, Birmingham and Manchester, and among the radical wing of the Whig party. But for the first time, educated working men also entered the political sphere en masse. They joined the Corresponding Societies which held public meetings and demonstrations across the country, so named because of their contacts with Jacobin counterparts in France. Influential Unitarian ministers, such as the Welsh philosopher Richard Price and the chemist Joseph Priestley, interpreted the Revolution as the work of providence and possibly a sign of the imminent Apocalypse. In the circle of Whig aristocrats around Charles James Fox, implacable adversary of William Pitt’s Tory government, the radicals had sympathisers at the highest levels of power. Fox famously said of the Revolution “how much the greatest event it is that ever happened in the world, and how much the best.”

From 1793 Britain was at war with revolutionary France, and this mix of new ideals and longstanding religious divides boiled over into mass unrest and fears of insurrection. In 1795 protestors smashed the windows at 10 Downing Street, and at the opening of parliament a crowd of 200,000 jeered at Pitt and George III. The radicals were met by an equally volatile loyalist reaction in defence of church and king. In 1791, a dinner celebrating Bastille Day in Birmingham sparked three days of rioting, including attacks on Nonconformist chapels and Priestley’s home. Pitt’s government introduced draconian limitations on thought, speech and association, although his attempt to convict members of the London Corresponding Society of high treason was foiled by a jury.

Both sides drew inspiration from an intense pamphlet war that included some of the most iconic and controversial texts in British intellectual history. Conservatives were galvanised by Edmund Burke’s Reflections on the Revolution in France, a defence of England’s time-honoured social hierarchies, while radicals hailed Thomas Paine’s Rights of Man, calling for the abolition of Britain’s monarchy and aristocracy. When summoned on charges of seditious libel, Paine fled to Paris, where he sat in the National Convention and continued to support the revolutionary regime despite almost being executed during the Reign of Terror that began in 1793. Among his supporters were the pioneering feminist Mary Wollstonecraft and the utopian progressive William Godwin, who shared an intellectual circle with Coleridge and Wordsworth.

Britain seemed to be coming apart at the seams. Bad harvests at the turn of the century brought misery and renewed unrest, and the war effort failed to prevent France (under the leadership, from 1799, of First Consul Bonaparte) from dominating the continent. Paradoxically, nothing captures the paralysing divisions of the British state at this moment better than its expansion in 1801 to become the United Kingdom of Great Britain and Ireland. The annexation of Ireland was a symptom of weakness, not strength, since it reflected the threat posed by a bitterly divided and largely hostile satellite off Britain’s west coast. The only way to make it work, as Pitt insisted, was to grant political rights to Ireland’s Catholic majority – but George III refused. So Pitt resigned, and the Revolutionary Wars ended with the Treaty of Amiens in 1802, effectively acknowledging French victory.

Britain’s tensions and weaknesses certainly did not disappear during the ensuing, epic conflict with Napoleon from 1803-15. Violent social unrest continued to flare up, especially at times of harvest failure, financial crisis, and economic hardship resulting from restriction of trade with the continent. There were, at times, widespread demands for peace. The government continued to repress dissent with military force and legal measures; the radical poet and engraver William Blake (later rebranded as a patriotic figure when his words were used for the hymn Jerusalem) stood trial for sedition in 1803, following an altercation with two soldiers. Many of those who volunteered for local military units probably did so out of peer pressure and to avoid being impressed into the navy. Ireland, of course, would prove to be a more intractable problem than even Pitt had imagined.  

Nonetheless, Coleridge and Wordsworth’s transition from radicals to staunch patriots was emblematic. Whether the population at large was genuinely loyal or merely quiescent, Britain’s internal divisions lost much of their earlier ideological edge, and the threat of outright insurrection faded away. This process had already started in the 1790s, as many radicals shied away from the violence and militarism of revolutionary France, but it was galvanised by Napoleon. This was not just because he appeared determined and able to crush Britain, but also because of British perceptions of his regime. 

As Yale professor Stuart Semmel has observed, Napoleon did not fit neatly into the dichotomies through which Britain was used to defining itself against France. For the longest time, the opposition had been (roughly) “free Protestant constitutional monarchy” vs “Popish absolutist despotism”; after the Revolution, it had flipped to “Christian peace and order” vs “bloodthirsty atheism and chaos.” Napoleon threw these categories into disarray. The British, says Semmel, had to ask “Was he a Jacobin or a king …; Italian or Frenchman; Catholic, atheist, or Muslim?” The religious uncertainty was especially unsettling, after Napoleon’s “declaration of kinship with Egyptian Muslims, his Concordat with the papacy, his tolerance for Protestants, and his convoking a Grand Sanhedrin of European Jews.”

This may have forced some soul-searching on the part of the British as they struggled to define Napoleonic France, but in some respects the novelty simplified matters. Former radicals could argue Napoleon represented a betrayal of the Revolution, and could agree with loyalists that he was a tyrant bent on personal domination of Europe, thus drawing a line under the ideological passions of the revolutionary period. In any case, loyalist propaganda had no difficulty transferring to Napoleon the template traditionally reserved for the Pope – that of the biblical Antichrist. This simple fact of having a single infamous figure on which to focus patriotic feelings no doubt aided national unity. As the essayist William Hazlitt, an enduring supporter of Napoleon, later noted: “Everybody knows that it is only necessary to raise a bugbear before the English imagination in order to govern it at will.”

More subtly, conservatives introduced the concept of “legitimacy” to the political lexicon, to distinguish the hereditary power of British monarchs from Napoleon’s usurpation of the Bourbon throne. This was rank hypocrisy, given the British elite’s habit of importing a new dynasty whenever it suited them, but it played to an attitude which did help to unify the nation: during the conflict with Napoleon, people could feel that they were defending the British system in general, rather than supporting the current government or waging an ideological war against the Revolution. The resulting change of sentiment could be seen in 1809, when there were vast celebrations to mark the Golden Jubilee of the once unpopular George III. 

Undoubtedly British culture was also transformed by admiration for Napoleon, especially among artists, intellectuals and Whigs, yet even here the tendency was towards calming antagonisms rather than inflaming them. This period saw the ascendance of Romanticism in European culture and ways of thinking, and there was not and never would be a greater Romantic hero than Napoleon, who had turned the world upside down through force of will and what Victor Hugo later called “supernatural instinct.” But ultimately this meant aestheticizing Napoleon, removing him from the sphere of politics to that of sentiment, imagination and history. Thus when Napoleon abdicated his throne in 1814, the admiring poet Lord Byron was mostly disappointed he had not fulfilled his dramatic potential by committing suicide.

But Napoleon profoundly reshaped Britain in another way: the long and grueling conflict against him left a lasting stamp on every aspect of the British state. In short, while no-one could have reasonably predicted victory until Napoleon’s catastrophic invasion of Russia in 1812, the war was nonetheless crucial in forging Britain into the global superpower it would become after 1815. 

The British had long been in the habit of fighting wars with ships and money rather than armies, and for the most part this was true of the Napoleonic wars as well. But the unprecedented demands of this conflict led to an equally unprecedented development of Britain’s financial system. This started with the introduction of new property taxes and, in 1799, the first income tax, which were continually raised until by 1814 their yield had increased by a factor of ten. What mattered here was not so much the immediate revenue as the unparalleled fiscal base it gave Britain for the purpose of borrowing money – which it did, prodigiously. In 1804, the year Bonaparte was crowned Emperor, the “Napoleon of finance” Nathan Rothschild arrived in London from Frankfurt, helping to secure a century of British hegemony in the global financial system. 

No less significant were the effects of war in stimulating Britain’s nascent industrial revolution, and its accompanying commercial empire. The state relied on private contractors for most of its materiel, especially that required to build and maintain the vast Royal Navy, while creating immense demand for iron, coal and timber. In 1814, when rulers and representatives of Britain’s European allies came to Portsmouth, they were shown a startling vision of the future: enormous factories where pulley blocks for the rigging of warships were being mass-produced with steam-driven machine tools. Meanwhile Napoleon’s Continental System, by shutting British manufacturers and exporters out of Europe, forced them to develop markets in South Asia, Africa and Latin America. 

Even Britain’s fabled “liberal” constitution – the term was taken from Spanish opponents of Napoleon – did in fact do some of the organic adaptation that smug Victorians would later claim as its hallmark. The Nonconformist middle classes, so subversive during the revolutionary period, were courted in 1812-13 with greater political rights and by the relaxation of various restrictions on trade. Meanwhile, Britain discovered what would become its greatest moral crusade of the 19th century. Napoleon’s reintroduction of slavery in France’s Caribbean colonies created the conditions for abolitionism to grow as a popular movement in Britain, since, as William Wilberforce argued, “we should not give advantages to our enemies.” Two bills in 1806-7 effectively ended Britain’s centuries-long participation in the trans-Atlantic slave trade.

Thus Napoleon was not just a hurdle to be cleared en route to the British century – he was, with all his charisma and ruthless determination, a formative element in the nation’s history. And his influence did not end with his death in 1821, of course. He would long haunt the Romantic Victorian imagination as, in Eric Hobsbawm’s words, “the figure every man who broke with tradition could identify himself with.”

Europe’s vaccine cooperation isn’t federalist dogma

This week, Douglas Murray argued that the ongoing shambles of the European Union’s vaccination effort should be chalked up to federalist dogma. “There is no logical reason why EU countries could not have been allowed to pursue independent vaccine development, procurement and roll-out,” wrote Murray, except that “it has already been decided that an EU-wide approach is always the only approach.”

In fact, there are two rather large reasons that richer EU nations, who would have been better off going it alone on vaccines, chose not to: Russia and China. As Jens Spahn, the German health minister, told the Bundestag in January: “If our Eastern and Southern European partners had not received a vaccine through the EU, who would likely have stepped in? China? Russia? Would we have preferred that?”

Yes, the European Commission under Ursula von der Leyen surely saw vaccines as an opportunity to grab more powers for Brussels, but the member states still had to be persuaded. Germany, France, Italy and the Netherlands had already formed their own “Vaccine Alliance.” No doubt what made them acquiesce to von der Leyen was the nightmare scenario of returning to the dynamics which took shape during the first phase of the pandemic. Then, as European countries were squabbling over protective equipment, Russia and China had swooped in with offers of support in the Balkans and in Eastern and Central Europe, areas where they have long been trying to boost their influence. 

Even in the heart of the EU, an Italian ambassador complained last March that “not a single EU country” had responded to his nation’s pleas for assistance – only China had. In the event, Europe’s failings appeared less drastic when much of the equipment delivered by China proved faulty.   

But if last year’s “mask diplomacy” was fraught, vaccine diplomacy on the continent would have been explosive. Western Europe would have fared much better procuring vaccines for itself than under the Commission’s inept strategy. But as peripheral states fell behind they would have proclaimed European solidarity dead, while using the prospect of increased Russian and Chinese influence in their countries to blackmail the richer states for vaccines. Even with its collective approach, the EU has not been able to prevent Eastern Europe from using or threatening to use Sputnik and Sinopharm vaccines from Russia and China.

Von der Leyen’s Commission must carry the blame for its dire performance on vaccines. But Murray’s assertion that “the federalists” are responsible for the EU agreeing to a common strategy reflects a very British obsession with European integration. This was more understandable when we were still part of the EU, but it now risks clouding our judgment regarding the trade-offs our neighbours on the continent face. Under German and French leadership, Europe is trying to maintain a precarious balance of competition and trade with Russia and China, and the allegiance of Europe’s eastern periphery is crucial to that balance. This dilemma would continue to exist even if the EU did not. 

We Brits need to develop a more nuanced understanding of what motivates cooperation in Europe. It is more than the sentimental solidarity imagined by Remainers or the ideology of ever closer union feared by Eurosceptics. It is just as often the need to balance competing interests and ambitions in what remains a fractious continent.