Tradition with a capital T: Dylan at 80

It’s December 1963, and a roomful of liberal luminaries are gathered at New York’s Americana Hotel. They are here for the presentation of the Emergency Civil Liberties Committee’s prestigious Tom Paine Award, an accolade which, a year earlier, had been accepted by esteemed philosopher and anti-nuclear campaigner Bertrand Russell. If any in the audience have reservations about this year’s recipient, a 22-year-old folk singer called Bob Dylan, their skepticism will soon be vindicated. 

In what must rank as one of the most cack-handed acceptance speeches in history, an evidently drunk Dylan begins with a surreal digression about the attendees’ lack of hair, his way of saying that maybe it’s time they made room for some younger voices in politics. “You people should be at the beach,” he informs them, “just relaxing in the time you have to relax. It is not an old people’s world.” Not that it really matters anyway, since, as Dylan goes on to say, “There’s no black and white, left and right to me anymore; there’s only up and down… And I’m trying to go up without thinking of anything trivial such as politics.” Strange way to thank an organisation which barely survived the McCarthyite witch-hunts, but Dylan isn’t finished. To a mounting chorus of boos, he takes the opportunity to express sympathy for Lee Harvey Oswald, the assassin who had shot President John F. Kennedy less than a month earlier. “I have to be honest, I just have to be… I got to admit honestly that I, too, saw some of myself in him… Not to go that far and shoot…”

Stories like this one have a special status in the world of Bobology, or whatever we want to call the strange community-cum-industry of critics, fans and vinyl-collecting professors who have turned Dylan into a unique cultural phenomenon. The unacceptable acceptance speech at the Americana is among a handful of anecdotes that dramatize the most iconic time in his career – the mid-’60s period when Dylan rejected/ betrayed/ transcended (delete as you see fit) the folk movement and its social-justice-oriented vision of music.

For the benefit of the uninitiated, Dylan made his name in the early ’60s as a politically engaged troubadour, writing protest anthems that became the soundtrack of the Civil Rights movement. He even performed as a warm-up act for Martin Luther King Jnr’s “I Have a Dream” speech at the 1963 March on Washington. Yet no sooner had Dylan been crowned “the conscience of a generation” than he started furiously trying to wriggle out of that role, most controversially through his embrace of rock music. In 1965, Dylan plugged in to play an electric set at the Newport Folk Festival (“the most written about performance in the history of rock,” writes biographer Clinton Heylin), leading to the wonderful though apocryphal story of folk stalwart Pete Seeger trying to cleave the sound cables with an axe. Another famous confrontation came at the Manchester Free Trade Hall in 1966, where angry folkies pelted Dylan with cries of “Judas!” (a moment whose magic really rests on Dylan’s response, as he turns around to his electric backing band and snarls “play it fuckin’ loud”). 

In the coming days, as the Bobologists celebrate their master’s 80th birthday, we’ll see how Dylan’s vast and elaborate legend remains anchored in this original sin of abandoning the folk community. I like the Tom Paine Award anecdote because it makes us recall that, for all his prodigious gifts, Dylan was little more than an adolescent when these events took place – a chaotic, moody, often petulant young man. What has come to define Dylan, in a sense, is a commonplace bout of youthful rebellion which has been elevated into a symbolic narrative about a transformative moment in cultural history. 

Still, we can hardly deny its power as a symbolic narrative. Numerous writers have claimed that Dylan’s rejection of folk marks a decisive turning point in the counterculture politics of the ’60s, separating the collective purpose and idealism of the first half of the decade, as demonstrated in the March on Washington, from the bad acid trips, violent radicalism and disillusionment of the second. Hadn’t Dylan, through some uncanny intuition, sensed this descent into chaos? How else can we explain the radically different mood of his post-folk albums? The uplifting “Come gather ’round people/ Wherever you roam” is replaced by the sneering “How does it feel/ to be on your own,” and the hopeful “The answer, my friend, is blowin’ in the wind” by the cynical “You don’t need a weatherman to know which way the wind blows.” Or was Dylan, in fact, responsible for unleashing the furies of the late ’60s? That last lyric, after all, provided the name for the militant activist cell The Weathermen.

More profound still, Dylan’s mid-’60s transformation seemed to expose a deep fault line in the liberal worldview, a tension between two conceptions of freedom and authenticity. The folk movement saw itself in fundamentally egalitarian and collectivist terms, as a community of values whose progressive vision of the future was rooted in the shared inheritance of the folk tradition. Folkies were thus especially hostile to the rising tide of mass culture and consumerism in America. And clearly, had Dylan merely succumbed to the cringeworthy teenybopper rock ’n’ roll which was then topping the charts, he could have been written off as a sell-out. But Dylan’s first three rock records – the “Electric Trilogy” of Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde – are quite simply his best albums, and probably some of the best albums in the history of popular music. They didn’t just signal a move towards a wider market of consumers; they practically invented rock music as a sophisticated and artistically credible form. And the key to this was a seductive vision of the artist as an individual set apart, an anarchic fount of creativity without earthly commitments, beholden only to the sublime visions of his own interior world.

It was Dylan’s lyrical innovations, above all, that carried this vision. His new mode of social criticism, as heard in “Gates of Eden” and “It’s Alright, Ma (I’m Only Bleeding),” was savage and indiscriminate, condemning all alike and refusing to offer any answers. Redemption came instead from the imaginative power of the words and images themselves – the artist’s transcendent “thought dreams,” his spontaneous “skippin’ reels of rhyme” – his ability to laugh, cry, love and express himself in the face of a bleak and inscrutable world.

Yes, to dance beneath the diamond sky with one hand waving free
Silhouetted by the sea, circled by the circus sands
With all memory and fate driven deep beneath the waves

Here is the fantasy of artistic individualism with which Dylan countered the idealism of folk music, raising a dilemma whose acuteness can still be felt in writing on the subject today. 

But for a certain kind of Dylan fan, to read so much into the break with folk is to miss the magician’s hand in the crafting of his own legend. Throughout his career, Dylan has shown a flair for mystifying his public image (some would say a flair for dishonesty). His original folksinger persona was precisely that – a persona he copied from his adolescent hero Woody Guthrie, from the pitch of his voice and his workman’s cap to the very idea of writing “topical” songs about social injustice. From his first arrival on the New York folk scene, Dylan intrigued the press with fabrications about his past, mostly involving running away from home, travelling with a circus and riding on freight trains. (He also managed to persuade one of his biographers, Robert Shelton, that he had spent time working as a prostitute, but the less said about that yarn the better). Likewise, Dylan’s subsequent persona as the poet of anarchy drew much of its effect from the drama of his split with the folk movement, and so it’s no surprise to find him fanning that drama, both at the time and long afterwards, with an array of facetious, hyperbolic and self-pitying comments about what he was doing.

When the press tried to tap into Dylan’s motivations, he tended to swat them away with claims to the effect that he was just “a song and dance man,” a kind of false modesty (always delivered in a tone of preening arrogance) that fed his reputation for irreverence. He told the folksinger Joan Baez, among others, that his interest in protest songs had always been cynical – “You know me. I knew people would buy that kind of shit, right? I was never into that stuff” – despite numerous confidants from Dylan’s folk days insisting he had been obsessed with social justice. Later, in his book Chronicles: Volume One, Dylan made the opposite claim, insisting both his folk and post-folk phases reflected the same authentic calling: “All I’d ever done was sing songs that were dead straight and expressed powerful new realities. … My destiny lay down the road with whatever life invited, had nothing to do with representing any kind of civilisation.” He then complained (and note that modesty again): “It seems like the world has always needed a scapegoat – someone to lead the charge against the Roman Empire.” Incidentally, the “autobiographical” Chronicles is a masterpiece of self-mythologizing, where, among other sleights of hand, Dylan cuts back and forth between different stages of his career, neatly evading the question of how and why his worldview evolved.

Nor, of course, was Dylan’s break with folk his last act of reinvention. The rock phase lasted scarcely two years, after which he pivoted towards country music, first with the austere John Wesley Harding and then with the bittersweet Nashville Skyline. In the mid-1970s, Dylan recast himself as a travelling minstrel, complete with face paint and flower-decked hat, on the Rolling Thunder Revue tour. At the end of that decade he emerged as a born-again Christian playing gospel music, and shortly afterwards as an infidel (releasing an album titled Infidels). In the ’90s he appeared, among other guises, as a blues revivalist, while his more recent gestures include a kitsch Christmas album and a homage to Frank Sinatra. If there’s one line that manages to echo through the six decades of Dylan’s career, it must be “strike another match, go start anew.”

This restless drive to wrong-foot his audience makes it tempting to see Dylan as a kind of prototype for the shape-shifting pop idol, anticipating the likes of David Bowie and Kate Bush, not to mention the countless fading stars who refresh their wardrobes and their political causes in a desperate clinging to relevance. Like so many readings of Dylan, this one inevitably doubles back, concertina-like, to the original break with folk. That episode can now be made to appear as the sudden rupture with tradition that gave birth to the postmodern celebrity, a paragon of mercurial autonomy whose image can be endlessly refashioned through the media.

But trying to fit Dylan into this template reveals precisely what is so distinctive about him. Alongside his capacity for inventing and reinventing himself as a cultural figure, there has always been a sincere and passionate devotion to the forms and traditions of the past. Each of the personae in Dylan’s long and winding musical innings – from folk troubadour to country singer to roadshow performer to bluesman to roots rocker to jazz crooner – has involved a deliberate engagement with some aspect of the American musical heritage, as well as with countless other cultural influences from the U.S. and beyond. This became most obvious from the ’90s onwards, with albums such as Good As I Been to You and World Gone Wrong, composed entirely of covers and traditional folk songs – not to mention “Love and Theft”, a title whose quotation marks point to a book by historian Eric Lott, the subject of which, in turn, is the folklore of the American South. But these later works just made explicit what he had been doing all along.

“What I was into was traditional stuff with a capital T,” writes Dylan about his younger self in Chronicles. The unreliability of that book has already been mentioned, but the phrase is a neat way of describing his approach to borrowing from history. Dylan’s personae are never “traditional” in the sense of adhering devoutly to a moribund form; nor would it be quite right to say that he makes older styles his own. Rather, he treats tradition as an invitation to performance and pastiche, as though standing by the costume cupboard of history and trying on a series of eye-catching but not-quite-convincing disguises, always with a nod and a wink. I remember hearing Nashville Skyline for the first time and being slightly bemused at what sounded like an entirely artless imitation of country music; I was doubly bemused to learn this album had been recorded and released in 1969, the year of Woodstock and a year when Dylan was actually living in Woodstock. But it soon occurred to me that this was Dylan’s way of swimming against the tide. He may have lit the fuse of the high ’60s, but by the time the explosion came he had already moved on, not forward but back, recognising where his unique contribution as a musician really lay: in an ongoing dance with the spirits of the past, part eulogy and part pantomime. I then realised this same dance was happening in his earlier folk period, and in any number of his later chapters.

“The madly complicated modern world was something I took little interest in” – Chronicles again – “What was swinging, topical and up to date for me was stuff like the Titanic sinking, the Galveston flood, John Henry driving steel, John Hardy shooting a man on the West Virginia line.” We know this is at least partly true, because this overtly mythologized, larger-than-life history, this traditional stuff with a capital T, is never far away in Dylan’s music. The Titanic, great floods, folk heroes and wild-west outlaws all appear in his catalogue, usually with a few deliberate twists to imbue them with a more biblical grandeur, and to remind us not to take our narrator too seriously. It’s even plausible that he really did take time out from beatnik life in Greenwich Village to study 19th century newspapers at the New York Public Library, not “so much interested in the issues as intrigued by the language and rhetoric of the times.” Dylan is nothing if not a ventriloquist, using his various musical dummies to recall the languages of bygone eras. 

And if we look more closely at the Electric Trilogy, the infamous reinvention that sealed Dylan’s betrayal of folk, we find that much of the innovation on those albums fits into a twelve-bar blues structure, while their rhythms recall the R&B that Dylan had performed as a teenager in Hibbing, Minnesota. Likewise, it’s often been noted that their lyrical style, based on chains of loosely associated or juxtaposed images, shows not just the influence of the Beats, but also French symbolist poet Arthur Rimbaud, German radical playwright Bertolt Brecht, and bluesman Robert Johnson. This is to say nothing of the content of the lyrics, which feature an endless stream of allusions to history, literature, religion and myth. Songs like “Tombstone Blues” make an absurd parody of their own intertextuality (“The ghost of Belle Starr she hands down her wits/ To Jezebel the nun she violently knits/ A bald wig for Jack the Ripper who sits/ At the head of the chamber of commerce”). For all its iconoclasm, Dylan’s novel contribution to songwriting in this phase was to bring contemporary America into dialogue with a wider universe of cultural riches. 

Now consider this. Could it be that even Dylan’s disposable approach to his own persona, far from heralding the arrival of the modern media star, is itself a tip of the hat to some older convention? The thought hadn’t occurred to me until I dipped into the latest round of Bobology marking Dylan’s 80th. There I found an intriguing lecture by the critic Greil Marcus about Dylan’s relationship to blues music (and it’s worth recalling that, by his own account, the young Dylan only arrived at folk music via the blues of Lead Belly and Odetta). “The blues,” says Marcus, “mandate that you present a story on the premise that it happened to you, so it has to be written [as] not autobiography but fiction.” He explains:

words first came from a common store of phrases, couplets, curses, blessings, jokes, greetings, and goodbyes that passed anonymously between blacks and whites after the Civil War. From that, the blues said, you craft a story, a philosophy lesson, that you present as your own: This happened to me. This is what I did. This is how it felt.

Is this where we find a synthesis of those two countervailing tendencies in Dylan’s career – on to the next character, back again to the “common store” of memories? Weaving a set of tropes into a fiction, which you then “present as your own,” certainly works as a description of how Dylan constructs his various artistic masks, not to mention many of his songs. It would be satisfying to imagine that this practice is itself a refashioned one – and as a way of understanding where Dylan is coming from, probably no less fictitious than all the others.

Terra damnata

A thousand regrets

The blossoms were ravenous, and wild.
They swallowed a streetlight and turned into a huge
glowing dandelion, snatching passers-by
in their intimate net of shadows. 

No one remembered how to approach 
such a vicious thing. Finally it fell into a mosaic
of shriveled tissue, gasping in the acrid glare.

All summer the wind was herding voices
behind its flat skies, and I was one of them – 
a voice telling you I was finally ready to leave. 

I was being called away, like an untethered balloon
destined to smile down on this empty 
corner of a chessboard. Then it was autumn
and still you nodded patiently. 

The cats abandoned their stares, their boxes
of sun snapped shut. Words dissolved into flat, wet 
steps, gongs against the dark drizzle.


How Napoleon made the British

In 1803, the poet and philosopher Samuel Taylor Coleridge wrote to a friend about his relish at the prospect of being invaded by Napoleon Bonaparte. “As to me, I think, the Invasion must be a Blessing,” he said, “For if we do not repel it, & cut them to pieces, we are a vile sunken race… And if we do act as Men, Christians, Englishmen – down goes the Corsican Miscreant, & Europe may have peace.”

This was during the great invasion scare, when Napoleon’s Army of England could on clear days be seen across the Channel from Kent. Coleridge’s fighting talk captured the rash of patriotism that had broken out in Britain. The largest popular mobilisation of the entire Hanoverian era was set in motion, as some 400,000 men from Inverness to Cornwall entered volunteer militia units. London’s playhouses were overtaken by anti-French songs and plays, notably Shakespeare’s Henry V. Caricaturists such as James Gillray took a break from mocking King George III and focused on patriotic propaganda, contrasting the sturdy beef-eating Englishman John Bull with a puny, effete Napoleon.

These years were an important moment in the evolution of Britain’s identity, one that resonated through the 19th century and far beyond. The mission identified by Coleridge – to endure some ordeal as a vindication of national character, preferably without help from anyone else, and maybe benefit wider humanity as a by-product – anticipates a British exceptionalism that loomed throughout the Victorian era, reaching its final apotheosis in the Churchillian “if necessary alone” patriotism of the Second World War. Coleridge’s friend William Wordsworth expressed the same sentiment in 1806, after Napoleon had smashed the Prussian army at Jena, leaving the United Kingdom his only remaining opponent. “We are left, or shall be left, alone;/ The last that dare to struggle with the Foe,” Wordsworth wrote, “’Tis well! From this day forward we shall know/ That in ourselves our safety must be sought;/ That by our own right hands it must be wrought.”

As we mark the bicentennial of Napoleon’s death on St Helena in 1821, attention has naturally been focused on his legacy in France. But we shouldn’t forget that in his various guises – conquering general, founder of states and institutions, cultural icon – Napoleon transformed every part of Europe, and Britain was no exception. Yet the apparent national pride of the invasion scare was very far from the whole story. If the experience of fighting Napoleon left the British in important ways more cohesive, confident and powerful, it was largely because the country had previously looked like it was about to fall apart. 

Throughout the 1790s, as the French Revolution followed the twists and turns that eventually brought Napoleon to power, Britain was a tinder box. Ten years before he boasted of confronting Napoleon as “Men, Christians, Englishmen,” Coleridge had burned the words “Liberty” and “Equality” into the lawns of Cambridge University. Like Wordsworth, and like countless other radicals and republicans, he had embraced the Revolution as the dawn of a glorious new age in which the corrupt and oppressive ancien régime, including the Anglican establishment of Britain, would be swept away.

And the tide of history seemed to be on the radicals’ side. The storming of the Bastille came less than a decade after Britain had lost its American colonies, while in George III the country had an unpopular king, prone to bouts of debilitating madness, whose scandalous sons appeared destined to drag the monarchy into disgrace. 

Support for the Revolution was strongest among Nonconformist Protestant sects – especially Unitarians, the so-called “rational Dissenters” – who formed the intellectual and commercial elite of cities such as Norwich, Birmingham and Manchester, and among the radical wing of the Whig party. But for the first time, educated working men also entered the political sphere en masse. They joined the Corresponding Societies which held public meetings and demonstrations across the country, so named because of their contacts with Jacobin counterparts in France. Influential Unitarian ministers, such as the Welsh philosopher Richard Price and the chemist Joseph Priestley, interpreted the Revolution as the work of providence and possibly a sign of the imminent Apocalypse. In the circle of Whig aristocrats around Charles James Fox, implacable adversary of William Pitt’s Tory government, the radicals had sympathisers at the highest levels of power. Fox famously said of the Revolution “how much the greatest event it is that ever happened in the world, and how much the best.”

From 1793 Britain was at war with revolutionary France, and this mix of new ideals and longstanding religious divides boiled over into mass unrest and fears of insurrection. In 1795 protestors smashed the windows at 10 Downing Street, and at the opening of parliament a crowd of 200,000 jeered at Pitt and George III. The radicals were met by an equally volatile loyalist reaction in defence of church and king. In 1791, a dinner celebrating the anniversary of Bastille Day in Birmingham sparked three days of rioting, including attacks on Nonconformist chapels and Priestley’s home. Pitt’s government introduced draconian limitations on thought, speech and association, although his attempt to convict members of the London Corresponding Society of high treason was foiled by a jury.

Both sides drew inspiration from an intense pamphlet war that included some of the most iconic and controversial texts in British intellectual history. Conservatives were galvanised by Edmund Burke’s Reflections on the Revolution in France, a defence of England’s time-honoured social hierarchies, while radicals hailed Thomas Paine’s Rights of Man, calling for the abolition of Britain’s monarchy and aristocracy. When summoned on charges of seditious libel, Paine fled to Paris, where he sat in the National Convention and continued to support the revolutionary regime despite almost being executed during the Reign of Terror that began in 1793. Among his supporters were the pioneering feminist Mary Wollstonecraft and the utopian progressive William Godwin, who shared an intellectual circle with Coleridge and Wordsworth.

Britain seemed to be coming apart at the seams. Bad harvests at the turn of the century brought misery and renewed unrest, and the war effort failed to prevent France (under the leadership, from 1799, of First Consul Bonaparte) from dominating the continent. Paradoxically, nothing captures the paralysing divisions of the British state at this moment better than its expansion in 1801 to become the United Kingdom of Great Britain and Ireland. The annexation of Ireland was a symptom of weakness, not strength, since it reflected the threat posed by a bitterly divided and largely hostile satellite off Britain’s west coast. The only way to make it work, as Pitt insisted, was to grant political rights to Ireland’s Catholic majority – but George III refused. So Pitt resigned, and the Revolutionary Wars ended with the Treaty of Amiens in 1802, effectively acknowledging French victory.

Britain’s tensions and weaknesses certainly did not disappear during the ensuing, epic conflict with Napoleon from 1803 to 1815. Violent social unrest continued to flare up, especially at times of harvest failure, financial crisis, and economic hardship resulting from restriction of trade with the continent. There were, at times, widespread demands for peace. The government continued to repress dissent with military force and legal measures; the radical poet and engraver William Blake (later rebranded as a patriotic figure when his words were used for the hymn Jerusalem) stood trial for sedition in 1803, following an altercation with two soldiers. Many of those who volunteered for local military units probably did so out of peer pressure and to avoid being impressed into the navy. Ireland, of course, would prove to be a more intractable problem than even Pitt had imagined.

Nonetheless, Coleridge and Wordsworth’s transition from radicals to staunch patriots was emblematic. Whether the population at large was genuinely loyal or merely quiescent, Britain’s internal divisions lost much of their earlier ideological edge, and the threat of outright insurrection faded away. This process had already started in the 1790s, as many radicals shied away from the violence and militarism of revolutionary France, but it was galvanised by Napoleon. This was not just because he appeared determined and able to crush Britain, but also because of British perceptions of his regime. 

As Yale professor Stuart Semmel has observed, Napoleon did not fit neatly into the dichotomies by which Britain was accustomed to defining itself against France. For the longest time, the opposition had been (roughly) “free Protestant constitutional monarchy” vs “Popish absolutist despotism”; after the Revolution, it had flipped to “Christian peace and order” vs “bloodthirsty atheism and chaos.” Napoleon threw these categories into disarray. The British, says Semmel, had to ask “Was he a Jacobin or a king …; Italian or Frenchman; Catholic, atheist, or Muslim?” The religious uncertainty was especially unsettling, after Napoleon’s “declaration of kinship with Egyptian Muslims, his Concordat with the papacy, his tolerance for Protestants, and his convoking a Grand Sanhedrin of European Jews.”

This may have forced some soul-searching on the part of the British as they struggled to define Napoleonic France, but in some respects the novelty simplified matters. Former radicals could argue Napoleon represented a betrayal of the Revolution, and could agree with loyalists that he was a tyrant bent on personal domination of Europe, thus drawing a line under the ideological passions of the revolutionary period. In any case, loyalist propaganda had no difficulty transferring to Napoleon the template traditionally reserved for the Pope – that of the biblical Antichrist. This simple fact of having a single infamous figure on which to focus patriotic feelings no doubt aided national unity. As the essayist William Hazlitt, an enduring supporter of Napoleon, later noted: “Everybody knows that it is only necessary to raise a bugbear before the English imagination in order to govern it at will.”

More subtly, conservatives introduced the concept of “legitimacy” to the political lexicon, to distinguish the hereditary power of British monarchs from Napoleon’s usurpation of the Bourbon throne. This was rank hypocrisy, given the British elite’s habit of importing a new dynasty whenever it suited them, but it played to an attitude which did help to unify the nation: during the conflict with Napoleon, people could feel that they were defending the British system in general, rather than supporting the current government or waging an ideological war against the Revolution. The resulting change of sentiment could be seen in 1809, when there were vast celebrations to mark the Golden Jubilee of the once unpopular George III. 

Undoubtedly British culture was also transformed by admiration for Napoleon, especially among artists, intellectuals and Whigs, yet even here the tendency was towards calming antagonisms rather than inflaming them. This period saw the ascendance of Romanticism in European culture and ways of thinking, and there was not and never would be a greater Romantic hero than Napoleon, who had turned the world upside down through force of will and what Victor Hugo later called “supernatural instinct.” But ultimately this meant aestheticizing Napoleon, removing him from the sphere of politics to that of sentiment, imagination and history. Thus when Napoleon abdicated his throne in 1814, the admiring poet Lord Byron was mostly disappointed he had not fulfilled his dramatic potential by committing suicide.

But Napoleon profoundly reshaped Britain in another way: the long and gruelling conflict against him left a lasting stamp on every aspect of the British state. In short, while no one could have reasonably predicted victory until Napoleon’s catastrophic invasion of Russia in 1812, the war was nonetheless crucial in forging Britain into the global superpower it would become after 1815.

The British had long been in the habit of fighting wars with ships and money rather than armies, and for the most part this was true of the Napoleonic wars as well. But the unprecedented demands of this conflict led to an equally unprecedented development of Britain’s financial system. This started with the introduction of new property taxes and, in 1799, the first income tax, which were continually raised until by 1814 their yield had increased by a factor of ten. What mattered here was not so much the immediate revenue as the unparalleled fiscal base it gave Britain for the purpose of borrowing money – which it did, prodigiously. In 1804, the year Bonaparte was crowned Emperor, the “Napoleon of finance” Nathan Rothschild arrived in London from Frankfurt, helping to secure a century of British hegemony in the global financial system. 

No less significant were the effects of war in stimulating Britain’s nascent industrial revolution, and its accompanying commercial empire. The state relied on private contractors for most of its materiel, especially that required to build and maintain the vast Royal Navy, while creating immense demand for iron, coal and timber. In 1814, when rulers and representatives of Britain’s European allies came to Portsmouth, they were shown a startling vision of the future: enormous factories where pulley blocks for the rigging of warships were being mass-produced with steam-driven machine tools. Meanwhile Napoleon’s Continental System, by shutting British manufacturers and exporters out of Europe, forced them to develop markets in South Asia, Africa and Latin America. 

Even Britain’s fabled “liberal” constitution – the term was taken from Spanish opponents of Napoleon – did in fact do some of the organic adaptation that smug Victorians would later claim as its hallmark. The Nonconformist middle classes, so subversive during the revolutionary period, were courted in 1812-13 with greater political rights and by the relaxation of various restrictions on trade. Meanwhile, Britain discovered what would become its greatest moral crusade of the 19th century. Napoleon’s reintroduction of slavery in France’s Caribbean colonies created the conditions for abolitionism to grow as a popular movement in Britain, since, as William Wilberforce argued, “we should not give advantages to our enemies.” Two bills in 1806-7 effectively ended Britain’s centuries-long participation in the trans-Atlantic slave trade.

Thus Napoleon was not just a hurdle to be cleared en route to the British century – he was, with all his charisma and ruthless determination, a formative element in the nation’s history. And his influence did not end with his death in 1821, of course. He would long haunt the Romantic Victorian imagination as, in Eric Hobsbawm’s words, “the figure every man who broke with tradition could identify himself with.”

The age of mass timber: why we should build in wood

This article was published by The Critic on March 10th 2021.

There are few more evocative images of modernity than the glittering skyscrapers of Tokyo. It’s easy to forget that Japan’s cities consisted largely of timber structures until the mid-twentieth century. It was only after the nightmarish final months of the Second World War, when American B-29 bombers reduced these wooden metropolises to smouldering ash, that Japan embraced concrete, glass and steel.

But luckily Japanese timber expertise did not vanish entirely, for it now appears wood is the future again. Late last year Sumitomo Forestry, a 300-year-old company, announced it was partnering with Kyoto University to design a surprising product: wooden satellites. This innovation aims to stop the dangerous build-up of space junk orbiting the Earth. The ultimate goal of the research, however, is back on terra firma, where Sumitomo hopes to design “ultra-strong, weather resistant wooden buildings”. It has already announced its ambitions to build a skyscraper more than 1,000 feet tall, constructed from 90 per cent wood, by 2041.

Could timber really be a major building material in the dense, vertical cities of the future? In fact, this possibility is well on the way to being realised. In recent years, architects and planners around the world have hailed the coming age of “mass timber”. This term refers to prefabricated wooden building components, such as cross-laminated timber, which can replace concrete and steel in large-scale construction.

Continue reading here.

“Euro-English”: A thought experiment

There was an interesting story in Politico last weekend about “Euro-English,” and a Swedish academic who wants to make it an official language. Marko Modiano, a professor at the University of Gävle, says the European Union should stop using British English for its documents and communications, and replace it with the bastardised English which is actually spoken in Brussels and on the continent more generally.

Politico offers this example of how Euro-English might sound, as spoken by someone at the European Commission: “Hello, I am coming from the EU. Since 3 years I have competences for language policy and today I will eventually assist at a trilogue on comitology.”

Although the EU likes to maintain the pretence of linguistic equality, English is in practice the lingua franca of its bureaucrats, the language in which most laws are drafted, and increasingly the default language of translation for foreign missions. It is also the most common second language across the continent. But according to Modiano, this isn’t the same English used by native speakers, and it’s silly that the EU’s style guides try to make it conform to the latter. (Spare a thought for Ireland and Malta, which under Modiano’s plans would presumably have to conduct EU business in a slightly different form of English).

It’s a wonderful provocation, but could it also be a veiled political strategy? A distinctively continental English might be a way for the EU to cultivate a stronger pan-European identity, thus increasing its authority both in absolute terms and relative to national governments. The way Modiano presents his proposal certainly makes it sound like that: “Someone is going to have to step forward and say, ‘OK, let’s break our ties with the tyranny of British English and the tyranny of American English.’ And instead say… ‘This is our language.’” (My emphasis).

The EU has forever been struggling with the question of whether it can transcend the appeal of nation states and achieve a truly European consciousness. Adopting Euro-English as an official lingua franca might be a good start. After all, a similar process of linguistic standardisation was essential to the creation of the modern nation state itself.   

As Eric Hobsbawm writes in his classic survey of the late-19th and early-20th century, The Age of Empire, the invention of national languages was a deliberate ideological project, part of the effort to forge national identities out of culturally heterogeneous regions. Hobsbawm explains:

Linguistic nationalism was the creation of people who wrote and read, not of people who spoke. And the ‘national languages’ in which they discovered the essential character of their nations were, more often than not, artefacts, since they had to be compiled, standardized, homogenized and modernized for contemporary and literary use, out of the jigsaw puzzle of local or regional dialects which constituted non-literary languages as actually spoken. 

Perhaps the most remarkable example was the Zionist movement’s promotion of Hebrew, “a language which no Jews had used for ordinary purposes since the days of the Babylonian captivity, if then.”

Where this linguistic engineering succeeded, it was thanks to the expansion of state education and the white-collar professions. A codified national language, used in schools, the civil service and public communications like street signs, was an ideal tool for governments to instil a measure of unity and loyalty in their diverse and fragmented populations. This in turn created incentives for the emerging middle class to prefer an official language to their own vernaculars, since it gave access to careers and social status. 

Could the EU not pursue a similar strategy with Euro-English? There could be a special department in Brussels tracking the way English is used by EU citizens on social media, and each year issuing an updated compendium on Euro-English. This emergent language, growing ever more distinctly European, could be mandated in schools, promoted through culture and in the media, and of course used for official EU business. Eventually the language would be different enough to be rebranded simply as “European.”

You’ll notice I’m being facetious now; obviously this would never work. Privileging one language over others would instantly galvanise the patriotism of EU member states, and give politicians a new terrain on which to defend national identity against Brussels. This is pretty much how things played out in multinational 19th-century states such as Austria-Hungary, where linguistic hierarchies inflamed the nationalism of minority cultures. One can already see something like this in the longstanding French resentment against the informal dominance of English on the continent.

Conversely, Euro-English wouldn’t work because for Europe’s middle-classes and elites, the English language is a gateway not to Europe, but to the world. English is the language of global business and of American cultural output, and so is a prerequisite for membership of any affluent cosmopolitan milieu. 

And this, I think, is the valuable insight to be gained from thought experiments like the one suggested by Modiano. Whenever we try to imagine what the path to a truly European demos might look like, we always encounter these two quite different, almost contradictory obstacles. On the one hand, the structure of the EU seems to have frozen in place the role of the nation state as the rightful locus of imagined community and symbolic attachment. At the same time, among those who identify most strongly with the European project, many are ultimately universalist in their outlook, and unlikely to warm to anything that implies a distinctively European identity. 

How much is a high-status meme worth?

This article was published by Unherd on February 25th 2021.

Today one of the most prestigious institutions in the art world, the 250-year-old auction house Christie’s, is selling a collection of Instagram posts. Or in its own more reserved language, Christie’s is now “the first major auction house to offer a purely digital work.”

The work in question is “Everydays: The First 5000 Days” by the South Carolina-based animation artist Beeple (real name Mike Winkelmann), an assemblage of images he has posted online over the last thirteen-odd years. Whoever acquires “Everydays” won’t get a unique product — the image is a digital file which can be copied like any other. They’ll just be paying for a proof of ownership secured through the blockchain.

But more significant than the work’s format is its artistic content. Beeple is opening the way for the traditional art world to embrace internet memes. 

Continue reading here.

The double nightmare of the cat-lawyer

Analysing internet memes tends to be self-defeating: mostly their magic comes from a fleeting, blasé irony which makes you look like a fool if you try to pin it down. But sometimes a gem comes along that’s too good to let pass. Besides, the internet’s endless stream of found objects, jokes and observations are ultimately a kind of glorious collective artwork, somewhere between Dada collage and an epic poem composed by a lunatic. And like all artworks, this one has themes and motifs worth exploring.

Which brings me to cat-lawyer. The clip of the Texas attorney who, thanks to a visual filter, manages to take the form of a fluffy kitten in a Zoom court hearing, has gone superviral. The hapless attorney, Rod Ponton, claims he’s been contacted by news outlets around the world. “I always wanted to be famous for being a great lawyer,” he reflected, “now I’m famous for appearing in court as a cat.”

The video clearly recalls the similarly sensational case of Robert Kelly, the Korea expert whose study was invaded by his two young children during a live interview with the BBC. What makes both clips so funny is the pretence of public formality – already under strain in the video-call format, since people are really just smartly dressed in their homes – being punctured by the frivolity of childhood. Ridiculously, the victims try to maintain a sense of decorum. The punctilious Kelly ignores his rampaging infants and mumbles an apology; the beleaguered Ponton, his saucer-like kitten’s eyes shifting nervously, insists he’s happy to continue the hearing (“I’m not a cat” he reassures the judge, a strong note of desperation in his voice).

These incidents don’t become so famous just because they’re funny, though. Like a lot of comedy, they offer a light-hearted, morally acceptable outlet for impulses that often appear in much darker forms. We are essentially relishing the humiliation of Ponton and Kelly, much as the roaming mobs of “cancel culture” relish the humiliation of their targets, but we expect the victims to recognise their own embarrassment as a public good. The thin line between such jovial mockery and the more malign search for scapegoats is suggested by the fact that people have actually tried to discredit both men. Kelly was criticised for how he handled his daughter during his ordeal, while journalists have dredged up old harassment allegations against Ponton.

But there are other reasons why, in the great collective fiction of internet life, cat-lawyer is an interesting character. As I’ve previously written at greater length, online culture carries a strong strain of the grotesque. The strange act of projecting the self into digital space, both liberating and anxiety-inducing, has spurred forms of expression that blur the boundaries of the human and of social identity. In this way, internet culture joins a long artistic tradition where surreal, monstrous or bizarre beings give voice to repressed aspects of the human imagination. Human/animal transformations like the cat-lawyer have always been a part of this motif.

Of course it’s probably safe to assume that Ponton’s children, and not Ponton himself, normally use the kitten filter. But childhood and adolescence are where we see the implications of the grotesque most clearly. Bodily transformation and animal characters are a staple of adolescent fiction, because teenagers tend to interpret them in light of their growing awareness of social boundaries, and of their own subjectivity. Incidentally, I remember having this response to a particularly cheesy series of pulp novels for teens called Animorphs. But the same ideas are being explored, whether playfully or disturbingly, in gothic classics like Frankenstein and the tales of E.T.A. Hoffmann, in the films of David Lynch, or indeed in the way people use filters and face-changing apps on social media.

The cat-lawyer pushes these buttons too: his wonderful, mesmerising weirdness is a familiar expression of the grotesque. And this gels perfectly with the comedy of interrupted formality and humiliation. The guilty expression on his face makes it feel like he has, by appearing as a cat, accidentally exposed some embarrassing private fetish in the workplace.

Perhaps the precedent this echoes most clearly is Kafka’s “Metamorphosis,” where the long-suffering salesman Gregor Samsa finds he has turned into an insect. Recall that Samsa’s family resents his transformation not just because he is ghastly, but because his ghastliness makes him useless in a world which demands respectability and professionalism. It is darkly absurd, but unsettling too: it awakens anxieties about the aspects of ourselves that we conceal from public view.

The cat-lawyer’s ordeal is a similar kind of double nightmare: a surreal incident of transformation, an anxiety dream about being publicly exposed. Part of its appeal is that it lets us appreciate these strange resonances by cloaking them in humour. 

Gamestop: A Classic Robin Hood Tale

This article was first published by Unherd on 27th January 2021.

During the past year, the chaotic forces of the digital revolution have broken into the world of finance, in the form of trading apps that allow ordinary punters free access to the stock market. These tools have given rise to social media communities where amateur traders exchange advice and, it now seems, organise to stick a finger in the eye of Wall Street.

The case of GameStop is the latest — and most stunning — example of this trend. Late last year, a confrontation developed between small traders and major Wall Street investors like Melvin Capital Management. As the former rushed in to buy what they viewed as undervalued GameStop shares, the latter saw it as a chance to adopt a short position, which means betting that GameStop’s share price would crash. But shorting is a risky strategy: if the share price keeps rising, short sellers must eventually buy the shares back at the higher price, swallowing the difference as a loss.

And so the small investors, led by firebrands on the subreddit WallStreetBets, smelled blood. They continued driving up the price of GameStop shares in the hope of bankrupting the institutional investors. At the time of writing, the shares were hovering around the $300 mark — an increase of 1,400 per cent since January 12th. This week they have near-doubled on a daily basis.

According to Bloomberg, one in five stock trades are now made by such “retail investors,” as opposed to institutional investors such as hedge funds and insurance companies. As this army of new investors has gathered en masse in online chatrooms, notably Reddit and Stocktwits, it has produced strange effects in the markets. It’s no longer unusual to see the market value of small, obscure companies suddenly going through the roof thanks to co-ordinated speculation by amateur traders.

The WallStreetBets subreddit has become the centre of an intoxicating underdog narrative, a classic American romance of the little guy standing up to the corrupt and complacent system. “Hedge fund managers live in the past, and continue to look down upon the retail investors,” writes one influential user, “We can think and make decisions for ourselves, which scares the FUCK out of old school institutions and hedge funds.” Another posted a now-viral message to CNN, complaining that the news network was in cahoots with the institutional investors.

Of course, this is clearly a bubble of some kind, but it is also a Hollywoodesque tale of plucky outsiders taking on the establishment — indeed, the guerrilla investors turned their attention to GameStop after learning that Michael Burry, the maverick trader immortalised in The Big Short, had invested in the company.

Whatever the outcome though, this episode has provided another striking illustration of how social media empowers the anti-establishment strain in American culture, and turns defiance of the powers-that-be into a source of shared purpose.

America has entered a new era in which narratives generated on social media have become a profoundly destabilising force, as we witnessed when Donald Trump’s elaborately outfitted supporters stormed the US Capitol at the start of this year. The GameStop saga isn’t an exact parallel, but it is also a force for disruption.

A deep-seated suspicion of authority combined with technology that empowers self-proclaimed rebels against the system is proving to be a potent cocktail.

Gambling on technocrats

The likely appointment of Mario Draghi as Italy’s prime minister has been widely, if nervously, greeted as a necessary step. Draghi, an esteemed economist and central banker, will be the fourth unelected technocrat to fill the post in Italy in the last 30 years. As the Guardian concedes by way of welcoming Draghi’s appointment, a ready embrace of unelected leaders is “not a good look for any self-respecting democracy.” 

Italy’s resort to temporary “technical governments” reflects the fact that its fractious political system, with its multitude of parties and short-lived coalitions, is vulnerable to paralysis at moments of crisis. Such has been the price for a constitution designed to prevent the rise of another Mussolini. Ironically though, the convention of installing technocrats recalls the constitutional role of Dictator in the ancient Roman Republic: a trusted leader who, by consensus among the political class, takes charge for a limited term during emergencies.

During the 1990s, it was the crisis of the European Exchange Rate Mechanism, the vast Mani pulite corruption scandal, and Silvio Berlusconi’s first chaotic administration which formed the backdrop for the technocratic governments of Carlo Ciampi and Lamberto Dini. Now in the midst of a pandemic and a gathering economic storm, the immediate pretext comes from the collapse of a government led by Giuseppe Conte of the Five Star Movement, amid machinations by Conte’s rivals and accusations of EU emergency funds being deployed for political patronage.

Yet despite its distinctively Italian flavour, this tradition of the technocratic dictator has a much wider European resonance. It reflects the economic and political strains of European integration. And ultimately, the Italian case merely offers a pronounced example of the precarious interplay between depoliticised technocratic governance and democracy which haunts the European Union at large.

The agendas of the Ciampi and Dini cabinets included politically sensitive reforms to state benefits and the public sector, with the purpose of rendering Italy fit for a European economy where Germany called the tune. This pattern was repeated much more emphatically when the next technocratic prime minister, the economist Mario Monti, served from 2011-13. Monti’s mission on behalf of Berlin and Brussels was to temper Italy’s sovereign debt crisis by overseeing harsh austerity measures.

The legacy of that strategy was the rise of Italian populism in the form of the Five Star Movement and, on the right, Matteo Salvini’s Lega Nord. Which brings us to another crucial piece of background for Draghi’s appointment this week. With Italian Euroscepticism making further advances during the disastrous first phase of the pandemic, it seems likely that were an election called now, a rightwing coalition led by Salvini would take power.

For Italy’s financial and administrative class, that prospect is especially scary given how much the country’s stability now depends on support from the EU. The hope is that Draghi will calm the nerves of Italy’s northern creditors, and Germany especially, to pave the way for a much-needed second instalment of the coronavirus relief fund. But while all the talk now is of spending and investment, Italy has a public debt worth 160% of GDP and rising, which is only sustainable thanks to the European Central Bank (ECB) continuing to buy its government bonds. It is surely a matter of time before further “structural reforms” are demanded of Italy.

In other words, when the political parties aren’t up to it, technical governments do the dirty work of squeezing Italy into the ever-tightening corset of the EU’s economic model. So this is not simply a pathology of Italian politics, but nor can it be characterised as an imposition. Figures like Monti and Draghi have long been invested in this arrangement: they cut their teeth during the 1990s hammering Italian finances into shape for entry to the Euro, and subsequently held important posts in EU institutions.

Indeed, the basic logic at work here, whereby tasks considered too difficult for democratic politics are handed over to the realm of technocratic expertise, has become a deeply European one. We see it most clearly in the EU’s increasing reliance on the monetary instruments of the ECB as the only acceptable tool with which to respond to economic crises. This goes back to the original political failure of not achieving fiscal integration in the Eurozone, which would have allowed wealth transfers to ailing economies no longer able to negotiate debt reductions or devalue their currencies. But during the Eurozone crisis and its aftermath, politicians avoided confronting their electorates with the need to provide funds for the stricken Club Med states. Instead they relied on the ECB to keep national governments solvent through sovereign bond purchases.

And lest we forget, it was these same bond purchases that made the name of Italy’s incoming prime minister, Mario Draghi. In 2012, when Draghi was ECB president, he appeared to almost magically calm the debt markets by announcing he would do “whatever it takes” to keep the Eurozone afloat. This statement, revealing that Draghi had been empowered to step outside the bounds of rule and precedent, is again suggestive of a kind of constitutionally-mandated technocratic dictator, but at a Europe-wide level. 

Of course to focus on monetary policy is also to highlight that these tensions between technocracy and democracy go far beyond the EU. It is certainly not just in Europe that central bankers have accrued vast power through their ability to provide back-door stimulus and keep huge debt burdens sustainable. The growing importance of central banks points back to an earlier moment of depoliticisation at the dawn of neoliberalism in the early 1980s, when control of interest rates was removed from the realm of democratic politics. More fundamentally, it points to the limitations imposed on democracy by the power of financial markets. 

Still, it is no accident that this tension has appeared in such acute form in the EU. As with Italy’s ready supply of emergency prime ministers, the EU’s dense canopy of technocratic institutions provides an irresistible way for politicians to pass the buck on issues they would otherwise have to subject to democratic conflict. This is all well and good if the technocrats succeed, but as we have seen recently with the EU’s vaccine program, it also raises the stakes of failure. Handing difficult and sensitive matters over to unaccountable administrators means that blame and resentment will be directed against the system as a whole.

Why accusations of vaccine nationalism miss the mark

This article was first published by The Critic magazine on 2nd February 2021.

In the wake of Friday’s decision by the European Union to introduce controls on vaccine exports, there has once again been much alarm about “vaccine nationalism.” This term is meant to pour scorn on governments that prioritise their own citizens’ access to vaccines over that of other countries. It points to the danger that richer parts of the world will squabble for first dibs on limited vaccine supplies – “fighting over the cake,” as a World Health Organisation official aptly described it – while leaving poorer countries trailing far behind in their vaccination efforts.

Certainly, there’s a real danger that the EU’s export controls will end up hampering overall vaccine production by sparking a trade war over raw materials. This is somewhat ironic, given that few have been as outspoken about countries “unduly restricting access to vaccines” as the EU itself. As for global inequalities in vaccine access, make no mistake – they are shaping up to be very ugly indeed. It looks likely that poorer countries, having already faced an economic, social, and public health catastrophe, will struggle to vaccinate their most vulnerable citizens even as richer states give jabs to the majority of their populations.

Wealthy nations undoubtedly have a moral obligation to minimize the impact of these disparities. Nonetheless, wielding vaccine nationalism as a pejorative term is an unhelpful way to diagnose or even to address this problem. Given how the world is structured politically, the best way to ensure that vaccines reach poorer countries is for richer ones to vaccinate a critical mass of their own citizens as quickly as possible.

To condemn vaccine nationalism is to imply that, in the early summer of 2020 when governments began bidding for Advance Purchase Agreements with pharmaceutical companies, a more cooperative global approach would have been feasible. In reality, the political, bureaucratic and logistical structures to meet such a challenge did not exist. Some are still pointing to Covax, the consortium of institutions trying to facilitate global vaccine equality, as a path not taken. But Covax’s proposed strategy was neither realistic nor effective.

The bottom line here is that for governments around the world, whether democratic or not, legitimacy and political stability depend on protecting the welfare of their citizens – a basic principle that even critics of vaccine nationalism struggle to deny. Only slightly less important are the social unrest and geopolitical setbacks that states anticipate if they fall behind in the race to get economies back up and running.

In light of these pressures, Covax never stood a chance. Its task of forging agreement between an array of national, international and commercial players was bound to be difficult, and no state which had the industrial capacity or market access to secure its own vaccines could have afforded to wait and see if it would work. To meet Covax’s aim of vaccinating 20 per cent of the population in every country at the same speed, nations with the infrastructure to deliver vaccines would have had to wait for those that lacked it. They would have surrendered responsibility for the sensitive task of selecting and securing the best vaccines from among the multitude of candidates. (As late as November last year Covax had just nine vaccines in its putative global portfolio; it did not reach a deal with the first successful candidate, Pfizer-BioNTech, until mid-January).

But even if a more equitable approach to global vaccine distribution had been plausible, it wouldn’t necessarily have been more desirable. Seeing some states pull ahead in the vaccine race is unsettling, but at least countries with the capacity to roll out vaccines are using it, and just as important, we are getting crucial information about how to organise vaccination campaigns from a range of different models. The peculiarity of the vaccine challenge means that, in the long run, having a few nations serve as laboratories will probably prove more useful to everyone than a more monolithic approach that prioritises equality above all.

The EU’s experience is instructive here. Given its fraught internal politics, it really had no choice but to adopt a collective approach for its 27 member states. To do otherwise would have left less fortunate member states open to offers from Russia and China. Still, the many obstacles and delays it has faced – ultimately driving it to impose its export controls – are illustrative of the costs imposed by coordination. Nor should we overlook the fact that its newfound urgency has come from the example of more successful strategies in Israel, the United States and United Kingdom.

Obviously, richer states should be helping Covax build up its financial and logistical resources as well as ensuring their own populations are vaccinated. Many are doing so already. What is still lacking are the vaccines themselves. Since wealthy states acting alone have been able to order in advance from multiple sources, they have gained access to an estimated 800 million surplus vaccine doses, or more than two billion when options are taken into account.

There’s no denying that if such hoarding continues in the medium-term, it will constitute an enormous moral failing. But rather than condemning governments for having favoured their own citizens in this way, we should focus on how that surplus can reach poorer parts of the world as quickly as possible.

This means, first, scaling up manufacturing to ease the supply bottlenecks which are making governments unsure of their vaccine supply. Most importantly though, it means concentrating on how nations that do have access to vaccines can most efficiently get them into people’s arms. The sooner they can see an end to the pandemic in sight, the sooner they can begin seriously diverting vaccines elsewhere. Obviously this will also require resolving the disputes sparked by the EU’s export controls, if necessary by other nations donating vaccines to the EU.

But we also need to have an urgent discussion about when exactly nations should stop prioritising their citizens. Governments should be pressured to state under what conditions they will deem their vaccine supply sufficient to focus on global redistribution. Personally, not being in a high-risk category, I would like to see a vaccine reach vulnerable people in other countries before it reaches me. Admittedly the parameters of this decision are not yet fully in view, with new strains emerging and the nature of herd immunity still unclear. But it would be a more productive problem to focus our attention on than the issue of vaccine nationalism as such.