London: Zombie Capital

This essay appeared in my regular newsletter, The Pathos of Things, in November 2023.

In a world of mass media, we are ruled by the tyranny of comparison. We are surrounded by images of beauty and style, of fulfilment and success, that make us feel inadequate by contrast. Sometimes it is even an image of ourselves, a glimpse of what we once were or could have been, that we yearn to emulate.

What if something similar can happen to a city? Could an entire metropolis be oppressed by an idealised version of itself, as seen in films, advertising, and the imagination of the wider world? This seems to be the case in London today, and no doubt in other famous cities too. A caricature of the British capital, part period drama and part Richard Curtis romcom, has been sold to a global audience of nostalgic Anglophiles. And because London is where the UK welcomes the world, all the other twee fantasies of Englishness are expected here as well. Increasingly, the city is being warped under the pressure of its own whimsical image.

Consider the Londoner, a new themed resort in Macau, off the south coast of China. Visitors are treated to every English cliché imaginable, from Scotch eggs and scones to David Beckham and the Spice Girls, not to mention replicas of various London landmarks. A correspondent for the Times describes it thus:

As the actor playing Her Majesty waves demurely from a balcony, Grenadier Guards and a few Metropolitan Police bobbies dance to fanfare all around the Crystal Palace — a glass-topped atrium inspired by the building which once adorned Hyde Park. All the while, hundreds of people delightedly video this twice-daily performance… from either side of a replica of Eros from Piccadilly Circus.

I make pretend calls from red telephone boxes before boarding an imported, 1966 Routemaster bus. Looming near that is a full-size duplicate of the Elizabeth Tower, aka the home of Big Ben; behind, the building’s intricate lower façade apes the Houses of Parliament. Airport transfers involve vintage Rolls-Royces.

An obvious farce, yes, but a farce that many people love. Besides, the Londoner is far from the only simulacrum of British culture in China. Imitations of prestigious public schools and colleges have been cropping up around the country. One university in Hebei is modelled on Hogwarts.

The TV and film industries are of course the greatest purveyors of fantasy Britannia. Much as western audiences like exotic portrayals of the East, foreign audiences like old-fashioned portraits of England and its capital city, normally involving the upper class. Global hits include Downton Abbey and The Crown, Harry Potter and James Bond, and of course Titanic and Notting Hill. Arriving in Shanghai two years ago, the first person I spoke to referenced the 1980s comedy Yes Minister, which seemed rather niche but still proves the point, I think.

It just so happens that many Europeans, Brits included, like this picture of Englishness too, but the enormous markets beyond Europe would be enough to justify it in commercial terms. As the FT’s Stephen Bush puts it, “left to the market alone, the UK depicted on screen will look rather like the India presented by The Best Exotic Marigold Hotel: not a country that any of its citizens would readily recognise, but one that reflects foreign customers’ idea of it.”

What goes for the screen increasingly goes for London itself, as the city is transformed to realise the commercial potential of its brand. Its historic centre has become a theme park rather like the one in Macau, catering to tourism, shopping and hospitality to the extent that it no longer really seems to belong to the city. Most Londoners can’t afford a drink in these areas, let alone to live or run a business there. The capital’s most famous monuments, including the buildings where the country is governed, are so tied up with marketing and merchandise that it’s sometimes surprising to be reminded they are real places.

The irony, of course, is that much of London’s living history – such as the independent bars and shops which thrived in Soho even a decade ago – has been strangled by this process. It is being replaced by something I’ve previously called zombie heritage, whereby the city’s architecture comes to resemble a series of waxworks recreating an older London for commercial purposes. In many cases, such as the revamped Battersea Power Station (luxury apartments and shopping) or the recent redevelopment of King’s Cross (prestigious art school, corporate offices and shopping), these new “old” places have essentially been designed as giant magnets for global wealth.

And make no mistake, there is money to be made from Anglomania. In 2016, an academic at the London School of Economics estimated that the Harry Potter franchise generated £4 billion for London in that year alone. The question is whether such returns can justify, in economic or social terms, surrendering much of the capital to tourism and novelty consumption.

In a recent essay, Deyan Sudjic has also noted the “creeping fossilisation” of London, as the city’s productive capacities are crowded out by the hawking of heritage. Sudjic acidly observes “the Old War Office building on Whitehall, from which Winston Churchill led the defence of Britain against Hitler, has become the Raffles OWO hotel where bed and breakfast starts at over £1,000 a night.” Meanwhile in Camden, “facsimile punks” perform for tourists “as if they were Beefeaters on parade.”

If this trend continues, encouraged by white-collar professionals evacuating the centre to work remotely, then London’s old urban core will eventually resemble a cross between Austin Powers and Hogwarts: a moribund museum of British kitsch, stretching from Shepherd’s Bush to Hackney, from Kentish Town to Southwark. By way of warning, Sudjic cites Venice, the infamous case of a city frozen in the form of a luxury destination. Maybe it’s already too late, for right next to the Londoner in Macau is another hotel and shopping complex modelled on a famous city: the Venetian.

Food: It’s Complicated

This essay appeared in my regular newsletter, The Pathos of Things, in November 2023.

At some point in the last few years, I gave up on food. I didn’t stop eating food, of course, or even enjoying it, but I stopped aspiring to any kind of skill or refinement in the matter of what enters my stomach. I enjoy a meal in the same way I enjoy a pint of Guinness or a walk: a simple, ritual pleasure. I am not trying to master a craft, scale an aesthetic peak or broaden my horizons. Most weeks I eat the same thing more or less every day (since you asked: pasta with aubergine, or fish with rice).

I sometimes wonder what this loss of culinary ambition says about me. When a man tires of food, is he tired of life? Should we not seek beauty in gustation as we do in the other senses? Is this how I begin my transition into an incurious bore who likes things a certain way, and no different?

OK, maybe these questions aren’t very interesting. But I do find it curious that they can be asked at all: that such profound meanings can be looked for in the digestive tract. For the vast majority of people, there used to be just one salient question relating to food: do I have any? Now, thanks to modern agricultural productivity, that single pressing concern has been replaced by a range of often neurotic issues. Am I eating too much? How healthy is my diet? Can I impress people with my cooking? Am I bored of my usual fare? Am I causing deforestation? Have any animals been tortured for my weekly shop? How badly?

Food is bound to be complex because, to state the obvious, it is something every one of us puts down our throat on a daily basis. This makes it rather personal, but also ensures that it will be caught up in all sorts of social and moral considerations. Our attitudes to food are shaped not just by the privacy of the palate, but by the social aspects of the plate: by family, religion and class, not to mention beauty standards.

From a design perspective, food could be compared with clothes and buildings. These are all things people need to survive, which have also become media of cultural expression and identity. In their traditional forms, cuisine, fashion and architecture are vernacular: they have local accents, reflecting the history and conditions of a particular region. In their modern forms, all three have been transformed by mass production, on the one hand, and by notions of artistic creativity on the other. To oversimplify, the journey from homespun clothes to Nike and Alexander McQueen – or from traditional building methods to concrete, steel and Zaha Hadid – is echoed by the journey from local cuisines to McDonald’s, Tesco and deconstructed cheesecakes.

To a far greater extent than buildings or clothes though, modern food continues to invoke an idealised image of its older, traditional forms. Authenticity and heritage are appealing concepts in many industries, but nowhere are they promoted as widely or enthusiastically as in gastronomy. Whether describing flavours and ingredients or the agricultural practices that produce them, the marketing of food skews heavily towards ideas of the time-honoured, the local, the artisanal, the small-scale, and above all the natural or organic. Economists may classify food as manufacturing – the UK’s largest manufacturing sector, in fact – but the rest of us prefer not to.

There are all kinds of illusions at work here, some of which George Monbiot has tackled in a recent essay. In reality, traditional peasant diets were both meagre and dull. Places worshipped for their agricultural heritage have, in most cases, been transformed by tourism and commercial farming. “In the famous cheesemaking regions of France,” Monbiot writes,

you will scarcely see a dairy cow. Instead, vast tracts are cultivated for maize… to feed the cattle stalled in the vast steel sheds – cow factories – that have sprung up from Brittany to Savoie, a business as brutal and industrial as any other. Milk is trucked across hundreds of kilometres, trade fairs market the cheese from Dubai to Shanghai.

And the more industrial the reality, the more romantic the advertising, with its “close-ups of cracked and dirt-grained hands, chickens clucking through buttercupped meadows, girls in Heidi costumes and all the other autophagous nonsense of the Spectacle.”

But why does food demand this veil of nostalgia more than other, similarly brutal industries? We hardly expect fashion retailers to pretend their clothes are woven on handlooms, or property developers to act as though they are employing Gothic stonemasons. This surely comes back to the intimate dimension of food. The scary aspects of mass production – the chemicals, the pillaging of nature, the inhuman scale and the indifference to suffering – are that much more scary when they are directly connected to our body and the innocent comfort of eating. When it comes to the gut, our feelings are just more, well, visceral.

This is why the aesthetics, the emotional powers of imagery, play such an important role. I am probably less sentimental about animal welfare than the average British person, but I was revolted by photographs of the high-rise pig farms which have appeared in numerous Chinese cities. These buildings are enormous, extremely plain, utterly utilitarian; even if the horrors inside them are sadly not unusual, their unmistakably industrial appearance made my stomach turn. After that, I longed for the pictures of dirt-grained hands and Heidi costumes.

There are other, more subtle reasons that authenticity and heritage are so valued in relation to food. As consumerism erodes the traditional core of culture, it creates the idea of heritage as a way for tradition itself to be consumed. How do we consume it? Via the mouth, of course. And in the era of multiculturalism, there is plenty of heritage to sample. People from around the world can bring their identity to the marketplace in the form of cuisine, and customers can enjoy not just the variety and novelty, but the satisfaction of being the open-minded, cosmopolitan sort. Such transactions rely on the pretence that food somehow contains the authentic essence of a culture.

But maybe all these fictions are only possible because, while food may be just another industrial product, there is still something ancient and hallowed in the act of eating together. The shared meal, with its endless variety of settings and formats, is a rare survival from a much older world, and one of the few practices that retains the aura of a sacred custom. The conversion of countless British churches into cafés in recent decades could be taken as a metaphor: meeting and eating has a central place in what remains of associational life.

As for my relationship with food, the more boring I become in my everyday choices, the more I appreciate the role of a good meal in marking an occasion, grand or simple, elegant or trashy. At these moments, food becomes something different, something more than nourishment or satisfaction. It becomes an expression of joy.

The Weird World of Cruise Ships

This essay appeared in my regular newsletter, The Pathos of Things, in August 2023.

I have never been on a cruise ship, and probably never will. To be trapped in a crowded tourist resort miles from the nearest land is relatively close to my idea of hell. Yes, with enough alcohol and the right company, I can imagine it would be fun for a day or two (readers are of course welcome to sponsor a research trip). A week would probably drive me to the edge of insanity.

Still, I find these seaborne cities fascinating, enchanting even. If you boiled the spirit of the modern hospitality business down to its purest concentrate, you would surely get something like the Icon of the Seas, the enormous new cruise ship whose digital renderings have recently gone viral online. Imagine Hieronymus Bosch painting Disneyland, and you get some sense of the diabolical energy of this project. I can think of no other artefact that employs design, engineering and industrial capacity on this scale for the sole purpose of escapism and pleasure.

When the Icon of the Seas begins roaming the Caribbean in January, its 7,500 passengers will have access to on-board water parks, surf simulators and climbing walls, along with the usual menagerie of bars, restaurants, cinemas, theatres and gyms, all ranged over twenty decks. This represents a pinnacle of luxury consumption and of travel as an organised departure from reality (cruise ships have no worldly destination, sailing in giant circles back to their origin). It should come as no surprise that the recently deceased former Italian prime minister and media mogul Silvio Berlusconi, who pioneered the contemporary mode of politics as entertainment, started out as a singer on cruise ships.

While a large majority of passengers once came from North America, cruise holidays are now wildly popular around the world, visiting not just the Caribbean but the Mediterranean, Baltic, North Pacific and South China Sea. Passenger numbers are climbing back towards their pre-pandemic peak of almost thirty million per year.

The origins of this institution lie with the great ocean liners of the late-19th and early-20th centuries. These were already floating fantasy habitats, offering first-class passengers much more than the luxuries portrayed in James Cameron’s Titanic. As Thomas Kepler writes, the most extravagant German liners boasted “Louis XIV and Moorish lounges, winter gardens, and restaurants… sometimes two or three decks in height, gilded to the hilt with Wagnerian kitsch.” The Imperator, launched in 1912, had a first-class smoking room “in the guise of a Bavarian hunting lodge,” as well as a neoclassical swimming pool called the Pompeian Bath. The White Star Line’s Olympic (sister ship to the Titanic) boasted a Turkish bath, a gymnasium and squash courts.

With the emergence of modern cruise ships in the 1960s, some of them converted ocean liners, such seaborne decadence gradually became available to a prosperous American middle class. One difference was that ocean liners were a means of transport, not simply indulgence, so their aesthetics were partly designed to distract from rough seas.

In the intervening decades, cruise ships have become such enormous and complex artefacts that very few shipyards are capable of building the biggest specimens. Just four in fact, of which three, interestingly, are in mainland Europe. But you shouldn’t imagine the ships being built from scratch in a single place. Construction is now highly modular, with up to eighty separate “blocks” – themselves composed of sub-blocks – being built simultaneously at different locations, complete with electrics, plumbing and furniture, before they are all slotted together. The result is an incredible concentration of different technologies, from gourmet kitchens to discotheques, in a single designed object.

The reason cruise ships have become so huge is largely financial: more passengers, more cash flow and fewer overheads. Since speed is no longer a priority, the elegant profile of the ocean liner has given way to the floating tub, maximising space for people and attractions. But reaping these benefits requires vast upfront capital investments and technical expertise, which means fewer and fewer firms can compete. The same dynamics can be found in many fields of industrial design; reading about the cruise ship industry is, strangely, quite similar to reading about the semiconductor industry.  

As a form of architecture, the cruise ship might recall Robert Venturi and Denise Scott Brown’s celebration of Las Vegas as a democratic landscape of fun, as meaning for the masses. But we could equally look in the opposite direction from these postmodern theorists, to the great Modernist ideologue Le Corbusier.

The Unité d’Habitation, Le Corbusier’s famous 1952 housing block in Marseilles, bears many striking similarities to the cruise ship. It too uses a modular structure of identical apartment units, fitted like cells into a concrete frame. In terms of overall form, it certainly resembles something modern civilization would put on the sea. What is more, in keeping with the Modernist vision of the self-contained community, the Unité incorporates a range of leisure facilities, including a shopping arcade, running track, gymnasium and rooftop paddling pool.

What to make of these echoes? It is tempting to see the cruise ship as a kind of temporary reprieve from suburbanisation. Around the world, middle class families have spurned cities for larger, more dispersed dwellings, which presumably makes an occasional intense shot of communal experience quite attractive. Maybe the cruise holiday is, like the Unité d’Habitation, a form of utopian urbanism: a rare opportunity for designers to create a dense, cosmopolitan environment that is genuinely popular.

The lesson for urban planners? Free cocktails make a big difference.

Space architecture: a moonage daydream?

This essay was originally published by Engelsberg Ideas in January 2024.

In January, the architecture studio Hassell published designs for a settlement to house 144 people on the moon. Commissioned by the European Space Agency (ESA), this ‘Lunar Habitat Master Plan’ shows a series of inflatable pods in which settlers will live, eat, exercise and cultivate plants. A protective shell of lunar soil, to be 3D-printed on site, will shield the structures from devastating levels of radiation on the moon’s surface. Nor will this life be without leisure and style. Hassell’s renderings include a cocktail bar with elegant coffee tables, atmospheric lighting, and moulded chairs that carefully echo the imagery of science fiction.

This proposal is just the latest in a growing genre of architectural projects for sites beyond Earth. The last decade has seen a flurry of eye-catching designs for both the moon and Mars. Hassell has previously imagined something even more swish than its lunar cocktail bar: a Martian abode with timber-effect flooring, houseplants and minimalist furniture. It is a vision fit for an IKEA advert, down to the young couple whose relaxing evening we can glimpse through the generous picture window. Meanwhile, NASA has enlisted architects from two studios, Bjarke Ingels Group and SEArch +, to work on lunar structures. Both firms have already been involved in space-related projects, with the latter proposing a Martian community living in igloos of sculpted ice.

Another idea for Mars comes from the high-profile Foster + Partners: mound-like dwellings of fused rubble, assembled by teams of robots arriving via parachute. Perhaps the most ambitious concept, courtesy of the architects at Abiboo, imagines a vast rabbit warren of tunnels embedded in a Martian cliff-face, containing a metropolis of 250,000 people.

I could go on, but it should already be apparent that this burgeoning field of space architecture involves considerably more fantasy than concrete planning. The problem is not necessarily a lack of detail: many projects indulge in technical discussion of materials, construction methods and service systems. But given that the longest habitation of the moon to date was the three days the Apollo 17 crew spent camping in their rover in 1972, while no person has ever set foot on Mars, it is clear these futuristic structures and fashionable interiors really belong to the realm of science fiction. We shouldn’t be surprised that the winners of one NASA competition for a Mars base also want to ‘harness the power of comets for interplanetary transportation’, or that Abiboo’s Martian city proposal requires steel-making technology ‘that will need to be developed’.

So what exactly is the point of these designs, and why are agencies such as NASA and the ESA bothering to commission them? Ultimately, speculative space projects tell us more about architecture as a practice and an industry here on Earth than they do about future settlements on distant celestial bodies. This is not to say that such schemes will never bear fruit, but such fruits are likely to emerge much closer to home, as part of architecture’s ongoing fascination with the idea of space.

The notion of lunar or Martian architecture is not necessarily absurd. We are on the cusp of a new Space Age, and this time the aim is not just to visit other parts of the solar system, but to establish a lasting presence there. The first destination is the south pole of the moon, where there are craters containing frozen water. Last August, India’s Chandrayaan-3 mission landed an unmanned spacecraft there for the first time. The main players, however, are the United States and China, who have both assembled international coalitions for space exploration. NASA’s Artemis program, in cooperation with more than 30 nations, hopes to have astronauts at the lunar pole by the end of the decade, and built structures in place by 2040. Unlike the earlier Apollo missions, Artemis can draw on a buoyant private-sector space industry, including rockets and spacecraft designed by Elon Musk’s SpaceX. Musk has claimed that he is amassing his vast personal wealth in order to make humanity a ‘multi-planetary species’.

Meanwhile, on a similar timetable, China’s Chang’e program is hoping to establish its own International Lunar Research Station, with partners including Russia, Egypt and Pakistan. Both the American- and Chinese-led efforts have ostensibly scientific aims. The moon promises a wealth of information about the distant past of the solar system, including the formation of Earth itself, but it would be naïve to imagine this new space race is about a disinterested search for knowledge. The moon is expected to furnish valuable supplies of Helium-3, potentially useful for nuclear fusion, as well as rare earth metals. More importantly, the hydrogen and oxygen at the moon’s south pole can be used for rocket propellant, allowing space travellers to continue on to further destinations. This is no longer just idle speculation: all parties now see the moon as a stepping-stone towards the future colonisation of Mars.

The problem is that building anything in these distant places, let alone living there, involves enormous challenges of engineering and logistics. Lunar pioneers will need to endure temperature swings of hundreds of degrees Celsius between day and night, along with constant bombardment by micrometeorites. On Mars, they can look forward to average temperatures of -60C and frequent deadly sandstorms. Both places are subject to intense cancer-causing radiation, and a total absence of soil suitable for growing food. Even breathable air needs to be manufactured.

Space agencies are investigating potential infrastructure for a moon base, including water extraction, satellites for communication, and electricity from solar and nuclear plants. The difficulty of transporting heavy materials such a long way, however, means that any construction will have to make use of the sparse elements already in situ, especially ice, water, and the rubble known as regolith. There will be no workforce, so structures will need to be built robotically or by the astronauts themselves. Before we worry about the aesthetics of the lunar hospitality industry, engineers face a more basic question: can we even make bricks in these conditions? The firm that NASA is hoping will develop its construction technology, the 3D-printing specialists ICON, has only been receiving funding since 2020.

Thinking ahead can be valuable. There is little use developing construction techniques if we don’t have some sense of how we want to employ them. By visualising the endpoint, architecture can help to specify the problems that the engineers need to solve. Besides, living on the moon for an extended period will require much more than engineering. According to architect Hugh Broughton, whose work on polar research bases is providing inspiration for NASA, the deeper problem is ‘how architecture can respond to the human condition’. When designing for extreme conditions, the architect has to consider ‘how you deal with isolation, how you create a sense of community… how you support people in the darkness’. This is worth bearing in mind when we see lunar proposals that resemble a leisure centre or cruise ship. Research bases in the Antarctic include amenities such as cinemas and themed bars. Their purpose is, above all, to provide a sense of familiarity. The German firm Duravit has even considered this psychological need in its design for a Martian toilet, which resembles an earthbound one despite working in a different way.

Nonetheless, there remains an indulgent quality to recent space designs. The Ukrainian studio Makhno has envisaged a Martian settlement along the lines of a luxury spa, complete with swimming pools and bucolic rock gardens. It even has a meditation capsule for ‘recovery, restart of consciousness, and immersion into the inner’. No doubt there is publicity value for architects in these imaginative projects – like the concept cars that automakers show at fairs – but this then raises the question of what virtues architects are trying to showcase, and why space is the appropriate medium.

Decades ago, the architect and critic Kenneth Frampton noted a tendency in the avant-garde to imagine impossible projects, which he diagnosed ‘as the return of a repressed creativity, as the implosion of utopia upon itself’. Frampton was pointing to the tension between the ideals and criteria of excellence that animate modern architecture and the highly constrained reality in which architects actually operate. Architecture aspires to engage with deep problems of social life, and also to achieve aesthetic or technical brilliance. Yet, across much of the developed world, innovative approaches to the built environment are stifled by all manner of restraints, from restrictive planning codes to the profit margins of property developers. There may be more scope for originality when it comes to designing fancy hotels, art museums and corporate office buildings, but such displays tend to make architecture the handmaiden of business or entertainment.

Following Frampton’s critique, we could see space architecture as a means to demonstrate, in the realm of imagery, the ambition and purpose that are so rarely possible in real buildings. Designing for the moon or Mars offers not just the grandeur of a new frontier in human history, but the creative freedom of a genre whose boundaries are yet to be established, and which is in any case largely speculative.

More particularly, these projects allow a kind of imaginative return to the heroic age of modern architecture, the 1920s and 30s. In the turmoil of Europe following the First World War, for a short while it appeared that the designers of buildings and cities could shape the future of humanity. They felt emboldened to break with the past and develop rational, efficient and functional answers to the problems of modern society. The results ranged from the visionary doctrines of Le Corbusier to the granular ingenuity of a figure such as Grete Schütte-Lihotzky, designer of the galley kitchen, whose basic template we still use today. Space architecture provides a similar opportunity to address fundamental questions of design, from materials and form to the arrangement of functions in the overall plan, without the weight of convention obstructing the most elegant solution.

If architects can use space projects as an outlet for repressed creativity, their work serves a rather different purpose for the organisations that commission them. In an era when imagery carries immense power, digital renderings have become a political medium, called on to visualise the imagined marvels of the future. And space exploration is deeply political. It embroils the relevant agencies in a constant struggle for government funds, forcing them to confront public misgivings about the necessity and cost of their activities. Since the Soviet Union launched Sputnik, the first satellite, in 1957, such doubts have been countered in part through the imaginative appeal of the final frontier; an appeal that has only grown with the rise of visual media. In 2021, when NASA’s Perseverance rover explored the surface of Mars, social media users were thrilled to hear audio of Martian winds, and to see a horizon with Earth sparkling in the distance; that this particular photograph turned out to be fake only underscored the role of fantasy in these matters. Architectural imagery gives form to dreams of colonising the solar system. It thereby helps to justify space exploration not just to politicians, but to a wider audience of media consumers.

Space design reveals a peculiar alignment of interests between architects and their clients. The former can apply themselves to a heroic paper architecture – or rather, digital architecture – for which there is little scope on Earth; the latter, meanwhile, can justify their budgets with images that invoke an exciting horizon of possibility. It would be short-sighted, however, to consider such projects only in terms of their immediate motives and plausibility.

The consequences of human engagement with space have always been dynamic and unpredictable. Technology provides the clearest examples: NASA laboratories have inadvertently contributed to all kinds of everyday products, from camera phones and baby formula to running shoes and wireless headsets. We can already see the potential for such transfers in the drive to build in other parts of the solar system. At the University of Nebraska, a team led by the engineer Congrui Jin has been developing organic construction materials for Mars. Jin thinks that certain kinds of fungi can assemble minerals from Martian sand and grow them into bricks. If successful, such techniques could find numerous applications on Earth, starting with the repair of damaged concrete and provision of refugee housing in remote areas.

Architecture has its own story to tell about the role of space exploration in the modern world. When Kenneth Frampton made his comments about the frivolity of avant-garde ideas, he had in mind a group of students and municipal architects that appeared in Britain during the 1960s, known as Archigram. In their magazine of the same title, the group explored fantastical and futuristic schemes like the ‘Walking City’ (that is, a city with legs) and the ‘Underwater City,’ while devising all manner of pods, capsules and bubbles in which people might one day dwell. These fantasies were informed by the technology and aesthetics of the Space Age. As Archigram 3 put it, architecture needed something ‘to stand alongside the space capsules, computers and throw-away packages of an atomic/electronic age.’ The first issue had invoked ‘the poetry of countdown… [and] orbital helmets,’ while the second took inspiration from ‘Lunar architecture and shredded wheat… the radiator grille and the launching pad.’

Archigram’s provocations were often more outlandish than the space habitats of recent years. And yet, the group’s influence has been profound. Their disciples include leading practitioners of high-tech architecture, such as Norman Foster, Renzo Piano and the late Richard Rogers, who have designed many of the world’s most notable buildings over the past half-century. While Archigram dreamed of structures that would capture the ethos of the space age, these architects have designed and built them, often using materials and methods borrowed from the aerospace industry. We can even detect Archigram’s spirit in the inflatable pods that feature in many recent proposals for the moon and Mars.

Strangely then, space architecture is not really the fresh departure it appears to be. When we look at new schemes for settlements beyond Earth, we are seeing a long-extant futuristic sensibility that is now straining closer towards material reality. By the same token, even if we don’t see cocktail bars on the moon anytime soon, the ideas stimulated by such projects may still prove influential for the future of design.

The Lost Art of Leisure

This essay was published by the New Statesman in May 2023, under the headline “You Should Only Work Four Hours a Day.”

Decades ago, Roland Barthes quipped that “one is a writer as Louis XIV was king, even on the toilet”. He was mocking the way literary types like to distinguish themselves from the mass of working people. According to Barthes, writers insist that their productive activities are not limited to any time and place, but flow constantly like an “involuntary secretion”.

Well, we are all writers now, at least in this sense. Stealing a few holiday hours to work on an article used to be my party trick. Now I find that, on Mondays and Fridays when many office buildings stand empty, my salaried comrades are sending emails from an Airbnb somewhere. Come the weekend, they might close their laptops, but they don’t stop checking their phones.

Of course this hardly compares with the instability further down the pay scale. Around one in seven British workers now do gig-economy jobs like Uber or Amazon delivery at least once a week, according to research for the Trades Union Congress, many of them on top of full-time employment.

Work today is fluid, overflowing its traditional boundaries and seeping into new domains. Meditation and exercise look suspiciously like personal optimisation. Artistic vocations centre on tireless self-promotion to a virtual audience. A movement of “homesteaders” churning their own butter and knitting their own jumpers are simply cosplaying older forms of work, and probably posting the results on Instagram.

With the help of our digital tools, we are adapting ourselves to productivity as involuntary secretion. The result is an evisceration of personal life and an epidemic of burnout.

Our diffuse working culture has attracted plenty of critiques. The problem is that most of them share the basic outlook that enabled the spread of work to begin with. Should we recognise “quiet quitting” as a justified response to unreasonable demands by employers? Is rest a form of “resistance”? Do we all just need a better “work-life balance”? These arguments present life as a two-way split between work and some nondescript realm of personal freedom, the question being how we can reclaim time from one for the sake of the other.

As long as the alternative to work remains just a negative space, work will continue leaching into it. What we are missing is a real counterbalance: a positive vision of leisure.

Properly speaking, leisure is not rest or entertainment, though it can provide both. It is not mere fun, though it ought to be satisfying. Its forms change over time, but it generally involves elements of play, fantasy and connection with other people or the natural world. Most importantly, leisure is superfluous to our worldly needs and ambitions: something we do not as a means to any end, but simply for its own sake.

Truly mass participation in leisure was a striking feature of British life in the early 20th century. People played in brass bands and raced pigeons. They learned to dance and performed in plays and choirs. In 1926 nearly 4,000 working-class anglers from Birmingham took part in a single fishing competition along 20-odd miles of river. During the 1930s, as the historian Ross McKibbin writes, “one of the great sights of the English weekend were the fleets of cyclists riding countrywards along the arterial roads of the major towns”.

People still do these things, of course, but they do them as hobbies. The hobby belongs to a culture defined by work: it is a creature of downtime and a quirk of character. Hobbies rely on individual enthusiasm, so they often collapse in the face of stress or time pressure. Besides, we tend to judge them by the unleisurely criteria of self-improvement. Physical and intellectual pursuits are admirable, since they bring fitness and cultural capital. Excessive interest in bird watching marks you out as an eccentric.

Taking the superfluous seriously is a brave act in a utilitarian world, so leisure needs its own social legitimacy to thrive. This used to come from class-based associational life, with its clubs, unions and organised religion. If video games and social media smack of pseudo-leisure, it is because they are often part of a lonely struggle with the productivity impulse: they palliate restless and atomised minds. Maybe the only forms of leisure with a more than marginal role in popular culture today are amateur football, travel and the pub.

Aristotle thought a political community should exist to provide the conditions for leisure, which he saw as the key to human flourishing. At the very least, it is crucial for a balanced existence. Meaningful work, entertainment and indulgence all have their place, but they become destructive in excess. Life should be more than an on/off switch. Leisure is the space for conversation and reflection, friendship and loyalty, playfulness and joie de vivre. These are not qualities we can develop because we want them on our CVs: they are by-products of doing something for its own sake.

In a more civilised society, leisure would define our identities as much as labour does. To see what a distant prospect that is, try to imagine a politician talking about activities that might bring satisfaction to our lives half as much as he or she talks about “ordinary working people” or “hard-working families”. Celebrating leisure would be branded out-of-touch, but that is because we have accepted the disgraceful assumption that enjoyable pastimes are only for those who can afford them.

Asset-holding baby boomers are the masters of leisure today, using retirement for tourism, sport and artistic dabbling. Good for them. Still, we should resist the idea that such opportunities must be earned by decades of graft. This morality feels natural only because we don’t acknowledge our common interest in leisure. We accept that everyone wants higher pay, so why treat activities that enrich our culture as an extravagance?

The struggle to keep work in its proper place has already consumed a generation: the lifestyle guru Tim Ferriss published his bestseller The 4-Hour Workweek in 2007. It seems not all of us want to be our productive selves even on the toilet.

But it’s equally clear that blank slots carved out of our personal timetables are too flimsy: you cannot beat discipline with discipline. It would be better if we combined our productive energies and channelled them towards reviving the art of leisure.

Fully Automated Desert Dystopia

This is my latest newsletter published at Substack. Read more and subscribe here.

It takes some chutzpah, you would think, for Saudi Arabia to portray itself as a modern, forward-thinking state. Ruled by crown prince Mohammed bin Salman, the country is an authoritarian theocratic monarchy. Political parties are outlawed. No religion other than Islam can be openly practiced, and apostasy is legally punishable by death. Women have only been allowed to drive cars since 2018.

But there are other ways, outside the western liberal paradigm, for a regime to assert its progressive credentials – especially if it happens to control fifteen percent of the world’s known oil reserves. As bin Salman has shown, one effective way to harness the romance of “the future” is through design.

Witness Neom, a spectacular plan to fill the Saudi desert with hi-tech cities and resorts. One of these is already famous: “The Line,” advertised as a radically new kind of city. Nothing has been built yet, but the hype has been endless, and very successful in publicity terms. A new documentary by the Discovery Channel (or rather, a promotional film in the guise of a documentary) is just the latest instalment of it.

The Line promises to make science fiction reality. Two enormous parallel walls, taller than the Empire State Building, will run for 170km through the desert. In the narrow gap between them will be a “vertical city” for nine million people. That means, in effect, the population of greater London living in a single, very long skyscraper, just three times wider than a football pitch. Everything residents need will be accessible within five minutes; it will all be powered by renewable energy, fully automated, entirely car-free, etc. etc.

“Progressive,” “futuristic” and “poetic” is how The Line is described in the new film. As bin Salman himself puts it, “we have the cash, we have the land, we have the stability,” and now “we want to create the new civilisation for tomorrow.”

Digital rendering of life inside “The Line.” (Image: Neom)

The crown prince’s campaign for cultural prestige does not end there. His regime’s $650 billion Public Investment Fund is being ploughed into an array of fashionable consumer industries and green technologies, from coffee and vaping to electric cars and hydrogen-powered buses.

Bin Salman could be compared to the “enlightened despots” of the 18th century, rulers who used their absolute power to enact progressive reforms. And just as Enlightenment philosophers were happy to act as consultants for Frederick the Great or Catherine the Great, a long list of high-profile architects and designers have flocked to bin Salman’s court. Many of them, including the erstwhile enfant terrible and Archigram founder Peter Cook, can be seen praising their client’s imagination and insight in the Discovery Channel film.

As I wrote last year, Neom has revealed a certain synergy between designers and autocrats. Unaccountable rulers like bin Salman offer vast resources and creative freedom; his architects can implement “any kind of technology… or urban design solution,” as one project manager puts it. That is an opportunity ambitious designers dream about, and in return, many are only too happy to deliver a grandiose project that glorifies their employer’s power. If all this can be presented as vital to the future of humanity, so much the better.

But I think the enthusiasm for Neom reflects something deeper than artistic vanity, something in the makeup of modern design itself. By and large, designers like to create things that are functionally efficient, rational, and optimised for specific outcomes. That applies to urban planning as much as product design, the obvious difference being that in something as big and messy as a city, there is rarely an opportunity to start from scratch. New developments have to fit into an existing landscape that has evolved chaotically over time. 

With Neom, there are no such constraints. The designer’s love of functionality and order can be indulged to an enormous extent. Consider the fantasies of one planner, who wants to use her “passion as a tool for positive social change,” as reported by Bloomberg:

Imagine a sixth grader, she says. When he wakes up, his home will scan his metabolism. Because he had too much sugar the night before, the refrigerator will suggest porridge instead of the granola bar he wanted. Outside he’ll find a swim lane instead of a bus stop. Carrying a waterproof backpack, he’ll breaststroke the whole way to school. … If all goes well, she says, residents can expect an extra 10 years of ‘healthy life expectancy.’

This is human life reduced to a design problem, its smooth functioning almost indistinguishable from that of the technology that surrounds and supports it. The same tendency is apparent in the Discovery Channel film, where architects discuss The Line as though it were a new smartphone, rather than a supposed home for millions of people.

Digital rendering of life inside “The Line.” (Image: Neom)

This attitude recalls some of the worst excesses of Modernism, such as the “Functional City” discourse launched by CIAM in the 1930s. Here urban life was separated out into discrete “functions,” as though society were something that could be reorganised into labelled drawers. Today’s urbanism is based on very different ideas, but there is no reason to think the results will be any less remote from people’s real needs. A city is simply too complex a thing to be re-engineered from the top down; attempting to do so is pure arrogance.

The Line resembles nothing so much as a setting for a cookie-cutter Hollywood sci-fi. When a plan is consistently sold as “futuristic,” it generally means the designers are more interested in the concept and the aesthetics than the practical reality. One can only hope they are aware that little of it will actually be built, and are just happy to go along with a petro-dictator’s publicity stunt. The other possibility – that they think they can imagine a new society into being – would be far worse.

Nostalgia for the Concrete Age

This is my latest newsletter published at Substack. Read more and subscribe here.

Our forebears did not think as we do. In the late 1940s, the Soviet émigré Berthold Lubetkin served briefly as chief architect for Peterlee, a new town in England’s northeast coalmining region. The glorious centrepiece of Lubetkin’s vision? A highway carving through the middle of the town.

“Young couples,” he dreamed, “could sit on its banks watching the traffic, the economic pulse of the nation, with coal and pig iron in huge lorries moving south, while from the south would come loads of ice-cream and French letters.” (A French letter, in case you were wondering, is a condom).

Today this sounds vaguely dystopian, like a dark fantasy from the pages of J.G. Ballard. The English motorway, now a site of daily torture for thousands, is not easily romanticised. But the strange thing is that, if Elon Musk suddenly stumbled on some new mode of transport that made our roads obsolete, Lubetkin’s poetic view of asphalt and traffic would quickly resonate again. The M6 would become the subject of coffee table books.

That is the pattern we see with other architectural relics from the decades following the Second World War. Concrete is cool again. This week, I visited the boutique fashion brand Margaret Howell, which is marking the London Festival of Architecture with a gorgeous photography exhibition on the theme of British cooling towers. These are the enormous, gracefully curving concrete bowls that recycle water in power stations. But they are now disappearing along with the UK’s coal-fired power stations, and so the 20th Century Society is campaigning to save some of these “sculptural giants” for the nation’s heritage.

Elsewhere at the LFA, there is an exhibition celebrating gas holders, another endangered species of our vanishing industrial habitat. And this is all part of a much wider trend over the last decade. In that time, a number of histories have been published that try to rebut the negative view of Modernist architecture from the 1960s and 70s. As I mentioned in an earlier post, there was outrage among design aficionados when, in 2017, the Brutalist estate Robin Hood Gardens was demolished.

A candlelit dinner in Shropshire, UK, during the 1972 power cuts. In the background are the cooling towers of Ironbridge B power station. Image: David Bagnall.

The mania is still growing. In Wallpaper magazine’s recent list of exciting new architecture books – a good barometer of fashionable taste – there are more than ten which celebrate post-war Modernism and Brutalism.

I welcome this tenderness for a bygone age. We should save our finest cooling towers and council estates from the wrecking ball. Some of them are very beautiful, and certainly an important part of our history. But I am a romantic in these matters; I see just about any scrap of the past as the spiritual equivalent of gold or diamonds. The question is why creatives, a devoutly progressive bunch, have become so attached to the concrete age. They don’t show the same sympathy for the dominant tendency of any other period. Wallpaper does not promote eulogies for Georgian terraces or Edwardian monuments.

There is doubtless an element of épater la bourgeoisie here. Creatives like to kick against conventional taste, which has long regarded mass-produced housing as depressing and Brutalist buildings as an eyesore. Fashion goes where the unfashionable do not, which in the built environment means exposed concrete.

There are deeper reasons of course. They can be seen in Lubetkin’s utopian vision of the highway, a structure that brings workers the rewards their industry deserves. In Britain and elsewhere in western Europe, the post-war decades were a time of unusual commitment to equality, solidarity and social progress, the golden age of the welfare state. This “Spirit of ’45,” as Ken Loach’s sentimental film called it, is uniquely cherished by the British left in the same way the New Deal era is by the American one. 

Crucially, the architecture of the time is not only seen as embodying these ideals, but as embracing a bold, modern approach to form at the same time. It represents the dream that artistic virtue can dovetail with social virtue. This is most obvious in some of the ambitious housing, schools and municipal buildings designed between the 1950s and the 1970s, but even the cooling towers are part of this story. They were born from the nationalisation of Britain’s electricity grid in 1948.

What makes this lost world feel relevant now is that it ended with the arrival of the neoliberal era in the 1980s, and opposition to neoliberal principles has defined progressive politics in recent decades. Champions of the post-war project never fail to mention that, while it was far from perfect and sometimes disastrous, at least there was a commitment to providing decent conditions for everyone. 

Gérard Grandval’s Les Choux de Créteil (1974), photographed by Nigel Green for the new book Brutalist Paris

Still, let’s not pretend this reverence for the past is just about finding inspiration for the future. Empathising with the hopes and dreams of a distant era, savouring its aesthetic flavour, feeling the poignance of its passing: there is a word for this combination of emotional acts. It is nostalgia.

Of course that word is a dirty one now. At the cultural level, it is associated with irrationality, resentment, and hostility to change. It is pinned on old people who vote the wrong way. But that, surely, explains the appeal of the concrete age. Thanks to its creative and political legacy, it provides cover for people of a progressive bent to indulge the nostalgic sentiments they otherwise have to suppress.

It’s unfortunate that such alibis are needed. Nostalgia is not a pathology; it is part of the human condition. A deep-felt sense of loss is the inevitable and appropriate response to the knowledge that all things in this world, good and bad, are transient. The pathos of that fact can be deflected and disguised, but it cannot, ultimately, be denied.

Against Minimalism

This essay was published by The New Statesman in March 2023, under the headline “Life After Apple.”

Make it simple: this is the design formula that rules our world. 

It is the ethos of user-friendly minimalism, whereby complex gadgets are made both stylish and easy to operate. For this we can thank Apple, which probably designed or inspired the device on which you are reading this now. Though frankly, Apple does not need your thanks; having made its sleek aesthetic fashionable everywhere, it is now the most valuable company in the world.

The Apple look belongs to a different universe to earlier forms of minimalism. This is not Zen Japanese minimalism, chilly Scandi minimalism, or even stern Bauhaus minimalism. Apple’s approach is not about being content with less, or democratising good design. This minimalism belongs to a culture obsessed with personal productivity and consumption, and its purpose is to integrate our lives ever more closely with digital technology.   

Apple products owe their look and feel primarily to Jony Ive, chief designer at the company between 1997 and 2019. Ive’s brilliance lay in his ability to break down barriers between people and technology. He began by developing gadgets with an inviting, tactile appearance – see the jellybean-like white plastic and rounded corners of the classic iPod – making them seem less geeky and intimidating. Above all, Ive ruthlessly purged products of complexity at every level, from the engineering and controls to the digital interfaces and graphics.

The point was to make the user’s experience as frictionless as possible. Ive wanted us to feel thoughtlessly comfortable with our devices, intuitively grasping the purpose of every button and icon. In the process, he reached the holy grail of commercial product design: objects that people regard as special to them, despite everyone else having one too.

Some say Apple is already past its peak, even if its market share is still expanding. Gone is the era when the company was regularly launching transformative products: iMac, iPod, Macbook, iPhone, iPad, Apple Watch. But the Apple vision no longer needs its creator to continue metastasising: it is all around us, and its influence is only growing.

See the race to turn the car into an “iPhone on wheels”, corporate-speak for “another place to consume media”. The BMW i Vision Dee is one of several recent prototypes to follow this path. Its ultra-minimalist interiors do away with buttons, switches, and even screens, replacing them with a voice-controlled display on the windshield. Here you can read social media posts, watch films and one day do the metaverse thing. As BMW’s chief executive declared, encapsulating the main thrust of design today, “our car will integrate seamlessly with your digital lives”.

Contemporary minimalism gives new life to the motto “less is more”. Sleek devices pile up in the aspirational home, never quite amounting to clutter: KitchenAid juicers and Nespresso coffee machines, Alexas and Google Assistants, smart speakers and smart alarm clocks. Even sex toys look like Apple products now.

Domestic life is more and more a dance with machines, which listen to our conversations and monitor our sleep. The same principle operates in the digital world, where graphic design is streamlined into easily digestible blobs of colour and bold lettering, lubricating the endless flow of media. Across the board, corporate giants have stripped their logos of detail, from Burger King to Warner Bros, Burberry to Google.

In this way, minimalism has led us to mistake efficiency for beauty. It has provided aesthetic cover for a gamified capitalist ethic – produce, consume, compete – to penetrate ever deeper into our lives. The ultimate example is of course Apple’s own smartphones, laptops and tablets, whose subtle forms and neatly organised contents are tempting to use in any situation. And so the social pressures that live in these devices, the demands of work and the pull of the online crowd, have overrun both private and public life.

AirPods are another step in this conquest: a piece of tech discreet enough to allow media consumption wherever we may be. Together with the iPhone, these white earbuds (owned by three out of four American teenagers) have surely secured Apple the title of history’s most anti-social product designers. Public spaces are now populated by zombies staring at their phones or locked away in a private world of audio.

This inoffensive, tasteful simplicity is a deception. Minimalist devices may be user-friendly, but their lithium batteries and short lifespans are not friendly to the planet. There is nothing simple about Apple’s supply chain, which consists of around 1.5 million workers, most of them employed by contract manufacturers in China. Producing several hundred million iPhones annually breaks human beings. Worker turnover is so high that some factories effectively have to replace their entire labour force several times each year.

The cult of simplicity has spread beyond the realm of gadgets. As the writer Stephen Marche observed a few years back, contemporary novelists like Sally Rooney have abandoned the pursuit of a unique voice in favour of concise, vacuum-packed prose that might have been written by anyone. Popular non-fiction writers like Malcolm Gladwell and Yuval Noah Harari are similarly abstemious in their style, not to mention their habit of streamlining subject matter into big simple ideas.

These trends no doubt point back to the problem of information overload, but they also suggest that minimalism is becoming a cultural sensibility. Increasingly, successful literature resembles the compact efficiency of the book covers and typefaces in which it is packaged.

You may be asking, can the answer really be user-unfriendly design? That would be missing the point. “User-friendly” is only a virtue in products that reduce us to the status of users. Of course tools will always be necessary, and we might as well have good ones, but the goal today is to prevent our tools from taking over our lives.

Fashion and furniture designers, architects and art directors do not need to limit themselves to the functional requirements of user experience. It’s time for these aesthetic practitioners to break the spell of minimalism, which has made slabs of plastic and glass feel like the natural centre of our existence. Embrace the organic, the baroque, the maximalist. Embrace surrealism if you have to. Embrace difficulty and texture. Anything that does not come easily, or make itself instantly understood.

This revolt has already been brewing for some time, as Paris fashion week recently demonstrated. Among the works on show were Reebok and Botter’s curvaceous trainers inspired by seashells, and a sublime runway designed by Joana Vasconcelos, where textiles in the form of bulbous tentacles drooped from the ceiling. Yes, these strange shapes are more visual traffic passing across our screens. But perhaps a world designed in this spirit would remind us there is more to life than what our minimalist machines can offer.

Dilemmas of Displaying the Dead

This essay was published by UnHerd in June 2023.

The last weeks of Charles Byrne’s life were nightmarish. Known as the Irish Giant, the seven-foot seven-inch man from Ulster had made his way in 1782 to London, where he earned money by exhibiting himself as a freak. By the end of that year tragedy was overtaking him. He was addicted to alcohol and suffered from the painful effects of a pituitary tumour in his brain, the cause of his gigantism. The accrued savings of his 22 years of life — around £700 — had been stolen in a Haymarket pub.

Even in this condition, Byrne was allowed no dignity. The city’s anatomy schools were eager to dissect his body as a scientific prize. Among these circling vultures, none was more determined than the aptly named John Hunter, eminent surgeon, anatomist, and collector of organic specimens both animal and human.

A horrified Byrne had already rejected Hunter’s offer to buy his corpse and, in a final, desperate bid to escape the surgeon’s saws, asked his friends to encase his body in lead and sink it in the English Channel after he died. But Hunter managed to pay for the cadaver to be secretly removed from its coffin and transported to his home in Earl’s Court. There he boiled it down to its bones and reassembled it as a skeleton. “I lately got a tall man,” he hinted to a friend some years after.

The surgeon’s vast collection of pickled creatures and body parts would later become the nucleus of London’s Hunterian Museum. But last month, when the Hunterian reopened after a lengthy closure, the Irish Giant had been tactfully removed from display. After almost 250 years, John Hunter’s flouting of a dying man’s wishes is catching up with him.

There are, of course, many museums that display the remnants of people wrenched from their graves — or of those never allowed to lie down in them. Stories such as Byrne’s raise uncomfortable questions about this practice. When, if ever, do human remains cease to be human? Does the sanctity of death end at the borders of our own culture and era?

These issues have arisen before. Thirty years ago, the South African government demanded the return of Sara Baartman, a Khoisan woman who in the early-19th century was paraded around Europe, only to be dissected after her death and displayed in a Paris museum until the Seventies. But the morality of displaying human remains has become more broadly contentious in recent years.

In 2020, Oxford’s Pitt Rivers museum removed all of its human exhibits, including shrunken heads from Amazonia’s Shuar tribe, claiming that “visitors often understood the Museum’s displays of human remains as a testament to other cultures being ‘savage,’ ‘primitive’ or ‘gruesome,’” which “reinforced racist stereotypes”. Numerous British and American museums have changed their method of displaying Egyptian mummies, an enormous crowd-pleaser, using terms such as “mummified person” in an effort to humanise the objects.

It is striking, then, how proudly the Hunterian Museum now reveals its gruesome contents to the public. It seems Charles Byrne was omitted because, like Sara Baartman, he is a high-profile case, subject to ongoing controversy after the British Medical Journal covered it in 2011. But the museum is still packed with human remains, presented no differently from the countless animal specimens floating eerily in their glass jars. There is row upon row of skulls gathered from numerous continents, pickled brains, warped spines, infant skeletons, cabinets of teeth, all manner of internal organs, and foetuses ranging from nine weeks to full term. It is a truly ghoulish spectacle.

Hunter claimed to have “dissected some thousands” of human corpses. A small number did consent; the Georgian upper classes were warming to the idea of donating their bodies for scientific enquiry. An Archbishop of Canterbury, several military leaders and a serving prime minister (the Marquess of Rockingham) were among those who volunteered for Hunter’s knife.

But the vast majority who ended up in 18th-century anatomy theatres had no say in the matter. Some were wrestled away from their families beneath Tyburn Tree, the gallows in Hyde Park where dozens of criminals were hanged every year. Others were acquired through the bribing of undertakers. Most commonly though, they were stolen from their graves by gangs of professional body snatchers. Hunter himself almost certainly robbed graves in his youth, when he spent 12 years learning the ropes at his brother’s anatomy school.

The grim provenance of Hunter’s collection is addressed only in a brief wall text at the museum. Acknowledging the specimens were gathered “before modern standards of consent”, it states: “We recognise the debt owed to those people… who in life and death have helped to advance medical knowledge.” Why, then, has the display of Egyptian mummies come to be regarded as a sensitive problem, but less so the display of an unborn child probably removed from the womb of a stolen corpse?

One reason is simply that the humanity of the dead only becomes an issue when someone makes it an issue. The controversy over mummies, for instance, reflects a particular convergence of political beliefs: some modern Egyptians, not to mention the modern Egyptian state, now identify as descendants of the ancient civilisation on the Nile. At the same time, Western curators have become desperate to distance themselves from the colonial period during which these objects were acquired. By contrast, there are few people in Britain who feel so strongly about the scores of impoverished Londoners pulled from their shallow graves in the dead of night.

But there is another important difference. The claim that Hunter’s activities “have helped to advance medical knowledge” is a powerful one, linking his specimens with the achievements of modern medicine. It is also clearly true. Without a legal way to acquire bodies — and with religious beliefs making voluntary dissection unthinkable to many — only stolen corpses could produce the beginnings of the anatomical knowledge that we take for granted today. The museum subtly emphasises this by charting the development of surgery from the early-modern period to our own time: rather dull after the horror show of Hunter’s collection, but that’s the point I suppose.

Charles Byrne’s skeleton might be too controversial to display, but the museum has insisted on keeping it due to its medical value. It helped an American neurosurgeon to identify pituitary gigantism in 1909, and a century later, allowed scientists to find a genetic component in the growth disorder.

What all of this points to is the special status of medical science in Western countries today. Museums and other cultural institutions are increasingly critical of the heritage they embody because, ultimately, they no longer believe it has served a positive purpose that could mitigate the brutality of the past. This goes far beyond the problem of human remains; as Guardian critic Jonathan Jones notes about Tate Britain’s recent guilt-laden rehang: “Maybe it doesn’t want to promote British art, for it seems to disapprove of much of it.” Yet there are not many people arguing that we should abandon the benefits of modern medicine since it, too, has a disturbing history. This is one area where progress is still understood as building on the past rather than overturning it: the only acceptable agenda for healthcare is more and better.

But Hunter’s collection also reveals a deep tension in the way we value medical science. If we consider it dehumanising to display body parts in jars, it is partly because we now struggle to recognise blood and tissue as human. Our technical mastery over biology has led to our alienation from it. Just as we expect our meat to arrive immaculately packaged in the supermarket, carrying no trace of the abattoir, so we banish birth, illness, and death from our everyday lives, consigning them to the clinical world of the hospital. We have never been more preoccupied with the condition of our bodies, yet we don’t like to see those bodies for what they really are.