The Weird World of Cruise Ships

This essay appeared in my regular newsletter, The Pathos of Things, in August 2023. Subscribe here.

I have never been on a cruise ship, and probably never will. To be trapped in a crowded tourist resort miles from the nearest land is relatively close to my idea of hell. Yes, with enough alcohol and the right company, I can imagine it would be fun for a day or two (readers are of course welcome to sponsor a research trip). A week would probably drive me to the edge of insanity.

Still, I find these seaborne cities fascinating, enchanting even. If you boiled the spirit of the modern hospitality business down to its purest concentrate, you would surely get something like the Icon of the Seas, the enormous new cruise ship whose digital renderings have recently gone viral online. Imagine Hieronymus Bosch painting Disneyland, and you get some sense of the diabolical energy of this project. I can think of no other artefact that employs design, engineering and industrial capacity on this scale for the sole purpose of escapism and pleasure.

When the Icon of the Seas begins roaming the Caribbean in January, its 7,500 passengers will have access to on-board water parks, surf simulators and climbing walls, along with the usual menagerie of bars, restaurants, cinemas, theatres and gyms, all ranged over twenty decks. This represents a pinnacle of luxury consumption and of travel as an organised departure from reality (cruise ships have no worldly destination, sailing in giant circles back to their origin). It should come as no surprise that the recently deceased former Italian prime minister and media mogul Silvio Berlusconi, who pioneered the contemporary mode of politics as entertainment, started out as a singer on cruise ships.

While a large majority of passengers once came from North America, cruise holidays are now wildly popular around the world, visiting not just the Caribbean but the Mediterranean, Baltic, North Pacific and South China Sea. Passenger numbers are climbing back towards their pre-pandemic peak of almost thirty million per year.

The origins of this institution lie with the great ocean liners of the late-19th and early-20th centuries. These were already floating fantasy habitats, offering first class passengers much more than the luxuries portrayed in James Cameron’s Titanic. As Thomas Kepler writes, the most extravagant German liners boasted “Louis XIV and Moorish lounges, winter gardens, and restaurants… sometimes two or three decks in height, gilded to the hilt with Wagnerian kitsch.” The Imperator, launched in 1912, had a first-class smoking room “in the guise of a Bavarian hunting lodge,” as well as a neoclassical swimming pool called the Pompeian Bath. The White Star Line’s Olympic (sister ship to the Titanic) boasted a Turkish bath, a gymnasium and squash courts.

With the emergence of modern cruise ships in the 1960s, some of them converted ocean liners, such seaborne decadence gradually became available to a prosperous American middle class. One difference was that ocean liners were a means of transport, not simply indulgence, so their aesthetics were partly designed to distract from rough seas.

In the intervening decades, cruise ships have become such enormous and complex artefacts that very few shipyards are capable of building the biggest specimens. Just four in fact, of which three, interestingly, are in mainland Europe. But you shouldn’t imagine the ships being built from scratch in a single place. Construction is now highly modular, with up to eighty separate “blocks” – themselves composed of sub-blocks – being built simultaneously at different locations, complete with electrics, plumbing and furniture, before they are all slotted together. The result is an incredible concentration of different technologies, from gourmet kitchens to discotheques, in a single designed object.

The reason cruise ships have become so huge is largely financial: more passengers, more cash flow and fewer overheads. Since speed is no longer a priority, the elegant profile of the ocean liner has given way to the floating tub, maximising space for people and attractions. But reaping these benefits requires vast upfront capital investments and technical expertise, which means fewer and fewer firms can compete. The same dynamics can be found in many fields of industrial design; reading about the cruise ship industry is, strangely, quite similar to reading about the semiconductor industry.  

As a form of architecture, the cruise ship might recall Robert Venturi and Denise Scott Brown’s celebration of Las Vegas as a democratic landscape of fun, as meaning for the masses. But we could equally look in the opposite direction from these postmodern theorists, to the great Modernist ideologue Le Corbusier.

The Unité d’Habitation, Le Corbusier’s famous 1952 housing block in Marseilles, bears many striking similarities to the cruise ship. It too uses a modular structure of identical apartment units, fitted like cells into a concrete frame. In terms of overall form, it certainly resembles something modern civilization would put on the sea. What is more, in keeping with the Modernist vision of the self-contained community, the Unité incorporates a range of leisure facilities, including a shopping arcade, running track, gymnasium and rooftop paddling pool.

What to make of these echoes? It is tempting to see the cruise ship as a kind of temporary reprieve from suburbanisation. Around the world, middle class families have spurned cities for larger, more dispersed dwellings, which presumably makes an occasional intense shot of communal experience quite attractive. Maybe the cruise holiday is, like the Unité d’Habitation, a form of utopian urbanism: a rare opportunity for designers to create a dense, cosmopolitan environment that is genuinely popular.

The lesson for urban planners? Free cocktails make a big difference.

Space architecture: a moonage daydream?

This essay was originally published by Engelsberg Ideas in January 2024.

In January, the architecture studio Hassell published designs for a settlement to house 144 people on the moon. Commissioned by the European Space Agency (ESA), this ‘Lunar Habitat Master Plan’ shows a series of inflatable pods in which settlers will live, eat, exercise and cultivate plants. A protective shell of lunar soil, to be 3D-printed on site, will shield the structures from devastating levels of radiation on the moon’s surface. Nor will this life be without leisure and style. Hassell’s renderings include a cocktail bar with elegant coffee tables, atmospheric lighting, and moulded chairs that carefully echo the imagery of science fiction.

This proposal is just the latest in a growing genre of architectural projects for sites beyond Earth. The last decade has seen a flurry of eye-catching designs for both the moon and Mars. Hassell has previously imagined something even more swish than its lunar cocktail bar: a Martian abode with timber-effect flooring, houseplants and minimalist furniture. It is a vision fit for an IKEA advert, down to the young couple whose relaxing evening we can glimpse through the generous picture window. Meanwhile, NASA has enlisted architects from two studios, Bjarke Ingels Group and SEArch+, to work on lunar structures. Both firms have already been involved in space-related projects, with the latter proposing a Martian community living in igloos of sculpted ice.

Another idea for Mars comes from the high-profile Foster + Partners: mound-like dwellings of fused rubble, assembled by teams of robots arriving via parachute. Perhaps the most ambitious concept, courtesy of the architects at Abiboo, imagines a vast rabbit warren of tunnels embedded in a Martian cliff-face, containing a metropolis of 250,000 people.

I could go on, but it should already be apparent that this burgeoning field of space architecture involves considerably more fantasy than concrete planning. The problem is not necessarily a lack of detail: many projects indulge in technical discussion of materials, construction methods and service systems. But given that the longest habitation of the moon to date was the three days the Apollo 17 crew spent camping in their rover in 1972, while no person has ever set foot on Mars, it is clear these futuristic structures and fashionable interiors really belong to the realm of science fiction. We shouldn’t be surprised that the winners of one NASA competition for a Mars base also want to ‘harness the power of comets for interplanetary transportation’, or that Abiboo’s Martian city proposal requires steel-making technology ‘that will need to be developed’.

So what exactly is the point of these designs, and why are agencies such as NASA and the ESA bothering to commission them? Ultimately, speculative space projects tell us more about architecture as a practice and an industry here on Earth than they do about future settlements on distant celestial bodies. This is not to say that such schemes will never bear fruit, but such fruits are likely to emerge much closer to home, as part of architecture’s ongoing fascination with the idea of space.

The notion of lunar or Martian architecture is not necessarily absurd. We are on the cusp of a new Space Age, and this time the aim is not just to visit other parts of the solar system, but to establish a lasting presence there. The first destination is the south pole of the moon, where there are craters containing frozen water. Last August, India’s Chandrayaan-3 mission landed an unmanned spacecraft there for the first time. The main players, however, are the United States and China, who have both assembled international coalitions for space exploration. NASA’s Artemis program, in cooperation with more than 30 nations, hopes to have astronauts at the lunar pole by the end of the decade, and built structures in place by 2040. Unlike the earlier Apollo missions, Artemis can draw on a buoyant private-sector space industry, including rockets and spacecraft designed by Elon Musk’s SpaceX. Musk has claimed that he is amassing his vast personal wealth in order to make humanity a ‘multi-planetary species’.

Meanwhile, on a similar timetable, China’s Chang’e program is hoping to establish its own International Lunar Research Station, with partners including Russia, Egypt and Pakistan. Both the American- and Chinese-led efforts have ostensibly scientific aims. The moon promises a wealth of information about the distant past of the solar system, including the formation of Earth itself, but it would be naïve to imagine this new space race is about a disinterested search for knowledge. The moon is expected to furnish valuable supplies of Helium-3, potentially useful for nuclear fusion, as well as rare earth metals. More importantly, the hydrogen and oxygen at the moon’s south pole can be used for rocket propellant, allowing space travellers to continue on to further destinations. This is no longer just idle speculation: all parties now see the moon as a stepping-stone towards the future colonisation of Mars.

The problem is that building anything in these distant places, let alone living there, involves enormous challenges of engineering and logistics. Lunar pioneers will need to endure temperature swings of hundreds of degrees Celsius between day and night, along with constant bombardment by micrometeorites. On Mars, they can look forward to average temperatures of -60°C and frequent deadly sandstorms. Both places are subject to intense cancer-causing radiation, and a total absence of soil suitable for growing food. Even breathable air needs to be manufactured.

Space agencies are investigating potential infrastructure for a moon base, including water extraction, satellites for communication, and electricity from solar and nuclear plants. The difficulty of transporting heavy materials such a long way, however, means that any construction will have to make use of the sparse elements already in situ, especially ice, water, and the rubble known as regolith. There will be no workforce, so structures will need to be built robotically or by the astronauts themselves. Before we worry about the aesthetics of the lunar hospitality industry, engineers face a more basic question: can we even make bricks in these conditions? The firm that NASA is hoping will develop its construction technology, the 3D-printing specialists ICON, has only been receiving funding since 2020.

Thinking ahead can be valuable. There is little use developing construction techniques if we don’t have some sense of how we want to employ them. By visualising the endpoint, architecture can help to specify the problems that the engineers need to solve. Besides, living on the moon for an extended period will require much more than engineering. According to architect Hugh Broughton, whose work on polar research bases is providing inspiration for NASA, the deeper problem is ‘how architecture can respond to the human condition’. When designing for extreme conditions, the architect has to consider ‘how you deal with isolation, how you create a sense of community… how you support people in the darkness’. This is worth bearing in mind when we see lunar proposals that resemble a leisure centre or cruise ship. Research bases in the Antarctic include amenities such as cinemas and themed bars. Their purpose is, above all, to provide a sense of familiarity. The German firm Duravit has even considered this psychological need in its design for a Martian toilet, which resembles an earthbound one despite working in a different way.

Nonetheless, there remains an indulgent quality to recent space designs. The Ukrainian studio Makhno has envisaged a Martian settlement along the lines of a luxury spa, complete with swimming pools and bucolic rock gardens. It even has a meditation capsule for ‘recovery, restart of consciousness, and immersion into the inner’. No doubt there is publicity value for architects in these imaginative projects – like the concept cars that automakers show at fairs – but this then raises the question of what virtues architects are trying to showcase, and why space is the appropriate medium.

Decades ago, the architect and critic Kenneth Frampton noted a tendency in the avant-garde to imagine impossible projects, which he diagnosed ‘as the return of a repressed creativity, as the implosion of utopia upon itself’. Frampton was pointing to the tension between the ideals and criteria of excellence that animate modern architecture and the highly constrained reality in which architects actually operate. Architecture aspires to engage with deep problems of social life, and also to achieve aesthetic or technical brilliance. Yet, across much of the developed world, innovative approaches to the built environment are stifled by all manner of restraints, from restrictive planning codes to the profit margins of property developers. There may be more scope for originality when it comes to designing fancy hotels, art museums and corporate office buildings, but such displays tend to make architecture the handmaiden of business or entertainment.

Following Frampton’s critique, we could see space architecture as a means to demonstrate, in the realm of imagery, the ambition and purpose that are so rarely possible in real buildings. Designing for the moon or Mars offers not just the grandeur of a new frontier in human history, but the creative freedom of a genre whose boundaries are yet to be established, and which is in any case largely speculative.

More particularly, these projects allow a kind of imaginative return to the heroic age of modern architecture, the 1920s and 30s. In the turmoil of Europe following the First World War, for a short while it appeared that the designers of buildings and cities could shape the future of humanity. They felt emboldened to break with the past and develop rational, efficient and functional answers to the problems of modern society. The results ranged from the visionary doctrines of Le Corbusier to the granular ingenuity of a figure such as Grete Schütte-Lihotzky, designer of the galley kitchen, whose basic template we still use today. Space architecture provides a similar opportunity to address fundamental questions of design, from materials and form to the arrangement of functions in the overall plan, without the weight of convention obstructing the most elegant solution.

If architects can use space projects as an outlet for repressed creativity, their work serves a rather different purpose for the organisations that commission them. In an era when imagery carries immense power, digital renderings have become a political medium, called on to visualise the imagined marvels of the future. And space exploration is deeply political. It embroils the relevant agencies in a constant struggle for government funds, forcing them to confront public misgivings about the necessity and cost of their activities. Since the Soviet Union launched Sputnik, the first satellite, in 1957, such doubts have been countered in part through the imaginative appeal of the final frontier, an appeal that has only grown with the rise of visual media. In 2021, when NASA’s Perseverance rover explored the surface of Mars, social media users were thrilled to hear audio of Martian winds, and to see a horizon with Earth sparkling in the distance; that this particular photograph turned out to be fake only underscored the role of fantasy in these matters. Architectural imagery gives form to dreams of colonising the solar system. It thereby helps to justify space exploration not just to politicians, but to a wider audience of media consumers.

Space design reveals a peculiar alignment of interests between architects and their clients. The former can apply themselves to a heroic paper architecture – or rather, digital architecture – for which there is little scope on Earth; the latter, meanwhile, can justify their budgets with images that invoke an exciting horizon of possibility. It would be short-sighted, however, to consider such projects only in terms of their immediate motives and plausibility.

The consequences of human engagement with space have always been dynamic and unpredictable. Technology provides the clearest examples: NASA laboratories have inadvertently contributed to all kinds of everyday products, from camera phones and baby formula to running shoes and wireless headsets. We can already see the potential for such transfers in the drive to build in other parts of the solar system. At the University of Nebraska, a team led by the engineer Congrui Jin has been developing organic construction materials for Mars. Jin thinks that certain kinds of fungi can assemble minerals from Martian sand and grow them into bricks. If successful, such techniques could find numerous applications on Earth, starting with the repair of damaged concrete and provision of refugee housing in remote areas.

Architecture has its own story to tell about the role of space exploration in the modern world. When Kenneth Frampton made his comments about the frivolity of avant-garde ideas, he had in mind a group of students and municipal architects that appeared in Britain during the 1960s, known as Archigram. In their magazine of the same title, the group explored fantastical and futuristic schemes like the ‘Walking City’ (that is, a city with legs) and the ‘Underwater City,’ while devising all manner of pods, capsules and bubbles in which people might one day dwell. These fantasies were informed by the technology and aesthetics of the Space Age. As Archigram 3 put it, architecture needed something ‘to stand alongside the space capsules, computers and throw-away packages of an atomic/electronic age.’ The first issue had invoked ‘the poetry of countdown… [and] orbital helmets,’ while the second took inspiration from ‘Lunar architecture and shredded wheat… the radiator grille and the launching pad.’

Archigram’s provocations were often more outlandish than the space habitats of recent years. And yet, the group’s influence has been profound. Their disciples include leading practitioners of high-tech architecture, such as Norman Foster, Renzo Piano and the late Richard Rogers, who have designed many of the world’s most notable buildings over the past half-century. While Archigram dreamed of structures that would capture the ethos of the space age, these architects have designed and built them, often using materials and methods borrowed from the aerospace industry. We can even detect Archigram’s spirit in the inflatable pods that feature in many recent proposals for the moon and Mars.

Strangely then, space architecture is not really the fresh departure it appears to be. When we look at new schemes for settlements beyond Earth, we are seeing a long-extant futuristic sensibility that is now straining closer towards material reality. By the same token, even if we don’t see cocktail bars on the moon anytime soon, the ideas stimulated by such projects may still prove influential for the future of design.

The Lost Art of Leisure

This essay was published by the New Statesman in May 2023, under the headline “You Should Only Work Four Hours a Day.”

Decades ago, Roland Barthes quipped that “one is a writer as Louis XIV was king, even on the toilet”. He was mocking the way literary types like to distinguish themselves from the mass of working people. According to Barthes, writers insist that their productive activities are not limited to any time and place, but flow constantly like an “involuntary secretion”.

Well, we are all writers now, at least in this sense. Stealing a few holiday hours to work on an article used to be my party trick. Now I find that, on Mondays and Fridays when many office buildings stand empty, my salaried comrades are sending emails from an Airbnb somewhere. Come the weekend, they might close their laptops, but they don’t stop checking their phones.

Of course this hardly compares with the instability further down the pay scale. Around one in seven British workers now do gig-economy jobs like Uber or Amazon delivery at least once a week, according to research for the Trades Union Congress, many of them on top of full-time employment.

Work today is fluid, overflowing its traditional boundaries and seeping into new domains. Meditation and exercise look suspiciously like personal optimisation. Artistic vocations centre on tireless self-promotion to a virtual audience. A movement of “homesteaders” churning their own butter and knitting their own jumpers is simply cosplaying older forms of work, and probably posting the results on Instagram.

With the help of our digital tools, we are adapting ourselves to productivity as involuntary secretion. The result is an evisceration of personal life and an epidemic of burnout.

Our diffuse working culture has attracted plenty of critiques. The problem is most of them share the basic outlook that enabled the spread of work to begin with. Should we recognise “quiet quitting” as a justified response to unreasonable demands by employers? Is rest a form of “resistance”? Do we all just need a better “work-life balance”? These arguments present life as a two-way split between work and some nondescript realm of personal freedom, the question being how we can reclaim time from one for the sake of the other.

As long as the alternative to work remains just a negative space, work will continue leaching into it. What we are missing is a real counterbalance: a positive vision of leisure.

Properly speaking, leisure is not rest or entertainment, though it can provide both. It is not mere fun, though it ought to be satisfying. Its forms change over time, but it generally involves elements of play, fantasy and connection with other people or the natural world. Most importantly, leisure is superfluous to our worldly needs and ambitions: something we do not as a means to any end, but simply for its own sake.

Truly mass participation in leisure was a striking feature of British life in the early 20th century. People played in brass bands and raced pigeons. They learned to dance and performed in plays and choirs. In 1926 nearly 4,000 working-class anglers from Birmingham took part in a single fishing competition along 20-odd miles of river. During the 1930s, as the historian Ross McKibbin writes, “one of the great sights of the English weekend were the fleets of cyclists riding countrywards along the arterial roads of the major towns”.

People still do these things, of course, but they do them as hobbies. The hobby belongs to a culture defined by work: it is a creature of downtime and a quirk of character. Hobbies rely on individual enthusiasm, so they often collapse in the face of stress or time pressure. Besides, we tend to judge them by the unleisurely criteria of self-improvement. Physical and intellectual pursuits are admirable, since they bring fitness and cultural capital. Excessive interest in bird watching marks you out as an eccentric.

Taking the superfluous seriously is a brave act in a utilitarian world, so leisure needs its own social legitimacy to thrive. This used to come from class-based associational life, with its clubs, unions and organised religion. If video games and social media smack of pseudo-leisure, it is because they are often part of a lonely struggle with the productivity impulse: they palliate restless and atomised minds. Maybe the only forms of leisure with a more than marginal role in popular culture today are amateur football, travel and the pub.

Aristotle thought a political community should exist to provide the conditions for leisure, which he saw as the key to human flourishing. At the very least, it is crucial for a balanced existence. Meaningful work, entertainment and indulgence all have their place, but they become destructive in excess. Life should be more than an on/off switch. Leisure is the space for conversation and reflection, friendship and loyalty, playfulness and joie de vivre. These are not qualities we can develop because we want them on our CVs: they are by-products of doing something for its own sake.

In a more civilised society, leisure would define our identities as much as labour does. To see what a distant prospect that is, try to imagine a politician talking about activities that might bring satisfaction to our lives half as much as he or she talks about “ordinary working people” or “hard-working families”. Celebrating leisure would be branded out-of-touch, but that is because we have accepted the disgraceful assumption that enjoyable pastimes are only for those who can afford them.

Asset-holding baby boomers are the masters of leisure today, using retirement for tourism, sport and artistic dabbling. Good for them. Still, we should resist the idea that such opportunities must be earned by decades of graft. This morality feels natural only because we don’t acknowledge our common interest in leisure. We accept everyone wants higher pay, so why treat activities that enrich our culture as an extravagance?

The struggle to keep work in its proper place has already consumed a generation: the lifestyle guru Tim Ferriss published his bestseller The 4-Hour Workweek in 2007. It seems not all of us want to be our productive selves even on the toilet.

But it’s equally clear that blank slots carved out of our personal timetables are too flimsy: you cannot beat discipline with discipline. It would be better if we combined our productive energies and channelled them towards reviving the art of leisure.

Fully Automated Desert Dystopia

This is my latest newsletter published at Substack. Read more and subscribe here.

It takes some chutzpah, you would think, for Saudi Arabia to portray itself as a modern, forward-thinking state. Ruled by crown prince Mohammed bin Salman, the country is an authoritarian theocratic monarchy. Political parties are outlawed. No religion other than Islam can be openly practiced, and apostasy is legally punishable by death. Women have only been allowed to drive cars since 2018.

But there are other ways, outside the western liberal paradigm, for a regime to assert its progressive credentials – especially if it happens to control fifteen percent of the world’s known oil reserves. As bin Salman has shown, one effective way to harness the romance of “the future” is through design.

Witness Neom, a spectacular plan to fill the Saudi desert with hi-tech cities and resorts. One of these is already famous: “The Line,” advertised as a radically new kind of city. Nothing has been built yet, but the hype has been endless, and very successful in publicity terms. A new documentary by the Discovery Channel (or rather, a promotional film in the guise of a documentary) is just the latest instalment of it.

The Line promises to make science fiction reality. Two enormous parallel walls, taller than the Empire State Building, will run for 170km through the desert. In the narrow gap between them will be a “vertical city” for nine million people. That means, in effect, the population of Greater London living in a single, very long skyscraper, just three times wider than a football pitch. Everything residents need will be accessible within five minutes, it will all be powered by renewable energy, fully automated, entirely car-free, etc. etc.

“Progressive,” “futuristic” and “poetic” is how The Line is described in the new film. As bin Salman himself puts it, “we have the cash, we have the land, we have the stability,” and now “we want to create the new civilisation for tomorrow.”

Digital rendering of life inside “The Line.” (Image: Neom)

The crown prince’s campaign for cultural prestige does not end there. His regime’s $650 billion Public Investment Fund is being ploughed into an array of fashionable consumer industries and green technologies, from coffee and vaping to electric cars and hydrogen-powered buses.

Bin Salman could be compared to the “Enlightened despots” of the 18th century, rulers who used their absolute power to enact progressive reforms. And just as Enlightenment philosophers were happy to act as consultants for Frederick the Great or Catherine the Great, a long list of high-profile architects and designers have flocked to bin Salman’s court. Many of them, including the erstwhile enfant terrible and Archigram founder Peter Cook, can be seen praising their client’s imagination and insight in the Discovery Channel film. 

As I wrote last year, Neom has revealed a certain synergy between designers and autocrats. Unaccountable rulers like bin Salman offer vast resources and creative freedom; his architects can implement “any kind of technology… or urban design solution,” as one project manager puts it. That is an opportunity ambitious designers dream about, and in return, many are only too happy to deliver a grandiose project that glorifies their employer’s power. If all this can be presented as vital to the future of humanity, so much the better.

But I think the enthusiasm for Neom reflects something deeper than artistic vanity, something in the makeup of modern design itself. By and large, designers like to create things that are functionally efficient, rational, and optimised for specific outcomes. That applies to urban planning as much as product design, the obvious difference being that in something as big and messy as a city, there is rarely an opportunity to start from scratch. New developments have to fit into an existing landscape that has evolved chaotically over time. 

With Neom, there are no such constraints. The designer’s love of functionality and order can be indulged to an enormous extent. Consider the fantasies of one planner, who wants to use her “passion as a tool for positive social change,” as reported by Bloomberg:

Imagine a sixth grader, she says. When he wakes up, his home will scan his metabolism. Because he had too much sugar the night before, the refrigerator will suggest porridge instead of the granola bar he wanted. Outside he’ll find a swim lane instead of a bus stop. Carrying a waterproof backpack, he’ll breaststroke the whole way to school. … If all goes well, she says, residents can expect an extra 10 years of ‘healthy life expectancy.’

This is human life reduced to a design problem, its smooth functioning almost indistinguishable from that of the technology that surrounds and supports it. The same tendency is apparent in the Discovery Channel film, where architects discuss The Line as though it were a new smartphone, rather than a supposed home for millions of people.

Digital rendering of life inside “The Line.” (Image: Neom)

This attitude recalls some of the worst excesses of Modernism, such as the “Functional City” discourse launched by CIAM in the 1930s. Here urban life was separated out into discrete “functions,” as though society were something that could be reorganised into labelled drawers. Today’s urbanism is based on very different ideas, but there is no reason to think the results will be any less remote from people’s real needs. A city is simply too complex a thing to be re-engineered from the top down; attempting to do so is pure arrogance.

The Line resembles nothing so much as a setting for a cookie-cutter Hollywood sci-fi. When a plan is consistently sold as “futuristic,” it generally means the designers are more interested in the concept and the aesthetics than the practical reality. One can only hope they are aware that little of it will actually be built, and are just happy to go along with a petro-dictator’s publicity stunt. The other possibility – that they think they can imagine a new society into being – would be far worse.


Nostalgia for the Concrete Age

This is my latest newsletter published at Substack. Read more and subscribe here.

Our forebears did not think as we do. In the late 1940s, the Soviet émigré Berthold Lubetkin served briefly as chief architect for Peterlee, a new town in England’s northeast coalmining region. The glorious centrepiece of Lubetkin’s vision? A highway carving through the middle of the town.

“Young couples,” he dreamed, “could sit on its banks watching the traffic, the economic pulse of the nation, with coal and pig iron in huge lorries moving south, while from the south would come loads of ice-cream and French letters.” (A French letter, in case you were wondering, is a condom).

Today this sounds vaguely dystopian, like a dark fantasy from the pages of J.G. Ballard. The English motorway, now a site of daily torture for thousands, is not easily romanticised. But the strange thing is that, if Elon Musk suddenly stumbled on some new mode of transport that made our roads obsolete, Lubetkin’s poetic view of asphalt and traffic would quickly resonate again. The M6 would become the subject of coffee table books.

That is the pattern we see with other architectural relics from the decades following the Second World War. Concrete is cool again. This week, I visited boutique fashion brand Margaret Howell, which is marking the London Festival of Architecture with a gorgeous photography exhibition on the theme of British cooling towers. These are the enormous, gracefully curving concrete bowls that recycle water in power stations. Except they are now disappearing along with the UK’s coal-fired power stations, and so the 20th Century Society is campaigning to save some of these “sculptural giants” for the nation’s heritage.

Elsewhere at the LFA, there is an exhibition celebrating gas holders, another endangered species of our vanishing industrial habitat. And this is all part of a much wider trend over the last decade. In that time, a number of histories have been published that try to rebut the negative view of Modernist architecture from the 1960s and 70s. As I mentioned in an earlier post, there was outrage among design aficionados when, in 2017, the Brutalist estate Robin Hood Gardens was demolished.

A candlelit dinner in Shropshire, UK, during the 1972 power cuts. In the background are the cooling towers of Ironbridge B power station. Image: David Bagnall.

The mania is still growing. In Wallpaper magazine’s recent list of exciting new architecture books – a good barometer of fashionable taste – there are more than ten which celebrate post-war Modernism and Brutalism.

I welcome this tenderness for a bygone age. We should save our finest cooling towers and council estates from the wrecking ball. Some of them are very beautiful, and certainly an important part of our history. But I am a romantic in these matters; I see just about any scrap of the past as the spiritual equivalent of gold or diamonds. The question is why creatives, a devoutly progressive bunch, have become so attached to the concrete age. They don’t show the same sympathy for the dominant tendency of any other period. Wallpaper does not promote eulogies for Georgian terraces or Edwardian monuments.

There is doubtless an element of épater la bourgeoisie here. Creatives like to kick against conventional taste, which has long regarded mass-produced housing as depressing and Brutalist buildings as an eyesore. Fashion goes where the unfashionable do not, which in the built environment means exposed concrete.

There are deeper reasons of course. They can be seen in Lubetkin’s utopian vision of the highway, a structure that brings workers the rewards their industry deserves. In Britain and elsewhere in western Europe, the post-war decades were a time of unusual commitment to equality, solidarity and social progress, the golden age of the welfare state. This “Spirit of ’45,” as Ken Loach’s sentimental film called it, is uniquely cherished by the British left in the same way the New Deal era is by the American one. 

Crucially, the architecture of the time is not only seen as embodying these ideals, but as embracing a bold, modern approach to form at the same time. It represents the dream that artistic virtue can dovetail with social virtue. This is most obvious in some of the ambitious housing, schools and municipal buildings designed between the 1950s and 70s, but even the cooling towers are part of this story. They were born from the nationalisation of Britain’s electricity grid in 1948.

What makes this lost world feel relevant now is that it ended with the arrival of the neoliberal era in the 1980s, and opposition to neoliberal principles has defined progressive politics in recent decades. Champions of the post-war project never fail to mention that, while it was far from perfect and sometimes disastrous, at least there was a commitment to providing decent conditions for everyone. 

Gérard Grandval’s Les Choux de Créteil (1974), photographed by Nigel Green for the new book Brutalist Paris

Still, let’s not pretend this reverence for the past is just about finding inspiration for the future. Empathising with the hopes and dreams of a distant era, savouring its aesthetic flavour, feeling the poignance of its passing: there is a word for this combination of emotional acts. It is nostalgia.

Of course that word is a dirty one now. At the cultural level, it is associated with irrationality, resentment, and hostility to change. It is pinned on old people who vote the wrong way. But that, surely, explains the appeal of the concrete age. Thanks to its creative and political legacy, it provides cover for people of a progressive bent to indulge the nostalgic sentiments they otherwise have to suppress.

It’s unfortunate that such alibis are needed. Nostalgia is not a pathology; it is part of the human condition. A deep-felt sense of loss is the inevitable and appropriate response to the knowledge that all things in this world, good and bad, are transient. The pathos of that fact can be deflected and disguised, but it cannot, ultimately, be denied.


Against Minimalism

This essay was published by The New Statesman in March 2023, under the headline “Life After Apple.”

Make it simple: this is the design formula that rules our world. 

It is the ethos of user-friendly minimalism, whereby complex gadgets are made both stylish and easy to operate. For this we can thank Apple, which probably designed or inspired the device on which you are reading this now. Though frankly, Apple does not need your thanks; having made its sleek aesthetic fashionable everywhere, it is now the most valuable company in the world.

The Apple look belongs to a different universe to earlier forms of minimalism. This is not Zen Japanese minimalism, chilly Scandi minimalism, or even stern Bauhaus minimalism. Apple’s approach is not about being content with less, or democratising good design. This minimalism belongs to a culture obsessed with personal productivity and consumption, and its purpose is to integrate our lives ever more closely with digital technology.   

Apple products owe their look and feel primarily to Jony Ive, chief designer at the company between 1997 and 2019. Ive’s brilliance lay in his ability to break down barriers between people and technology. He began by developing gadgets with an inviting, tactile appearance – see the jellybean-like white plastic and rounded corners of the classic iPod – making them seem less geeky and intimidating. Above all, Ive ruthlessly purged products of complexity at every level, from the engineering and controls to the digital interfaces and graphics.

The point was to make the user’s experience as frictionless as possible. Ive wanted us to feel thoughtlessly comfortable with our devices, intuitively grasping the purpose of every button and icon. In the process, he reached the holy grail of commercial product design: objects that people regard as special to them, despite everyone else having one too.

Some say Apple is already past its peak, even if its market share is still expanding. Gone is the era when the company was regularly launching transformative products: iMac, iPod, Macbook, iPhone, iPad, Apple Watch. But the Apple vision no longer needs its creator to continue metastasising: it is all around us, and its influence is only growing.

See the race to turn the car into an “iPhone on wheels”, corporate-speak for “another place to consume media”. The BMW i Vision Dee is one of the recent prototypes that follow this path. Its ultra-minimalist interiors do away with buttons, switches, and even screens, replacing them with a voice-controlled display on the windshield. Here you can read social media posts, watch films and one day do the metaverse thing. As BMW’s chief executive declared, encapsulating the main thrust of design today, “our car will integrate seamlessly with your digital lives”.

Contemporary minimalism gives new life to the motto “less is more”. Sleek devices pile up in the aspirational home, never quite amounting to clutter: KitchenAid juicers and Nespresso coffee machines, Alexas and Google Assistants, smart speakers and smart alarm clocks. Even sex toys look like Apple products now.

Domestic life is more and more a dance with machines, which listen to our conversations and monitor our sleep. The same principle operates in the digital world, where graphic design is streamlined into easily digestible blobs of colour and bold lettering, lubricating the endless flow of media. Across the board, corporate giants have stripped their logos of detail, from Burger King to Warner Bros, Burberry to Google.

In this way, minimalism has led us to mistake efficiency for beauty. It has provided aesthetic cover for a gamified capitalist ethic – produce, consume, compete – to penetrate ever deeper into our lives. The ultimate example is of course Apple’s own smartphones, laptops and tablets, whose subtle forms and neatly organised contents are tempting to use in any situation. And so the social pressures that live in these devices, the demands of work and the pull of the online crowd, have overrun both private and public life.

Airpods are another step in this conquest: a piece of tech discreet enough to allow media consumption wherever we may be. Together with the iPhone, these white earbuds (owned by three out of four American teenagers) have surely secured Apple the title of history’s most anti-social product designers. Public spaces are now populated by zombies staring at their phones or locked away in a private world of audio.

This inoffensive, tasteful simplicity is a deception. Minimalist devices may be user-friendly, but their lithium batteries and short lifespans are not friendly to the planet. There is nothing simple about Apple’s supply chain, which consists of around 1.5 million workers, most of them employed by contract manufacturers in China. Producing several hundred million iPhones annually breaks human beings. Worker turnover is so high that some factories effectively have to replace their entire labour force several times each year.

The cult of simplicity has spread beyond the realm of gadgets. As the writer Stephen Marche observed a few years back, contemporary novelists like Sally Rooney have abandoned the pursuit of a unique voice in favour of concise, vacuum-packed prose that might have been written by anyone. Popular non-fiction writers like Malcolm Gladwell and Yuval Noah Harari are similarly abstemious in their style, not to mention their habit of streamlining subject matter into big simple ideas.

These trends no doubt point back to the problem of information overload, but they also suggest that minimalism is becoming a cultural sensibility. Increasingly, successful literature resembles the compact efficiency of the book covers and typefaces in which it is packaged.

You may be asking, can the answer really be user-unfriendly design? That would be missing the point. “User-friendly” is only a virtue in products that reduce us to the status of users. Of course tools will always be necessary, and we might as well have good ones, but the goal today is to prevent our tools from taking over our lives.

Fashion and furniture designers, architects and art directors do not need to limit themselves to the functional requirements of user experience. It’s time for these aesthetic practitioners to break the spell of minimalism, which has made slabs of plastic and glass feel like the natural centre of our existence. Embrace the organic, the baroque, the maximalist. Embrace surrealism if you have to. Embrace difficulty and texture. Anything that does not come easily, or make itself instantly understood.

This revolt has already been brewing for some time, as Paris fashion week recently demonstrated. Among the works on show were Reebok and Botter’s curvaceous trainers inspired by seashells, and a sublime runway designed by Joana Vasconcelos, where textiles in the form of bulbous tentacles drooped from the ceiling. Yes, these strange shapes are more visual traffic passing across our screens. But perhaps a world designed in this spirit would remind us there is more to life than what our minimalist machines can offer.

Dilemmas of Displaying the Dead

This essay was published by Unherd in June 2023.

The last weeks of Charles Byrne’s life were nightmarish. Known as the Irish Giant, the seven-foot seven-inch man from Ulster had made his way in 1782 to London, where he earned money by exhibiting himself as a freak. By the end of that year tragedy was overtaking him. He was addicted to alcohol and suffered from the painful effects of a pituitary tumour in his brain, the cause of his gigantism. The accrued savings of his 22 years of life — around £700 — had been stolen in a Haymarket pub.

Even in this condition, Byrne was allowed no dignity. The city’s anatomy schools were eager to dissect his body as a scientific prize. Among these circling vultures, none was more determined than the aptly named John Hunter, eminent surgeon, anatomist, and collector of organic specimens both animal and human.

A horrified Byrne had already rejected Hunter’s offer to buy his corpse and, in a final, desperate bid to escape the surgeon’s saws, asked his friends to encase his body in lead and sink it in the English Channel after he died. But Hunter managed to pay for the cadaver to be secretly removed from its coffin and transported to his home in Earl’s Court. There he boiled it down to its bones and reassembled it as a skeleton. “I lately got a tall man,” he hinted to a friend some years after.

The surgeon’s vast collection of pickled creatures and body parts would later become the nucleus of London’s Hunterian Museum. But last month, when the Hunterian reopened after a lengthy closure, the Irish Giant had been tactfully removed from display. After almost 250 years, John Hunter’s flouting of a dying man’s wishes is catching up with him.

There are, of course, many museums that display the remnants of people wrenched from their graves — or of those never allowed to lie down in them. Stories such as Byrne’s raise uncomfortable questions about this practice. When, if ever, do human remains cease to be human? Does the sanctity of death end at the borders of our own culture and era?

These issues have arisen before. Thirty years ago, the South African government demanded the return of Sara Baartman, a Khoisan woman who in the early-19th century was paraded around Europe, only to be dissected after her death and displayed in a Paris museum until the Seventies. But the morality of displaying human remains has become more broadly contentious in recent years.

In 2020, Oxford’s Pitt Rivers museum removed all of its human exhibits, including shrunken heads from Amazonia’s Shuar tribe, claiming that “visitors often understood the Museum’s displays of human remains as a testament to other cultures being ‘savage,’ ‘primitive’ or ‘gruesome,’” which “reinforced racist stereotypes”. Numerous British and American museums have changed their method of displaying Egyptian mummies, an enormous crowd-pleaser, using terms such as “mummified person” in an effort to humanise the objects.

It is striking, then, how proudly the Hunterian Museum now reveals its gruesome contents to the public. It seems Charles Byrne was omitted because, like Sara Baartman, he is a high-profile case, subject to ongoing controversy after the British Medical Journal covered it in 2011. But the museum is still packed with human remains, presented no differently from the countless animal specimens floating eerily in their glass jars. There is row upon row of skulls gathered from numerous continents, pickled brains, warped spines, infant skeletons, cabinets of teeth, all manner of internal organs, and foetuses ranging from nine weeks to full term. It is a truly ghoulish spectacle.

Hunter claimed to have “dissected some thousands” of human corpses. A small number did consent; the Georgian upper classes were warming to the idea of donating their bodies for scientific enquiry. An Archbishop of Canterbury, several military leaders and a serving prime minister (the Marquess of Rockingham) were among those who volunteered for Hunter’s knife.

But the vast majority who ended up in 18th-century anatomy theatres had no say in the matter. Some were wrestled away from their families beneath Tyburn Tree, the gallows in Hyde Park where dozens of criminals were hanged every year. Others were acquired through the bribing of undertakers. Most commonly though, they were stolen from their graves by gangs of professional body snatchers. Hunter himself almost certainly robbed graves in his youth, when he spent 12 years learning the ropes at his brother’s anatomy school.

The grim provenance of Hunter’s collection is addressed only in a brief wall text at the museum. Acknowledging the specimens were gathered “before modern standards of consent”, it states: “We recognise the debt owed to those people… who in life and death have helped to advance medical knowledge.” Why, then, has the display of Egyptian mummies come to be regarded as a sensitive problem, but less so the display of an unborn child probably removed from the womb of a stolen corpse?

One reason is simply that the humanity of the dead only becomes an issue when someone makes it an issue. The controversy over mummies, for instance, reflects a particular convergence of political beliefs: some modern Egyptians, not to mention the modern Egyptian state, are now identifying as descendants of the ancient civilisation on the Nile. At the same time, Western curators have become desperate to distance themselves from the colonial period during which these objects were acquired. By contrast, there are few people in Britain who feel so strongly about scores of impoverished Londoners pulled from their shallow graves in the dead of night.

But there is another important difference. The claim that Hunter’s activities “have helped to advance medical knowledge” is a powerful one, linking his specimens with the achievements of modern medicine. It is also clearly true. Without a legal way to acquire bodies — and with religious beliefs making voluntary dissection unthinkable to many — only stolen corpses could produce the beginnings of the anatomical knowledge that we take for granted today. The museum subtly emphasises this by charting the development of surgery from the early-modern period to our own time: rather dull after the horror show of Hunter’s collection, but that’s the point I suppose.

Charles Byrne’s skeleton might be too controversial to display, but the museum has insisted on keeping it due to its medical value. It helped an American neurosurgeon to identify pituitary gigantism in 1909, and a century later, allowed scientists to find a genetic component in the growth disorder.

What all of this points to is the special status of medical science in Western countries today. Museums and other cultural institutions are increasingly critical of the heritage they embody because, ultimately, they no longer believe it has served a positive purpose that could mitigate the brutality of the past. This goes far beyond the problem of human remains; as Guardian critic Jonathan Jones notes about Tate Britain’s recent guilt-laden rehang: “Maybe it doesn’t want to promote British art, for it seems to disapprove of much of it.” Yet there are not many people arguing that we should abandon the benefits of modern medicine since it, too, has a disturbing history. This is one area where progress is still understood as building on the past rather than overturning it: the only acceptable agenda for healthcare is more and better.

But Hunter’s collection also reveals a deep tension in the way we value medical science. If we consider it dehumanising to display body parts in jars, it is partly because we now struggle to recognise blood and tissue as human. Our technical mastery over biology has led to our alienation from it. Just as we expect our meat to arrive immaculately packaged in the supermarket, carrying no trace of the abattoir, so we banish birth, illness, and death from our everyday lives, consigning them to the clinical world of the hospital. We have never been more preoccupied with the condition of our bodies, yet we don’t like to see those bodies for what they really are.

The Uberisation of Everything

This article was written for my regular newsletter, The Pathos of Things, in May 2023. Read more and subscribe here.

Though originally designed in the 1960s, the Boeing 737 is an unsung hero of the early 21st century. It is a flimsy-looking airplane, whose cramped conditions and unnerving tendency to rattle on take-off have driven many a short-haul passenger to the brink of panic. Yet it has delivered millions of Europeans to their weekend breaks at a price that makes the humiliations of Ryanair seem bearable.

The 737 is a symbol of what Janan Ganesh dubbed “the middle-class world citizen,” an ordinary consumer who enjoys a degree of mobility and convenience once reserved for the wealthy. It was the forerunner of many other designs distinctive to our age: the Uber app, the bright turquoise Deliveroo uniform, the Airbnb review system, the Amazon distribution warehouse. This ecosystem of mostly platform-based services has made luxury seem cheap.

But as Ganesh noted in 2019, it has also long felt unsustainable. Some of these businesses have rarely, if ever, turned a profit, and appeared to survive only thanks to ultra-low interest rates. Many, likewise, seemed to be inviting a political backlash, exploiting underpaid gig workers and regulatory loopholes to undercut older business models.

Four years later, after a pandemic and several economic crises, the convenience economy appears more resilient than expected. Ryanair has just placed an order for three hundred new Boeing 737 Max-10s, the largest in its history, with which it hopes to eventually transport 300 million passengers per year. And you might soon be booking your flights with Uber, which is steadily expanding across the travel market in its bid to become a “super app.” Even Deliveroo has found enough savings to report revenue growth, despite a drop in takeaway orders.

But none of this means we’re heading for the frictionless lifestyle that was the dream of the 2010s. The terrible genius of the convenience economy is that it can continue expanding even as it becomes less and less convenient, simply because we can’t afford any alternative. And increasingly, we may find the reason we can’t afford an alternative is that the forces of Uberisation are eroding our incomes too.  

An Amazon fulfilment centre: one of the distinctive designs of our age. (Source: Bloomberg)

Budget services have been dragged down by labour shortages, inflation, and a loss of investor patience. Short-haul air tickets are around twenty to thirty percent more expensive than last year, and with new environmental regulations looming in Europe, it’s safe to say they will never be as cheap as they were. Uber’s fares keep rising, even as wait times have grown. Netflix is cracking down on account sharing and has introduced ads. But these companies remain the cheapest options available, and they have reshaped our expectations enough that giving them up seems unthinkable. So they remain viable and, in some cases, very profitable.

Affordable luxuries have a deeper cost, though. The digital platform companies work on the principle that if you make services cheap enough to capture a big market share, workers and producers will need you as a middleman to reach consumers, even if this means accepting low pay and minimal job security. It’s a race to the bottom, forcing competitors to adopt the same tactics to keep up.

Consider the humbling of the food delivery challenger Just Eat. Two years ago its CEO slammed gig employers for creating “precarious working conditions across Europe, the worst seen in a hundred years.” Unable to compete with Deliveroo and Uber Eats, it has now been forced to adopt the same model.

New research by the University of Bristol claims that more than half of UK gig workers earn below the minimum wage, with even larger numbers reporting anxiety and insecurity. Little wonder the convenience economy struggles to find enough of them to meet demand.

Even if people dislike being served by an underclass of exploited workers, I’m not sure they have fully appreciated the risks of allowing such conditions to be normalised. Employers everywhere are looking for efficiency savings and hedges against economic uncertainty. Taxi rides and parcel deliveries are not the only jobs that can be broken up into discrete tasks and advertised to workers only when necessary. These trends can spread into other industries too.

As the Times reported, a recent sharp drop in full-time positions is being matched by a rise in temporary contracts, effectively white-collar gig employment. In the EU, almost forty percent of workers under the age of thirty are in temporary positions already, and employment in general has become more flexible with fewer benefits. Unions in the United States have warned about the Uberisation of nursing, with shifts at some hospitals being booked through apps. Freelance professionals are finding themselves part of an increasingly competitive and global labour pool that keeps their fees down.

But all of this, perversely, will only benefit the likes of Uber and Amazon. As our time and disposable incomes are squeezed, their low-cost luxuries will become more appealing than ever, even if these are not actually as low-cost or luxurious as they used to be.

The roots of this situation go back beyond even the arrival of budget airlines. The decisive moment was the globalisation shock of the 1990s, which accelerated the offshoring of jobs to cut labour costs, and delivered in return a flood of cheap products from overseas. Access to an ever-expanding range of goods and services became the yardstick by which we measure our quality of life. For that we have traded, consciously or not, the principle that employment should entail a reasonable degree of security.

The growth of the convenience economy, with its budget trips to the continent aboard a Boeing 737, made this bargain seem like a good one. But new opportunities are quickly taken for granted, whereas stagnation doesn’t lose its bitterness. The cheapest option from Amazon; cattle-class on Ryanair; a take-away because you are too exhausted to cook: these things will feel rather different if they become the limit of what we can hope for.

Christopher Wren: Godfather of the Technocrats

This article was originally published by Unherd in February 2023.

“You have, ladies and gentlemen, to give this much to the Luftwaffe: when it knocked down our buildings, it didn’t replace them with anything more offensive than rubble. We did that.” That was the famous barb which the then-Prince Charles aimed at modern architecture in 1987. They were not idle words: thanks to this speech, the architect Richard Rogers lost his opportunity to redesign an area next to St Paul’s cathedral, the 17th-century Baroque masterpiece by Christopher Wren.

It was not the last time the curmudgeonly prince would intervene against Rogers, nor was it the last time Rogers would have to make room for St Paul’s. His skyscraper on Leadenhall Street, known as The Cheesegrater, owes its slanting profile to rules that protect certain views of Wren’s cathedral.

As we mark the 300th anniversary of Wren’s death, it’s worth noting a certain irony in all of this. St Paul’s may well be the nation’s favourite building, and Wren our greatest architect, but he is more like the grandfather of Richard Rogers than his antithesis.

This will strike some as blasphemous: Rogers’ buildings, which include the Centre Pompidou in Paris and London’s Millennium Dome, are technocratic in the most profound sense of the word. With their machine-like forms and conspicuous feats of engineering, they elevate efficiency and expertise into an idol to be worshipped, a religion in its own right. Impressive as these structures are, one can understand why they made Charles doubt whether “capitalism can have a human face, instead of that of a robot or a word processor”.

But technocratic monuments emerge from technocratic societies, where a fixation with how things work drowns out the question of what they are actually for. The natural and human worlds are treated as processes to be managed, with politics reduced to measurable outputs: higher growth, fewer emissions, a more equal distribution of benefits. Technology is held in awe, but more for its functional qualities than any greater purpose it serves.

This could be a description of our own society, and Rogers’ buildings an honest reflection of it. But these tendencies did not appear from nowhere; they have deep roots in the evolution of modern forms of knowledge and power. And those roots lead us back to Christopher Wren.

Picture the scene: it’s 1656, in a house on Oxford High Street. Three members of what will become the Royal Society, an engine of the scientific revolution, are debating whether poison can be administered directly into the bloodstream. Wren, a 23-year-old Fellow of All Souls College, thinks it can. But how to be sure? One of his accomplices, the pioneering chemist Robert Boyle, duly provides a dog, and the experiment begins. Using various improvised instruments, including a syringe fashioned from a goose quill and pig’s bladder, Wren makes an incision in the panicked dog’s leg, successfully injecting opium into a vein.

This episode was later recognised as the first recorded use of an intravenous anaesthetic. (The drugged dog, Boyle reports, recovered after being whipped around the garden for a while.)

It was experimental science like this that occupied Wren for the first two decades of his adult life, not architecture. He was mainly an astronomer, though he pursued all manner of experiments. He studied Saturn’s rings and the lenses of a horse’s eye; he investigated atmospheric pressure and dissected the human brain; he removed the spleen of a puppy and presented the king with a model of the moon.

This was the birth of modern science, but it was also the critical moment when the question of how began to displace the what and the why. These early scientists were driven by a growing realisation that our intuitions did not grasp the true structure of reality. Their response, at least in England, was to turn away from the philosophical mission of understanding what existence is and what its purpose should be, focusing more narrowly on how natural processes work. Already this shift was reflected in a new emphasis on technology: instruments to measure what the senses could not, and gadgets to show mastery of scientific principles. The patterns of the cosmos were mapped in the objective language of mathematics.

This outlook would eventually lay the ground for the technocratic management of society. And Wren, when he turned to architecture, was very much the technocratic type. Employed as Surveyor of the King’s Works, he was a talented but remote civil servant. After the Great Fire of 1666, his job was to Build Back Better, something he managed very well thanks to his relish for bureaucracy and admin. He even wanted to redesign the City of London in the form of a grid. Wren’s classical style did not reflect a deep connection with artistic tradition; it was, as the historian John Summerson pointed out, a matter of good grammar.

In fact, though Wren designed 56 London churches (of which only 23 remain), his readings of scripture suggest an almost Richard Dawkins-like literalism in regard to religion. He liked to demonstrate how astronomy could explain biblical miracles, and treated Samson’s ability to pull down temples as a maths problem. One of his admiring biographers sums up his mindset perfectly: “How a thing worked — whether that thing was a spleen or a comet, a palace or a government department — was as important to him as the end product.”

What really made Wren tick was impressive engineering, mathematical rigour, and finding the most economical solution to a practical puzzle. This is already evident in his first building, the Sheldonian Theatre in Oxford, where he devised an ingenious system of trusses to cover a lengthy span. His London churches show his love of geometry — “always the True test” of beauty for Wren — as seen in their centralised plans, barrel-vaulted ceilings, and their pursuit of symmetry wherever possible.

It’s not entirely a coincidence, then, that both Christopher Wren and Richard Rogers built domes near the Thames. When Wren proposed this idea for St Paul’s, it was, like Rogers’ fibreglass egg, something alien to London, as well as a state-of-the-art engineering display.

It’s true that Wren’s churches don’t feel overly austere or rationalistic, and this is a testament to his remarkable intellect. Their precise balance of spatial and decorative elements can, at their best, create a sense of serene harmony. Nonetheless, later generations often found an artificial quality in his buildings; one 18th-century poet thought him “extremely odd, / To build a playhouse for the church of God”. It’s difficult to judge his work today because it is now contrasted with a modern architecture that fetishises pure geometry, technology, and the expression of structural principles. Yet these are ideas which Wren himself anticipated.

More than that, Wren and his fellow scientists unleashed a quest for knowledge which eventually created a world so complex it needs technocrats to manage it, as well as architects like Rogers to glorify it. It is a great irony that the designer of Britain’s most beloved cathedral also laid the foundations for a society so starved of spiritual meaning.