How Habitat Made Britain’s Middle Class

This essay was originally published by Unherd in May 2024.

An elegantly dressed woman is polishing her nails, looking into the camera with a kind of feline arrogance. Before her on the dressing table lies a beautiful pair of hairbrushes, while in the background a young man is making the bed, straightening the duvet with a dramatic flick. This photograph appeared in a 1973 catalogue by Habitat, the home furnishing shop founded by Terence Conran. It gives us a sense of the brand’s appeal during its heyday. The room is stylish but comfortable, the scene full of sexual energy. This is a modern couple, the man performing a domestic task while the woman prepares for work. The signature item is the duvet, a concept Habitat introduced to Britain, which stood for both convenience and cosmopolitan style (Conran discovered it in Sweden, and called it a “continental quilt”).

As we mark Habitat’s sixtieth birthday, all of this feels strangely current. Sexual liberation, women’s empowerment and the fashionable status of European culture are still with us. The duvet’s victory is complete: few of us sleep under blankets or eiderdowns. But most familiar is how the Habitat catalogue wove these products and themes into a picture of a desirable life. It turned the home into a stage, a setting for compelling and attractive characters. This is a species of fantasy we now call lifestyle marketing, and we are saturated with it. Today’s brands offer us prefabricated identities, linking together ideals, interests and aesthetic preferences to suggest the kind of person we could be. It was Habitat that taught Britain to think and dream in this way.

The first shop opened on London’s Fulham Road in 1964, a good moment to be reinventing the look and feel of domestic life. New materials and production methods were redefining furniture — that moulded plastic chair with metal legs we sat on at school, for instance, was first designed in 1963. After decades of depression, rationing and austerity, the British were enjoying the fruits of the post-war economic boom, discovering new and enlarged consumer appetites. The boundaries separating art from popular culture were becoming blurred, and Britain’s longstanding suspicion of modern design as lacking in warmth and comfort was giving way. Habitat combined all of these trends to create something new. It took objects with an elevated sense of style and brought them down to the level of consumerism, with aggressive marketing, a steady flow of new products and prices that freshly graduated professionals could afford.

But Habitat was not just selling brightly coloured bistro chairs and enamel coffee pots, paper lampshades and Afghan rugs. It was selling an attitude, a personality, a complete set of quirks and prejudices. Like the precocious young Baby Boomers he catered for, Conran scorned the old-fashioned, the small-minded and suburban. And he offered a seductive alternative: a life of tasteful hedonism, inspired by a more cultured world across the Channel. Granted, you would never fully realise that vision, but you could at least buy a small piece of it.

No one has better understood that strand of middle Britain which thinks of itself as possessing a creative streak and an open mind. The Habitat recipe, in one form or another, still caters to it. Modern but classic, stylish but unpretentious, with a dash of the foreign: this basic approach underpins the popularity of brands from Zara Home to Muji. It has proved equally successful in Conran’s other major line of business, restaurants: see Côte, Gail’s Bakery or Carluccio’s (co-founded by Conran’s sister Priscilla). To one degree or another, these brands all try to balance a modicum of refinement with the reassurance that customers won’t feel humiliated when they examine the price tag.

Yet there was always something contradictory about this promise of good taste for the masses. In Britain, influential movements in design have been inspired by a disdain for vulgar, mass-produced goods since the Industrial Revolution. Conran liked to cite the great craftsman and designer William Morris — “have nothing in your houses that you do not know to be useful or believe to be beautiful” — but Morris famously detested factory-made products. From the Thirties, proponents of modern design despaired at the twee aesthetics and parochial norms of petit-bourgeois life in the suburbs. The fashionable culture of the Swinging Sixties, Conran’s own milieu, likewise defined itself against the conventional majority. This was the era of John Lennon and the Rolling Stones after all.

In his outlook and his commercial ambitions, Conran tried to ignore such tensions: good design should be available to everyone. But they have inevitably come back to the surface. With the rise of Asian manufacturing, passable copies of classy or arty products are now as widespread as anything else; think mass-produced ceramics that imitate artisanal imperfection. Similarly, successful Habitat-like brands have acquired corporate managers who force them to expand. Even an apparently exclusive institution such as Soho House, the private members’ club for wealthy creatives, is now a globe-spanning lifestyle brand with locations in dozens of cities and its own line in cosmetics, furniture and workspaces. These trends have made Conran’s vision of life appear increasingly hollow, because even in the absence of snobbery, it relied on a sense of originality, individuality and artistic inspiration. Such qualities are difficult to find when a product suddenly graces every living room and Pinterest board.

These same contradictions doomed Habitat itself. In the late-Eighties, Conran’s appetite got the better of him, and a botched attempt to merge with two other firms led to his ejection from the company. After 2000 the brand rarely made a profit, as it was passed between a series of retail giants, including Ikea, Argos and Sainsbury’s. Like so much that was fresh and subversive in the Sixties, Habitat was absorbed by the mainstream, its lively identity reduced to a market segment and subject to the demands of accounting. Its famous shops were trimmed down to a handful of showrooms, and last year those closed as well. Today it is little more than the husk of a brand — a slightly upmarket, design-conscious Ikea — condemned to the purgatory of online retail, where every competitor has its endless thumbnail images of seemingly identical products.

A more serious problem is that, while we now have an overabundance of style, the “life” side of the equation has become increasingly sparse. The Boomers buying continental quilts were a generation on the up. They could plausibly imagine themselves moving towards the spacious and leisurely domestic life that Conran dangled before them. Most of those young professionals who entered work after 2008, by contrast, know they will never stack their French crockery in a French holiday home; they would be happy with a modestly sized apartment. So aspiration does not really capture the appeal of lifestyle consumerism for these embittered millennials. It is more a question of consolation, or escapism, or a desperate attempt to distinguish themselves from the mass market where they know they belong.

Then again, it increasingly feels like the whole notion of lifestyle was a recipe for dissatisfaction to begin with. Habitat emerged at a moment when traditional roles and social expectations were melting away; in their place, it proposed the idea of life as a work of art, an exercise in self-fashioning, with commodities and experiences guiding consumers towards a particular model of themselves. Today, with all the niches and subcultures spawned by network technology, there is no shortage of such identities on offer. If you like outdoor activities, you may find a brand community that combines this with certain political views and a style of fashion. If you like high-end cars, you might dream of occupying a branded condo in Miami or Dubai.

But these lives assembled from images remain just that: a collection of images, a fiction that can never fully be inhabited. It seems the best we can do is represent them in the same way they were presented to us, as a series of vignettes on Instagram, where the world takes on an idealised quality that is eerily reminiscent of those Habitat catalogues from decades ago. One gets the impression that we are not trying to persuade others of their reality so much as ourselves.

The Sublime Hubris of Tropical Modernism

This review was originally published by Engelsberg Ideas in April 2024.

In December 1958 an All-African People’s Conference was held in Accra, capital of the newly independent Ghana. It brought together delegates from 28 African countries, many of them still European colonies. Their purpose, according to Ghanaian prime minister Kwame Nkrumah, was ‘planning for a final assault upon Imperialism and Colonialism’, so that African peoples could be free and united in the ‘economic and social reconstruction’ of their continent. Above the entrance of the community centre where the conference took place, there was a mural which seemed to echo Nkrumah’s sentiment. Painted by the artist Kofi Antubam, it showed four standing figures along with the slogan: ‘It is good we live together as friends and one people.’

The building was a legacy of Ghana’s own recent colonial history. During the 1940s the UK government’s Colonial Development and Welfare fund had decided to build a number of community centres in what was then the Gold Coast. Most of the funding would come from British businesses active in the region, and the spaces would provide a setting for recreation, education and local administration. The Accra Community Centre, neatly arranged around two rectangular courtyards with colonnaded walkways, was designed by the British Modernist architects Jane Drew and Maxwell Fry. Antubam’s mural calling for amity reads somewhat differently if we consider the circumstances in which it was commissioned. The United Africa Company, the main sponsor of the project, was trying to repair its public relations after its own headquarters had been torched in a protest against price fixing.

The Accra Community Centre is emblematic of the ambiguous role played by Modernist architecture in the immediate post-colonial era. Like so many ideas embraced by the elites of newly independent states, Modernism was a western, largely European doctrine, repurposed as a means of asserting freedom from European rule. ‘Tropical Modernism’, a compelling exhibition at London’s V&A, tries to document this paradoxical moment in architectural history, through an abundance of photographs, drawings, letters, models and other artefacts.

Drew and Fry are the exhibition’s main protagonists, an energetic pair of architects who struggled to implement their vision in Britain but had more success in warmer climes. In addition to the community centre in Accra, they designed numerous buildings in West Africa, most of them educational institutions in Ghana and Nigeria. In the course of this ‘African experiment’, as Architectural Review dubbed it in 1953, they developed a distinctive brand of Modernism, of which the best example is probably Ibadan University in Nigeria. It consisted of horizontal, geometric volumes, often raised on stilts, with piers running rhythmically along their facades and, most characteristically, perforated screens to guard against the sun while allowing for ventilation.

On the basis of this work, Drew and Fry were invited to help plan Chandigarh, the new capital of the state of Punjab in India, a country which had just secured its own independence from Britain. Here they worked alongside Le Corbusier, the leading Modernist architect, on what was undoubtedly one of the most influential urban projects of the 20th century. Drew and Fry also helped to establish Tropical Architecture courses at London’s Architectural Association and MIT in Massachusetts, where many architects from post-colonial nations would receive training.

Not that those students passively accepted what they were taught. The other major theme of the exhibition concerns the ways that Indian and Ghanaian designers adopted, adapted and challenged the Modernist paradigm, and the complex political atmosphere surrounding these responses. Both Nkrumah and Jawaharlal Nehru, India’s first prime minister, preferred bold and bombastic forms of architecture to announce their regimes’ modernising aspirations. This Le Corbusier duly provided, with his monumental capitol buildings at Chandigarh, while Nkrumah summoned Victor Adegbite back from Harvard to design Accra’s Black Star Square. In India, however, figures such as Achyut Kanvinde and Raj Rewal would in the coming decades forge their own modern styles, borrowing skilfully from that country’s diverse architectural traditions. At Ghana’s own design school, KNUST, it was the African American architect J Max Bond who encouraged a similar approach to national heritage, telling students to ‘assume a broader place in society, as consolidators, innovators, propagandists, activists, as well as designers’.

As is often the case, the most interesting critique came not from an architect, but an eccentric. In Chandigarh, the highway inspector Nek Chand spent years gathering scraps of industrial and construction material, which he secretly recycled into a vast sculpture garden in the woods. His playful figures of ordinary people and animals stand as a kind of riposte to the city’s inhuman scale.

One question raised by all of this, implicitly but persistently, is how we should view the notion of Modernism as a so-called International Style. In the work of Drew, Fry and Le Corbusier it lived up to that label, though not necessarily in a good way. Certainly, these designers tried diligently to adapt their buildings to new climatic conditions and to incorporate visual motifs from local cultures. In light of these efforts, it is all the more striking that the results still resemble placeless technocratic gestures, albeit sometimes rather beautiful and ingenious ones. We could also speak of an International Style with respect to the ways that these ideas and methods spread: through evangelism, émigrés and centres of education. It’s important to emphasise, though the V&A show doesn’t, that these forms of transmission were typical of Modernism everywhere.

By the 1930s, Le Corbusier was corresponding or collaborating with architects as far afield as South Africa and Brazil (and the latter was surely the original Tropical Modernism). Likewise, a handful of European exiles, often serving as professors, played a wildly disproportionate role in taking the International Style everywhere from Britain and the US to Kenya and Israel.

If Modernism was international, its Tropical phase shows that it was not, as many of its adherents believed, a universal approach to architecture, rooted in scientific rationality. Watching footage at the exhibition of Indian women transporting wet concrete on their heads for Chandigarh’s vast pyramids of progress, one is evidently seeing ideas whose visionary appeal has far outstripped the actual conditions in the places where they were applied. As such, Modernism was at least a fitting expression of the ill-judged policies of rapid, state-led economic development that were applied across much of the post-colonial world. Their results differed, but Ghana’s fate was especially tragic. A system where three quarters of wage earners worked for the state was painfully vulnerable to a collapse in the price of its main export, cocoa, which duly came in the 1960s. Nkrumah’s regime fell to a coup in 1966, along with his ambitions of pan-African leadership and the country’s Modernist experiment. Those buildings had signified ambition and idealism, but also hubris.

Take a Seat

This essay appeared in my regular newsletter, The Pathos of Things, in March 2024.

This week I was lucky enough to attend a symposium at the Royal College of Art, on a subject that is close to my heart – and even closer to my rear: the chair. This is one of those objects that is both extremely ordinary (are you sitting on one now?) and freighted with all kinds of social significance. Natalie Dubois of Utrecht’s Centraal Museum, a speaker at the symposium, pointed to the longstanding link between chairs and power, encoded in language. Can you secure a seat at the table? Or will you be dethroned? Who will win the most parliamentary seats? Better ask the chairman. On the other hand, these can be very intimate objects. Few images represent absence as viscerally as an empty chair.

Designers, like monarchs and emperors, have long shown a peculiar interest in chairs. Normally prestige flows towards things that are very large (buildings and monuments) or very expensive (precious materials and intricate workmanship). But to judge by the results, neither of these criteria can explain why so many prominent architects have tried to stamp their genius on the chair, from Charles Rennie Mackintosh and Gerrit Rietveld to Mies van der Rohe and the Smithsons. Perhaps the reason is that, as the RCA’s Alon Meron suggested, a chair is not just an object but a space – an engineered structure and a sculptural negative of the human body. As such, the chair lends itself to the concentrated expression of architectural style.

To show the ongoing association of chairs with power, Dubois recalled the infamous snub of Ursula von der Leyen, president of the European Commission, by Turkish president Recep Tayyip Erdoğan in 2021. At a diplomatic event in Ankara, a chair was provided for Erdoğan and for Charles Michel, president of the European Council, but not for von der Leyen, who was left standing awkwardly at the side. She reluctantly sat down on a couch opposite the Turkish foreign minister, an arrangement seemingly intended to humiliate. In truth though, there are few situations today when chairs possess such gravity. The old codes dictating who can sit and who must stand belong to a traditional understanding of authority and deference, one that offends the modern mind. What remains is largely a matter of body language. There are moments when sitting down uninvited feels inappropriately relaxed, like swearing or lighting a cigarette.

If sitting no longer conveys the authority it once did, it might also be because most of us do it all day. Modernity has been, among other things, a revolution in posture, as a growing portion of the population completes the journey from the fields via the factory into a chair. The symbol for bureaucratic labour has always been the desk – that is, the bureau – but these objects are symbiotically connected (you generally don’t stand at a desk). Today the chair is part of a functional apparatus that includes the table, the laptop, the human body and the coffee cup. That may sound facetious, but in the early twentieth century, when people were still needed for tasks like copying, filing and computing, significant attention was paid to the most efficient way of seating a worker. From the perspective of “scientific management,” a typist and her chair were part of a single productive mechanism. The same is true for me, except that I’m free to sit in an uncomfortable chair if I wish.

The point is that, in a sedentary world, a chair is as likely to represent confinement, boredom and inertia as power or status. We go to great lengths to ensure we escape our chairs at least occasionally, lest we develop back problems or depression. An “active lifestyle,” once an obligation for the vast majority, is the real luxury now. Then again, when I’ve finished this sentence I will probably just move to the sofa.

Mourning the Biosphere

This essay appeared in my regular newsletter, The Pathos of Things, in December 2023.

Today more than ever, there is a strong dose of fantasy in the British dream of home ownership. The ideal house is no longer just a comfortable dwelling and a way of accruing capital; it is a sandbox to be customised so that it reflects a personal vision of domesticity. This is nicely illustrated by the winner of the Royal Institute of British Architects’ house of the year award, announced last week. What began as a detached house in a Tottenham alleyway has been transformed into an airy interior courtyard, inspired by a traditional Moroccan riad.

The most striking feature of this enchanted space is what RIBA calls its “biophilic design,” meaning that it is teeming with plants. The front of the building is buried behind a lush screen of bamboo leaves, while the living room is a “domestic greenhouse” looking onto a no less verdant garden.

The biophilic environment is part of the zeitgeist today. It is not new of course; Le Corbusier introduced the roof garden to the repertoire of Modernism back in the 1920s, while the glass-walled conservatory with its pot plants, radio and reading chair has long been a domestic institution in Britain. But in recent years it has become fashionable to douse spaces in so much greenery that they begin to resemble garden centres.

Since around 2017, a whole genre of magazine articles has emerged to puzzle over the millennial “obsession” with houseplants. They cite testimony from fanatical young plant-lovers, and report on Instagram subcultures where every apartment has become a jungle of Monstera deliciosa and Dracaena trifasciata. Meanwhile no digital rendering of a new housing estate or office building is complete without at least a token smattering of trees. At Google’s new HQ in London, 40,000 tonnes of soil have been carted up to the roof to lay a garden along its entire 300-metre span.

Then there are the urban greening schemes. Manhattan got its High Line park a decade ago, and London tried to go one better with proposals for a dramatic Garden Bridge across the Thames (predictably scuppered by runaway costs and planning issues). In Paris, mayor Anne Hidalgo plans to turn the Champs-Élysées into “an extraordinary garden,” while Saudi Arabia’s imagined cities of the future show forests growing in the desert.

On its face, this looks like a straightforwardly positive trend, and not an especially mysterious one. The sublime beauty of the plant kingdom – the beauty of stem and leaf, fruit and flower, root and tendril – is one of the few aesthetic facts that does not need to be relativised. It sings out from artistic and ornamental traditions around the world. A garden is a profoundly human space because nothing so poetically captures the astonishing fact which alone underwrites our existence: that there are things which live and blossom. As for modern life, the company of such things is probably the simplest way to ease the burden of alienation from nature. There is plenty of evidence that plants make us happier and healthier in various ways, but then few people need to be told this before appreciating their presence.   

And yet, there is also an undercurrent of sadness here. Like many things which have ended up under the dubious headings of “wellness” and “self-care,” plants are not just a tonic for 21st century life; they are a symbol of its painful shortcomings and a distraction from its consequences. Accounts of the plant craze among twenty- and thirty-somethings always come back to the same set of explanations, often given by the biophiles themselves. Plants soften the impersonal feeling of rented apartments; they provide an outlet for nurturing instincts at a time when family formation feels impossible; tending to them is a way of escaping the frantic, ephemeral experience of digitised life; and above all, for people who are always moving around and working long hours, a pet that lives in a pot doesn’t require a significant commitment, and can ultimately be abandoned.  

In other words, the houseplant belongs to that increasingly common pattern of existence where good things take the form, not of simple pleasures, but of consolation for deeper deficiencies. My own experience is a tragicomic variation of the same theme. I would probably trade one of my kidneys for a garden, but in lieu of that, I’ve been amassing a small collection of botanical encyclopaedias. I lovingly study the illustrations, try to remember the names of the species (Latin and colloquial), and sometimes look up the histories of their discovery and cultivation. Yet I’ve moved house so many times in recent years that, at some point, I lost the habit of actually owning plants.

Something similar can be seen at the level of buildings and cities, where the appeal of urban oases and hydroponic facades must stem in part from collective feelings of guilt and regret. Even as our economic activities turn the planet into a vast toxic dump, we long to make our cities into shrines for worshipping the biosphere, or perhaps mourning would be a better term. Of course urban greenery can represent something less sincere, namely, a convenient fiction adopted by a highly destructive building industry. Still, it is a fiction we eagerly accept.

Plants cleanse our air and minds, but no less important, they allow us to stage the fantasy of a reconciliation with nature. That is a role they are increasingly called to perform in public spaces as well as in award-winning London houses. 

Aby Warburg and the Memory of Images

This essay was originally published by Engelsberg Ideas in November 2023.

In June 2020, everyone was ‘taking the knee’. Protests had erupted in the United States and spread to dozens of countries around the world, after the killing of an African American man, George Floyd, by police in Minneapolis. The act of kneeling on one leg, en masse, quickly became one of the most resonant symbols of opposition to police brutality and racial discrimination. Crowds gathered to perform the gesture in London’s Hyde Park, outside the Palais de Justice in Paris, on the streets of Sydney and before the US consulate in Milan. In Washington, nine Democratic members of Congress donned West African scarves and kneeled before cameras in the Capitol Visitor Center. Canadian Prime Minister Justin Trudeau observed the ritual, as did members of the National Guard in Hollywood and the Metropolitan Police in Westminster. In Britain and beyond, ‘taking the knee’ became part of pre-match proceedings in professional football.

Just three years later, it is difficult to convey how dramatic these events were. Without any planning or coordination – at a time, in fact, when many countries had social distancing measures in place due to the Covid pandemic – a global protest movement had sprung into existence virtually overnight. It was clear that iconography, in the form of striking visual gestures and symbols broadcast through social media, had been part of the mechanism that made it happen.

Where did ‘taking the knee’ come from? The American football player Colin Kaepernick, who began using the pose as a protest during the national anthem in 2016, is credited with inventing it. Yet the gesture is freighted with older meanings. Open a book of Christian iconography and you will see kneeling men and women across the ages, symbolising devotion, humility and respect. Even in relation to the cause of racial equality, the pose has – in the ambiguous phrase used by numerous press articles – ‘a long history’. At the bottom of the Wikipedia page for ‘Taking the Knee’, for instance, there is a curious section noting two precursors. In February 1965 the Reverend Martin Luther King Jr led a group of civil rights protestors in praying on one knee outside the Dallas County courthouse in Selma, Alabama. Meanwhile, more than a century earlier, the image of a kneeling African man had been a popular symbol of the Evangelical movement for the abolition of slavery.

What should we make of these precedents? None of the publications that noted them suggested any actual link with Kaepernick or the later protests. Yet I could not dispel the feeling that, in the summer of 2020, the past was speaking through those crowds of kneeling people. We are today so immersed in media that a sense of imitation, of the re-enactment of some earlier episode, hangs over every public act. Even the violent scenes that accompanied those protests, the pulling down of statues and the torching of buildings, seemed to function as a kind of familiar iconography, symbols of revolution that had somehow sprung from our cinematic imagination into reality.

My attempts to make sense of all this brought me to a rather unexpected place. They brought me to the European Renaissance of the fifteenth and sixteenth centuries; or rather, to a vision of the Renaissance put forward by a singular thinker, the German art historian Aby Warburg. Though he died almost a century ago, Warburg thought about images in a way that seems highly prescient today. He recognised that the power of iconography lay in its potential for transmission across space and time, its ability to accrue new meanings in different contexts. He warned, moreover, that this power could be destructive as well as creative.

Warburg is probably best known today as a collector of books, and for good reason. Born in 1866, he was the eldest son of an illustrious Jewish banking family based in Hamburg and New York. As legend has it, the thirteen-year-old Warburg gave up his claim on the family business to his brother Max, asking only that he would never be short of money for books. By 1919 he had collected enough to establish the Library of Cultural Studies in Hamburg, a setting famous for its esoteric approach to classifying knowledge (one section was titled ‘Religion, Magic and Science’). Together with the philosopher Ernst Cassirer, whose work on symbolic forms drew heavily from this library, Warburg helped to make Hamburg one of the most innovative sites in the intellectual landscape of the Weimar Republic.

Warburg died in 1929, and four years later, after the Nazis came to power in Germany, two steamers evacuated his collection across the North Sea, carrying some 60,000 books, along with 15,000 photographs and thousands of slides. These riches ended up at what is today the Warburg Institute and Library on London’s Woburn Square. Above its entrance, you can see engraved in Greek capitals the idea that Warburg pursued throughout his career: MNHMOΣYNH, Mnemosyne, memory.

The most extensive exposition of Warburg’s ideas comes not from his writings but from a remarkable project called The Image Atlas of Memory (Der Bilderatlas Mnemosyne), on which he was still working at the time of his death. It consists of nearly a thousand images, arranged on a series of 63 dark blue panels, tracing the evolution of visual expression from the ancient Greco-Roman world to the Renaissance and then into 1920s photojournalism. The principle behind this work, which would influence the philosopher Walter Benjamin among others, was that visual symbols and motifs are a form of cultural memory. As they are transmitted and adapted through time, they reveal a dialogue in which the past infuses the present and the present reinterprets the past.

Warburg’s inspiration for these ideas came from Renaissance Florence, a city he considered ‘the birthplace of modern, confident, urban, mercantile civilisation’. The Florentine elite – a milieu of great banking families, artists and private scholars, for which Warburg felt a strong affinity – had developed a new synthesis of medieval Catholicism and ideals adapted from classical Greece and Rome. Warburg was especially interested in the role of iconography in creating that synthesis. He observed that Renaissance artists had not simply been inspired by ancient sources; they had borrowed a distinct catalogue of expressive gestures, ways of representing the human body that communicated movement, energy and confidence. These borrowed gestures were like visual quotations, ‘migrant images’ drawn from the past to serve the expressive needs of the present. Warburg called such images ‘pathos formulas’, and he came to see them as the vehicles of cultural memory.

One such formula was a woman in swirling garments, ‘the Nymph’ as Warburg dubbed it, a figure he documented with more than fifty images in the Memory Atlas. A vivid example appears in Domenico Ghirlandaio’s fresco, The Birth of John the Baptist, where, on the furthest right of the painting, we see a female attendant carrying a plate of fruit with an almost absurd dynamism – an ancient figure literally bursting into the late-fifteenth century. Another formula was the chaotic melee of warriors, a scene portrayed in ancient sources, such as the Arch of Constantine, and imitated by numerous Renaissance artists down to Raphael’s own Battle of the Milvian Bridge in the 1520s. Still another was the figure of the pagan goddess Fortuna, a female nude with a billowing sail, as seen in the family crest of the merchant Giovanni Rucellai, where it sat uneasily above a Christian coat of arms. Fortuna had come to symbolise an ethic of prudence, calculation, and trust in one’s own judgment; the Rucellai crest was echoed in a phrase that sometimes opened Medici business documents: Col nome di Dio e di Buonaventura, ‘In the name of God and of Good Fortune’.

The search for such patterns might seem out of place today, given the vast number of images created on a daily basis, but, as Warburg realised, mass culture is perfectly suited to the transmission of pathos formulas, since the most resonant images tend to be the most widely circulated and reproduced. In the Memory Atlas, photographs of antique coins and oil paintings sit alongside modern advertising imagery and magazine clippings. A quick glance at the images that populate our own media reveals countless expressive gestures that could be called pathos formulas: the ecstasy of footballers celebrating a goal, the stance of actors assembled on a film poster, the pose of models on the covers of men’s magazines, the wooden bearing of a politician shaking someone’s hand, or the conventions dictating how people present their bodies on Instagram.

Warburg’s notion of memory emphasises that, when we need to express or represent an idea, we draw from a stock of gestures and symbols whose meanings are kept alive in our culture by the presence of earlier representations. In this way, images can travel across centuries, being recalled repeatedly into new contexts and acquiring new layers of meaning. This is what we saw with the re-emergence of Christian body language as a way of galvanising protest. Only in light of its deeper history, its long passage through time, can we understand the various connotations of ‘taking the knee’ in 2020. When performed by figures of authority, for instance, it did not express defiance but contrition and repentance; an idea we could find in any number of Hollywood films, but which we could equally trace all the way back to images of medieval kings kneeling in penance to the church.

Warburg’s study of the Renaissance also brought him to reflect on darker aspects of iconography. Like his contemporary, the sociologist Max Weber, he understood the emergence of the modern world in terms of a ‘loss of magic’. (Weber’s term, Entzauberung, is normally translated as ‘disenchantment’). Warburg saw the Renaissance as a crucial moment in this process. It was a time when astrological beliefs and practices were still widespread, with the movements of planets and stars held to predict everything from natural disasters to the course of an illness and the qualities of a person’s character. Yet the era also saw a growing tendency to view the cosmos in terms of impersonal laws, which for Warburg signalled a movement towards greater creativity and reason.

Again, iconography had played an important role. Images had always been implicated in magic, mediating between human beings and the forces of the occult. They could serve as fetishes and idols – objects imbued with magical powers – or as superstitious forms of representation, as with astrological symbols and charts that projected the presence of terrifying deities onto the planets. According to Warburg, Renaissance artists undermined the psychology of magic by presenting mythology in a new way. Their style produced a ‘distance’ between the viewer and the image, a mental space that allowed astrological figures to appear not as active, daemonic forces, but as abstract ideas. Warburg writes of Albrecht Dürer’s portrayal of Saturn, for instance, that ‘the artist has taken a magical and mythical logic and made it spiritual and intellectual’, transforming a ‘malignant, child-devouring, planetary god’ into ‘the image of the thinking, working human being’.

The genius of the Renaissance, for Warburg, was that it could retrieve the energy of past eras, whether the pagan cults, which had created figures such as the Nymph, or the magical traditions associated with astrology, but could also interpret these influences in a way that was aesthetic and rational. Developments in Warburg’s own lifetime made him realise that the modern mind was not immune from relapsing into a primitive relationship with images. When the First World War broke out in 1914, he observed with horror the role of visual propaganda in fomenting hatred, and then suffered a lengthy nervous collapse punctuated by psychotic episodes. As he later wrote to his family, the war ‘confronted me with the devastating truth that unchained, elemental man is the unconquerable ruler of this world’. Warburg did not live to see the political spectacles of the Third Reich, though he did witness Italian fascism. He was at the Vatican in 1929 when Pope Pius XI signed an accord with Mussolini, quipping to a friend that he’d had ‘the chance of my life to be present at the re-paganisation of Rome’.

Likewise, many of the contemporary images Warburg chose for the Memory Atlas hint at the darkness underlying the sophistication of modernity, albeit in a somewhat ironic way. On the panel dedicated to Mars, the ancient god of war and astrological harbinger of violence, Warburg pinned images of the huge airships that captivated the German imagination during the 1920s. The mesmerising power of modern technology, he seems to be saying, threatens to collapse the critical space of reason, imbuing images with magic once again.

This loss of distance is another concept we could apply to our own visual culture. To a far greater degree than the photojournalism of Warburg’s time, internet media tends to diminish our ability to reflect on the feelings that images provoke. Designers are incentivised to make platforms immersive, not to mention addictive, through algorithms that manipulate the limbic system; we often now consume images in a state resembling a trance. More subtly, social media blurs the generic boundaries that, in older media like films and video games, distinguish between reality and unreality. Much of what passes across our screens now, regardless of whether it is genuine, curated or simply fake, appeals to our emotions with a sense of raw reality, of unmediated and authentic life. All of this encourages those impulsive responses – of anger, jealousy, fear or desire – that notoriously govern the online world.

We should not overstate the role of images by imagining they can somehow determine events in isolation from the conditions in which they appear. To suggest that a symbol or gesture exercises power over us regardless of context is itself a kind of magical thinking. Nonetheless, seen through a Warburgian lens, iconography is still a potent force. The events of the summer of 2020 seem especially significant because they illustrated two contrasting ways that emotion can be visually communicated and spread. It can be expressed through a rhetorical gesture like ‘taking the knee’, which is, whatever one thinks of the message, a meaningful political act. It can also – as we’ve seen on many other occasions in recent years – be channelled into contagious images of violence.

Nostalgia for the Concrete Age

This is my latest newsletter published at Substack.

Our forebears did not think as we do. In the late 1940s, the Soviet émigré Berthold Lubetkin served briefly as chief architect for Peterlee, a new town in England’s northeast coalmining region. The glorious centrepiece of Lubetkin’s vision? A highway carving through the middle of the town.

“Young couples,” he dreamed, “could sit on its banks watching the traffic, the economic pulse of the nation, with coal and pig iron in huge lorries moving south, while from the south would come loads of ice-cream and French letters.” (A French letter, in case you were wondering, is a condom).

Today this sounds vaguely dystopian, like a dark fantasy from the pages of J.G. Ballard. The English motorway, now a site of daily torture for thousands, is not easily romanticised. But the strange thing is that, if Elon Musk suddenly stumbled on some new mode of transport that made our roads obsolete, Lubetkin’s poetic view of asphalt and traffic would quickly resonate again. The M6 would become the subject of coffee table books.

That is the pattern we see with other architectural relics from the decades following the Second World War. Concrete is cool again. This week, I visited boutique fashion brand Margaret Howell, which is marking the London Festival of Architecture with a gorgeous photography exhibition on the theme of British cooling towers. These are the enormous, gracefully curving concrete bowls that recycle water in power stations. Except they are now disappearing along with the UK’s coal-fired power stations, and so the 20th Century Society is campaigning to save some of these “sculptural giants” for the nation’s heritage.

Elsewhere at the LFA, there is an exhibition celebrating gas holders, another endangered species of our vanishing industrial habitat. And this is all part of a much wider trend over the last decade. In that time, a number of histories have been published that try to rebut the negative view of Modernist architecture from the 1960s and 70s. As I mentioned in an earlier post, there was outrage among design aficionados when, in 2017, the Brutalist estate Robin Hood Gardens was demolished.

A candlelit dinner in Shropshire, UK, during the 1972 power cuts. In the background are the cooling towers of Ironbridge B power station. Image: David Bagnall.

The mania is still growing. In Wallpaper magazine’s recent list of exciting new architecture books – a good barometer of fashionable taste – there are more than ten which celebrate post-war Modernism and Brutalism.

I welcome this tenderness for a bygone age. We should save our finest cooling towers and council estates from the wrecking ball. Some of them are very beautiful, and certainly an important part of our history. But I am a romantic in these matters; I see just about any scrap of the past as the spiritual equivalent of gold or diamonds. The question is why creatives, a devoutly progressive bunch, have become so attached to the concrete age. They don’t show the same sympathy for the dominant tendency of any other period. Wallpaper does not promote eulogies for Georgian terraces or Edwardian monuments.

There is doubtless an element of épater la bourgeoisie here. Creatives like to kick against conventional taste, which has long regarded mass-produced housing as depressing and Brutalist buildings as an eyesore. Fashion goes where the unfashionable do not, which in the built environment means exposed concrete.

There are deeper reasons of course. They can be seen in Lubetkin’s utopian vision of the highway, a structure that brings workers the rewards their industry deserves. In Britain and elsewhere in western Europe, the post-war decades were a time of unusual commitment to equality, solidarity and social progress, the golden age of the welfare state. This “Spirit of ’45,” as Ken Loach’s sentimental film called it, is uniquely cherished by the British left in the same way the New Deal era is by the American one. 

Crucially, the architecture of the time is seen not only as embodying these ideals, but as embracing a bold, modern approach to form at the same time. It represents the dream that artistic virtue can dovetail with social virtue. This is most obvious in some of the ambitious housing, schools and municipal buildings designed between the 1950s and the 1970s, but even the cooling towers are part of this story. They were born from the nationalisation of Britain’s electricity grid in 1948.

What makes this lost world feel relevant now is that it ended with the arrival of the neoliberal era in the 1980s, and opposition to neoliberal principles has defined progressive politics in recent decades. Champions of the post-war project never fail to mention that, while it was far from perfect and sometimes disastrous, at least there was a commitment to providing decent conditions for everyone. 

Gérard Grandval’s Les Choux de Créteil (1974), photographed by Nigel Green for the new book Brutalist Paris

Still, let’s not pretend this reverence for the past is just about finding inspiration for the future. Empathising with the hopes and dreams of a distant era, savouring its aesthetic flavour, feeling the poignance of its passing: there is a word for this combination of emotional acts. It is nostalgia.

Of course that word is a dirty one now. At the cultural level, it is associated with irrationality, resentment, and hostility to change. It is pinned on old people who vote the wrong way. But that, surely, explains the appeal of the concrete age. Thanks to its creative and political legacy, it provides cover for people of a progressive bent to indulge the nostalgic sentiments they otherwise have to suppress.

It’s unfortunate that such alibis are needed. Nostalgia is not a pathology; it is part of the human condition. A deep-felt sense of loss is the inevitable and appropriate response to the knowledge that all things in this world, good and bad, are transient. The pathos of that fact can be deflected and disguised, but it cannot, ultimately, be denied.

Dilemmas of Displaying the Dead

This essay was published by Unherd in June 2023.

The last weeks of Charles Byrne’s life were nightmarish. Known as the Irish Giant, the seven-foot seven-inch man from Ulster had made his way in 1782 to London, where he earned money by exhibiting himself as a freak. By the end of that year tragedy was overtaking him. He was addicted to alcohol and suffered from the painful effects of a pituitary tumour in his brain, the cause of his gigantism. The accrued savings of his 22 years of life — around £700 — had been stolen in a Haymarket pub.

Even in this condition, Byrne was allowed no dignity. The city’s anatomy schools were eager to dissect his body as a scientific prize. Among these circling vultures, none was more determined than the aptly named John Hunter, eminent surgeon, anatomist, and collector of organic specimens both animal and human.

A horrified Byrne had already rejected Hunter’s offer to buy his corpse and, in a final, desperate bid to escape the surgeon’s saws, asked his friends to encase his body in lead and sink it in the English Channel after he died. But Hunter managed to pay for the cadaver to be secretly removed from its coffin and transported to his home in Earl’s Court. There he boiled it down to its bones and reassembled it as a skeleton. “I lately got a tall man,” he hinted to a friend some years after.

The surgeon’s vast collection of pickled creatures and body parts would later become the nucleus of London’s Hunterian Museum. But last month, when the Hunterian reopened after a lengthy closure, the Irish Giant had been tactfully removed from display. After almost 250 years, John Hunter’s flouting of a dying man’s wishes is catching up with him.

There are, of course, many museums that display the remnants of people wrenched from their graves — or of those never allowed to lie down in them. Stories such as Byrne’s raise uncomfortable questions about this practice. When, if ever, do human remains cease to be human? Does the sanctity of death end at the borders of our own culture and era?

These issues have arisen before. Thirty years ago, the South African government demanded the return of Sara Baartman, a Khoisan woman who in the early-19th century was paraded around Europe, only to be dissected after her death and displayed in a Paris museum until the Seventies. But the morality of displaying human remains has become more broadly contentious in recent years.

In 2020, Oxford’s Pitt Rivers Museum removed all of its human exhibits, including shrunken heads from Amazonia’s Shuar tribe, claiming that “visitors often understood the Museum’s displays of human remains as a testament to other cultures being ‘savage,’ ‘primitive’ or ‘gruesome,’” which “reinforced racist stereotypes”. Numerous British and American museums have changed their method of displaying Egyptian mummies, an enormous crowd-pleaser, using terms such as “mummified person” in an effort to humanise the objects.

It is striking, then, how proudly the Hunterian Museum now reveals its gruesome contents to the public. It seems Charles Byrne was omitted because, like Sara Baartman, he is a high-profile case, subject to ongoing controversy after the British Medical Journal covered it in 2011. But the museum is still packed with human remains, presented no differently from the countless animal specimens floating eerily in their glass jars. There is row upon row of skulls gathered from numerous continents, pickled brains, warped spines, infant skeletons, cabinets of teeth, all manner of internal organs, and foetuses ranging from nine weeks to full term. It is a truly ghoulish spectacle.

Hunter claimed to have “dissected some thousands” of human corpses. A small number did consent; the Georgian upper classes were warming to the idea of donating their bodies for scientific enquiry. An Archbishop of Canterbury, several military leaders and a serving prime minister (the Marquess of Rockingham) were among those who volunteered for Hunter’s knife.

But the vast majority who ended up in 18th-century anatomy theatres had no say in the matter. Some were wrestled away from their families beneath Tyburn Tree, the gallows in Hyde Park where dozens of criminals were hanged every year. Others were acquired through the bribing of undertakers. Most commonly though, they were stolen from their graves by gangs of professional body snatchers. Hunter himself almost certainly robbed graves in his youth, when he spent 12 years learning the ropes at his brother’s anatomy school.

The grim provenance of Hunter’s collection is addressed only in a brief wall text at the museum. Acknowledging the specimens were gathered “before modern standards of consent”, it states: “We recognise the debt owed to those people… who in life and death have helped to advance medical knowledge.” Why, then, has the display of Egyptian mummies come to be regarded as a sensitive problem, but less so the display of an unborn child probably removed from the womb of a stolen corpse?

One reason is simply that the humanity of the dead only becomes an issue when someone makes it an issue. The controversy over mummies, for instance, reflects a particular convergence of political beliefs: some modern Egyptians, not to mention the modern Egyptian state, are now identifying as descendants of the ancient civilisation on the Nile. At the same time, Western curators have become desperate to distance themselves from the colonial period during which these objects were acquired. By contrast, there are few people in Britain who feel so strongly about scores of impoverished Londoners pulled from their shallow graves in the dead of night.

But there is another important difference. The claim that Hunter’s activities “have helped to advance medical knowledge” is a powerful one, linking his specimens with the achievements of modern medicine. It is also clearly true. Without a legal way to acquire bodies — and with religious beliefs making voluntary dissection unthinkable to many — only stolen corpses could produce the beginnings of the anatomical knowledge that we take for granted today. The museum subtly emphasises this by charting the development of surgery from the early-modern period to our own time: rather dull after the horror show of Hunter’s collection, but that’s the point I suppose.

Charles Byrne’s skeleton might be too controversial to display, but the museum has insisted on keeping it due to its medical value. It helped an American neurosurgeon to identify pituitary gigantism in 1909, and a century later, allowed scientists to find a genetic component in the growth disorder.

What all of this points to is the special status of medical science in Western countries today. Museums and other cultural institutions are increasingly critical of the heritage they embody because, ultimately, they no longer believe it has served a positive purpose that could mitigate the brutality of the past. This goes far beyond the problem of human remains; as Guardian critic Jonathan Jones notes about Tate Britain’s recent guilt-laden rehang: “Maybe it doesn’t want to promote British art, for it seems to disapprove of much of it.” Yet there are not many people arguing that we should abandon the benefits of modern medicine since it, too, has a disturbing history. This is one area where progress is still understood as building on the past rather than overturning it: the only acceptable agenda for healthcare is more and better.

But Hunter’s collection also reveals a deep tension in the way we value medical science. If we consider it dehumanising to display body parts in jars, it is partly because we now struggle to recognise blood and tissue as human. Our technical mastery over biology has led to our alienation from it. Just as we expect our meat to arrive immaculately packaged in the supermarket, carrying no trace of the abattoir, so we banish birth, illness, and death from our everyday lives, consigning them to the clinical world of the hospital. We have never been more preoccupied with the condition of our bodies, yet we don’t like to see those bodies for what they really are.

Christopher Wren: Godfather of the Technocrats

This article was originally published by Unherd in February 2023.

“You have, ladies and gentlemen, to give this much to the Luftwaffe: when it knocked down our buildings, it didn’t replace them with anything more offensive than rubble. We did that.” That was the famous barb which the then-Prince Charles aimed at modern architecture in 1987. They were not idle words: thanks to this speech, the architect Richard Rogers lost his opportunity to redesign an area next to St Paul’s Cathedral, the 17th-century Baroque masterpiece by Christopher Wren.

It was not the last time the curmudgeonly prince would intervene against Rogers, nor was it the last time Rogers would have to make room for St Paul’s. His skyscraper on Leadenhall Street, known as The Cheesegrater, owes its slanting profile to rules that protect certain views of Wren’s cathedral.

As we mark the 300th anniversary of Wren’s death, it’s worth noting a certain irony in all of this. St Paul’s may well be the nation’s favourite building, and Wren our greatest architect, but he is more like the grandfather of Richard Rogers than his antithesis.

This will strike some as blasphemous: Rogers’ buildings, which include the Centre Pompidou in Paris and London’s Millennium Dome, are technocratic in the most profound sense of the word. With their machine-like forms and conspicuous feats of engineering, they elevate efficiency and expertise into an idol to be worshipped, a religion in its own right. Impressive as these structures are, one can understand why they would make Charles despair of whether “capitalism can have a human face, instead of that of a robot or a word processor”.

But technocratic monuments emerge from technocratic societies, where a fixation with how things work drowns out the question of what they are actually for. The natural and human worlds are treated as processes to be managed, with politics reduced to measurable outputs: higher growth, fewer emissions, a more equal distribution of benefits. Technology is held in awe, but more for its functional qualities than any greater purpose it serves.

This could be a description of our own society, and Rogers’ buildings an honest reflection of it. But these tendencies did not appear from nowhere; they have deep roots in the evolution of modern forms of knowledge and power. And those roots lead us back to Christopher Wren.

Picture the scene: it’s 1656, in a house on Oxford High Street. Three members of what will become the Royal Society, an engine of the scientific revolution, are debating whether poison can be administered directly into the bloodstream. Wren, a 23-year-old Fellow of All Souls College, thinks it can. But how to be sure? One of his accomplices, the pioneering chemist Robert Boyle, duly provides a dog, and the experiment begins. Using various improvised instruments, including a syringe fashioned from a goose quill and pig’s bladder, Wren makes an incision in the panicked dog’s leg, successfully injecting opium into a vein.

This episode was later recognised as the first recorded use of an intravenous anaesthetic. (The drugged dog, Boyle reports, recovered after being whipped around the garden for a while.)

It was experimental science like this that occupied Wren for the first two decades of his adult life, not architecture. He was mainly an astronomer, though he pursued all manner of experiments. He studied Saturn’s rings and the lenses of a horse’s eye; he investigated atmospheric pressure and dissected the human brain; he removed the spleen of a puppy and presented the king with a model of the moon.

This was the birth of modern science, but it was also the critical moment when the question of how began to displace the what and the why. These early scientists were driven by a growing realisation that our intuitions did not grasp the true structure of reality. Their response, at least in England, was to turn away from the philosophical mission of understanding what existence is and what its purpose should be, focusing more narrowly on how natural processes work. Already this shift was reflected in a new emphasis on technology: instruments to measure what the senses could not, and gadgets to show mastery of scientific principles. The patterns of the cosmos were mapped in the objective language of mathematics.

This outlook would eventually lay the ground for the technocratic management of society. And Wren, when he turned to architecture, was very much the technocratic type. Employed as Surveyor of the King’s Works, he was a talented but remote civil servant. After the Great Fire of 1666, his job was to Build Back Better, something he managed very well thanks to his relish for bureaucracy and admin. He even wanted to redesign the City of London in the form of a grid. Wren’s classical style did not reflect a deep connection with artistic tradition; it was, as the historian John Summerson pointed out, a matter of good grammar.

In fact, though Wren designed 56 London churches (of which only 23 remain), his readings of scripture suggest an almost Richard Dawkins-like literalism in regard to religion. He liked to demonstrate how astronomy could explain biblical miracles, and treated Samson’s ability to pull down temples as a maths problem. One of his admiring biographers sums up his mindset perfectly: “How a thing worked — whether that thing was a spleen or a comet, a palace or a government department — was as important to him as the end product.”

What really made Wren tick was impressive engineering, mathematical rigour, and finding the most economical solution to a practical puzzle. This is already evident in his first building, the Sheldonian Theatre in Oxford, where he devised an ingenious system of trusses to cover a lengthy span. His London churches show his love of geometry — “always the True test” of beauty for Wren — as seen in their centralised plans, barrel-vaulted ceilings, and their pursuit of symmetry wherever possible.

It’s not entirely a coincidence, then, that both Christopher Wren and Richard Rogers built domes near the Thames. When Wren proposed this idea for St Paul’s, it was, like Rogers’ fibreglass egg, something alien to London, as well as a state-of-the-art engineering display.

It’s true that Wren’s churches don’t feel overly stringent or rational, and this is a sign of his incredible intellect. Their precise balance of spatial and decorative elements can, at their best, create a sense of serene harmony. Nonetheless, later generations often found an artificial quality in his buildings; one 18th-century poet thought him “extremely odd, / To build a playhouse for the church of God”. It’s difficult to judge his work today because it is now contrasted with a modern architecture that fetishises pure geometry, technology, and the expression of structural principles. Yet these are ideas which Wren himself anticipated.

More than that, Wren and his fellow scientists unleashed a quest for knowledge which eventually created a world so complex it needs technocrats to manage it, as well as architects like Rogers to glorify it. It is a great irony that the designer of Britain’s most beloved cathedral also laid the foundations for a society so starved of spiritual meaning.

The Rise and Fall of the Creative Class

This essay was first published at The Pathos of Things newsletter.

There is nothing inherently creative about vintage furniture, repurposed industrial materials or menus in typewriter font, but if you found yourself in a coffee shop with all of these elements present, then “creative” would be a common way to describe the vibe. Even more so if there was a yoga studio upstairs and a barista with neck tattoos. 

This visual language could also be called trendy or hipster, but the connotations are much the same. It is meant to evoke the imagined lifestyle of an urban creative – someone in or around the arts and media crowd – as it might have looked in Hackney or Williamsburg circa 2010. It signifies an attitude that is cultured but not elitist, cosmopolitan but not corporate, ethical but not boring, laid-back but still aspirational. In its upmarket versions (think more plants, more exotic words on the menu), the “creative” idiom implies a kind of refined hedonism, an artistic appreciation of beautiful and meaningful experiences.

Whether creatives can actually be found in such settings is beside the point, for once a lifestyle has been distilled into aesthetics it can be served to anyone, like an espresso martini. Indeed, the generic symbols of the creative lifestyle – suspended ceiling lights with large bulbs and metal hoods are an obvious example – have now spread everywhere, into Pizza Express restaurants and bankers’ apartments. 

The strange thing is that this triumph of the creative class in the realm of cultural capital has gone hand in hand with its economic evisceration. If you did see an actual creative in our imagined coffee shop – a photographer perhaps, or dare I say a writer – he or she would most likely be working frantically on a laptop, occupied with some form of glorified gig-economy job, or struggling to get a beleaguered business off the ground, or grinding away at a commercial sideline that squeezes out actual creative work.

Everyone wants to buy into the dream of the creative lifestyle, or at least drink cocktails in a place that evokes it, but for most creatives this is as much a fantasy as it is for everyone else.

If there is one institution that can help us understand this state of affairs, it is the Soho House network of private members’ clubs. Founded by the restaurateur Nick Jones, the first Soho House opened in 1995 on London’s Greek Street. It joined a number of exclusive new venues aimed at arts and media professionals, offering, as its website tells us, a place for “like-minded creative thinkers to meet, relax, have fun and grow” – or at least those deemed worthy of membership. In 2003, a Soho House opened in New York’s Meatpacking District, one of the first steps in a dizzying expansion which has seen some forty members’ clubs appear everywhere from West Hollywood to Barcelona, Miami to Mumbai.

In terms of aesthetics, Soho House did a lot to define the “creative” style. Ilse Crawford’s interior design for the New York venue became a landmark of sorts. Ranging over six floors of a converted warehouse, whose raw industrial features were emphasised rather than played down, it announced the hipster affinity for obsolete, forgotten or simply nostalgic spaces. A bedroom where graffiti had been left on the wall was apparently a members’ favourite. Crawford’s furnishings likewise set the trend for combining antiques, modern design classics and found objects in an eclectic approach that tried to be both modern and comfortable.

For all its apparent authenticity and bohemian flavour, this style has since been exported around the world as seamlessly as McDonald’s, not least within the Soho House empire itself. The brand prides itself on giving every venue a local accent, but what it has really shown is the uncanny way a design formula can make very different settings look the same.

In my reading, what all this illustrates is the emergence of a new creative elite – film producers and actors, fashion designers and models, publishers and magazine editors, musicians, advertising executives and so on – whose ambition and self-confidence were such that they did not want to merge with the existing circles of privilege. The exclusivity of these members’ clubs, buttressing the special status of “creativity,” was not about keeping the plebs out. It was about setting themselves apart from the philistines of the City of London and Wall Street, and from the stale elitism of Old Boys’ Clubs.

“Unlike other members’ clubs, which often focus on wealth and status,” explained the Soho House website a few years ago, “we aim to assemble communities of members that have something in common: namely, a creative soul.” When they first appeared, the brand’s distinctive aesthetics drew a contrast, above all, with the slick corporate interiors of luxury hotels in the 1990s. “No suits” was both the dress code and a principle for assessing membership applications, and those deemed “too corporate” have on occasion been purged. Another functionally equivalent measure was the “no-assholes rule,” though this did not stop Harvey Weinstein from winning a place on the initial New York membership.

But crucially, there is a wider context for the appearance of this creative elite. The first Soho House opened against the backdrop of rising excitement about the “creative industries,” a term adopted by Britain’s New Labour government in 1998. This idea was hopelessly baggy, grouping together advertising and architecture with antiques and software development. Nonetheless, it distilled a sense that, for the post-industrial societies of the west, the future belonged to those adept in the immaterial realms of communication, meaning and desire. In technical terms, the organising principle for this vision was to be the creation and control of intellectual property.  

Economists insisted that, beyond a certain income threshold, people wanted to spend their money on artistic and cultural products. A supporting framework of information technology, university expansion, globalisation and consumer-driven economic growth was coming into view. And a glimpse of this exciting future had already appeared in the Cool Britannia of the 1990s, with its iconoclastic Young British Artists, its anthemic pop bands, its cult films and edgy fashion designers. 

Institutions like Soho House provided a new language of social status to express these dreams of a flourishing creative class, a language that was glamorous, decadent, classy and fun. The song Ilse Crawford used when pitching her interior design ideas – Jane Birkin and Serge Gainsbourg’s 69 Année Érotique – captures it nicely, as does a British designer’s attempt to explain the club to the New York Times: “Think of the Racquet Club but with supermodels walking through the lobby.” The appeal of this world still echoes in the fantasy of the creative lifestyle, and surely played a part in persuading so many in my own generation, the younger millennials, to enter creative vocations.

So what happened? Put simply, the Great Financial Crisis of 2007-8 happened, and then the Great Recession, rudely interrupting the dreams of limitless growth to which the hopes of the creative industries were tied. In the new economy that formed from the wreckage, there was still room for a small elite who could afford Soho House’s membership fees, but legions of new graduates with creative aspirations faced a very different prospect. 

Theirs was a reality of unpaid internships and mounting student debts, more precarious and demanding working lives, a lot of freelancing, poor income prospects, and reliance on family or second jobs to subsidise creative careers. Of course some of the pressures on young people in the arts and media, like high living costs and cuts in public funding, vary from place to place: there is a reason so many moved from the US and UK to Berlin. By and large though, the post-crash world starved and scattered the creative professions, squeezing budgets and forcing artists into the grim “independence” of self-promotion on digital platforms.

One result of this was a general revulsion at capitalism, which partly explains why artisan ideals and environmentalism became so popular in creative circles. But despite this scepticism, and even as career prospects withered, the creative lifestyle maintained its appeal. In fact, the 2010s saw it taking off like never before.

Young people couldn’t afford houses, but they had ready access to travel through EasyJet and AirBnB, to content through Spotify and Netflix, to a hectic nightlife through cheap Ubers, and they could curate all of these experiences for the world via Instagram. They could, in other words, enjoy a bargain version of the cultured hedonism that Soho House offered its members. The stage-sets for this lifestyle consumerism were the increasingly generic “creative” spaces, with their exposed brick walls and Chesterfield armchairs, that multiplied in fashionable urban districts around the world.

Perhaps the best illustration of this perverse situation is the development of the Soho House empire. Alongside its exclusive members’ clubs, the company now owns a plethora of trendy restaurant chains for the mass market. You can also get a taste of the Soho House lifestyle through its branded cosmetics, its interior design products, or a trip to one of its spas. With new membership models, freelancers can take their place among the massed vintage chairs and lamps of the brand’s boutique workspaces. There was even talk of going into student accommodation.

And so an institution that symbolised the promise of a flourishing creative class now increasingly markets the superficial trappings of success. As a kind of compensation for the vocational opportunities that never materialised, creatives can consume their dreams in the form of lifestyle, though even this does not make them special. The 2010s were also the decade when the corporate barbarians broke into the hipster citadels, occupying the clothes, bars and apartments which the creative class made desirable, and pricing them out in the process.

In one sense, though, the “creative industries” vision was correct. Intellectual property really is the basis for growth and high incomes in today’s economy; see, for instance, the longstanding ambition of the Chinese state to transition “from made in China to designed in China.” But the valuable intellectual property is increasingly concentrated in the tech sector. It is largely because IT and software are included that people can still claim the creative industries are an exciting area of job creation.

The tech world is, of course, a very creative place, but it represents a different paradigm of creativity to the arts and media vocations we inherited from the late 20th century. We are living in a time when this new creativity is rapidly eclipsing the old, as reflected by the drop in arts and humanities students, especially in the US and UK, in favour of STEM subjects. Whether tech culture will also inherit the glamour of the declining creative milieu I can’t say, but those of us bred into the old practices can only hope our new masters will find some use for us.
