The Passing of Post-Modernism: Cultural Influences in Design

Robert Zwirn

If you were considering a building in the post-modern idiom, you are a bit late. In the fast-paced world of style and fashion, post-modernism is passé. The latest movement in architecture is Deconstruction, or decon as it is affectionately, or derisively, known. Theoretically, this new vision of the built environment is based on French literary and linguistic analysis. Visually, it harks back to the work of the Russian Constructivists in the years immediately following the Revolution.

What passes for the avant-garde in architecture has moved on, leaving post-modernism to the second-tier architects and the shelter magazines — the cultural equivalent of trickle-down economics. You do not even have to hire an architect, or even build a building; you can buy a Michael Graves teapot or Robert Stern designer sheets. To be sure, post-modern buildings are still being designed and constructed, while most decon work is going on at schools of architecture and will never be built. And despite deconstruction’s substantive pedigree of French thought and Russian expression, it appears to be simply another stylistic riff in our insatiable late twentieth-century hunger for the novel.

Unless you read certain journals or frequent a few museums in New York or Los Angeles, you are not likely to be aware of any of this. Architecture, once the most tangible manifestation of our culture, has become fashion — transitory and trivial. Architects still have the responsibility of responding to their era and its place in the chronological continuum, but we have found that simply giving our time in history a name is an easier and more lucrative endeavor than coming to terms — emotionally and intellectually — with the zeitgeist. So long as we can name, we need not understand. “Post-modern,” “deconstruction,” even “modern” are no longer keys that unlock a time and place as do “gothic” or “renaissance.” They are more like those clever graphic signs that distinguish between the genders on restroom doors.

THE MODERN URGE TO NAME

Perhaps people have always had the ability to name or label the era in which they live. The desire to do so, however, appears to be a much more recent phenomenon. The identification of a zeitgeist was once the product of retrospection and archaeology. Today, that labeling takes place before the events and accomplishments of the era can be evaluated. In some extreme cases the name precedes even the events — a late twentieth-century take on the emperor’s new clothes.

Those of us who teach history have long regaled our students with cautionary tales about accumulating evidence before judgment or identification. Time and detachment are elemental. Epochs do not begin on a given day, month, or year. We employ clever anecdotes to make our point — folks did not wake up sometime in fifteenth-century Florence and say, “Gee, I have this urge to study antiquity and science and to stop relying on superstition. I guess this must be the Renaissance.”

That lesson is a tough sell in today’s classroom. We live in an age when information is instantaneous and pundits lie in wait on television, radio, and the World Wide Web, eager to tell us which age is dawning and why.

Our forebears were less audacious. They were loath to label. They knew the tremendous responsibility that comes with naming. Although they were rare, there were those who stepped out of their epoch to see it whole. Usually, the impetus was religious or moral. Often it involved great personal peril. The Bible tells us of prophets — mystical men who often paid a heavy price for telling the people where they were in the great continuum. The classical world had philosophers. The Enlightenment had scientists. Even the so-called Dark Ages had hermits and alchemists. We still read the words of these folks, some of whom are several millennia removed. Few, if any, were given to nomenclature.

In our “information age,” where we daily accept such chronological oxymorons as “pre-boarding” or “stock futures,” labeling has become a lucrative lark rather than a burdensome responsibility. It is the ultimate triumph of form over substance: one needs only hubris and volume to be a pundit.

The perils are still great, but they are no longer personal. Those who label do so with impunity. No commentator is banished for being premature, or wrong. In the process we have lost authenticity — integrity, sincerity, straightforwardness. We are left with the inauthentic, the fake. Every aspect of our culture has suffered this indignity; architecture, the mother of the arts, has suffered perhaps a bit more than the others.

Architects, that odd mix of artist and artisan, of visionary and pragmatist, have long relied on nomenclature as a way of dealing with culture’s complexities. This reliance dates at least to the nineteenth century and the amassing of America’s first great industrial fortunes. The ensuing taste for eclecticism gave architects and their patrons the impetus to identify and demand myriad histories and the styles associated with them. Archaeological discoveries in the classical world gave gravity to the capricious notions of the newly rich. The rapid accumulation of great wealth and scant concern for introspection, let alone retrospection, allowed us to forget that such labels as classical and gothique were not products of their own time, but constructs formulated in much later eras. Such historical constructs were based on a trinity of convergence — the desire to understand the past, the tools to analyze it, and the distance from which to appreciate it.

The sobriquet “post-modern” did not come through this trinity. For post-modernism to be proclaimed and birthed, history was maligned, misunderstood, and misappropriated. As with other epoch-heralding, the post-modern pronouncement was premature. Post-modernism’s demise after a scant two decades was a death foretold.

MODERNISM AND POST-MODERNISM

The two events often cited as the progenitors of the post-modern are the publication of Robert Venturi’s Complexity and Contradiction in Architecture, in 1966, and the dynamiting of architect Minoru Yamasaki’s ill-conceived and mismanaged Pruitt-Igoe (1956-74) subsidized housing project in St. Louis. In a discipline often split along theoretical and practical lines, it is not surprising that such different events would be yoked as seminal.

Despite the essential dissimilarities between a set of ideas and their tangible manifestations, there is a strong correlation between building and book. Post-modernism, from its beginning, was an often violent reaction against modernism. Both Venturi’s polemic and St. Louis’s obliteration of a totem are powerful, but singularly negative responses to problems (mis)laid at the door of modernism.

For the purposes of this essay, and in conformance with generally accepted thought, modernism had its birth in central and western Europe in the period immediately after the First World War. While its nascence can be seen in the work of several late nineteenth- to early twentieth-century Europeans, its most influential architects were Walter Gropius (1883-1969) and Ludwig Mies van der Rohe (1886-1969) in Germany, and Charles-Edouard Jeanneret, the Swiss who took the name Le Corbusier (1887-1965), in France. Whatever their built accomplishments — and they were considerable — one could argue that their influence was as much the result of their writing and theoretical work as of their buildings. Le Corbusier’s notion of the house as “a machine for living in” and Mies van der Rohe’s powerfully epigrammatic “Less Is More” have become part of our cultural parlance.

Both Gropius and Mies settled in the United States when it became clear that Hitler and Albert Speer did not see the Bauhaus (the School of Design in Dessau, begun by Gropius and later headed by Mies) as part of the Third Reich’s future. Gropius ran the Graduate School of Design at Harvard, numbering among his students Philip Johnson, I.M. Pei, Edward L. Barnes, and Paul Rudolph. Mies oversaw things at the Armour Institute, now the Illinois Institute of Technology, in Chicago.

Even before their arrival in the United States, the influence of European modernism was already being felt. Philip Johnson, in one of his several pre-architect incarnations, was the first curator of architecture at New York’s Museum of Modern Art. (Johnson would later be the architect of what many consider the first post-modern building of consequence, the AT&T tower in Manhattan; later still, he would reappear as the curator of the Museum of Modern Art’s Deconstructivist show.) He and his friend, the architectural historian Henry-Russell Hitchcock, journeyed to Europe in the late 1920s to see the modernists, speak with them, and collect their work.

The exhibit and the book that followed were dubbed The International Style. Had not the Depression and the Second World War intervened, it is likely that European modernism would have had currency earlier. In the years after the war, the international style, with its glass, volumetric, asymmetrical, flat-roofed buildings, shorn of all applied ornament, would come to dominate corporate and urban America. It was against this vision that the post-modern movement reacted.

By the early 1960s, what Venturi saw was an architecture devoid of humanity. The meta-rational stripping of surface embellishment, the active ignoring of the lessons of historical and regional tradition, and the assembly-line esthetic were, for Venturi, anathema. The sleek architecture of corporate America was seen, by the late 1960s, as one more example of the commodification of our lives and the growing alienation between corporate and popular cultures. In an era of student unrest, anti-war protests, bus boycotts, pop art, and free love, the glass boxes lining our city streets and the “clean and sanitary” high-rise repositories for the poor were no longer seen as avant-garde, or as benevolent social experiment, but as anachronistic bastions of a ubiquitous power structure. We began to wonder whether progress really was “our most important product.”

Just as the European modernists blamed the old order of the nobility and the clergy for World War I, so the post-modern movement gained its initial strength from its reaction against the military-industrial complex, first brought to our attention, ironically, by a departing President Eisenhower. The irony lay in the symmetry: as the modernists had seen the pre-industrial order of the nineteenth century as the cause of all human suffering, so the radicals of the 1960s saw the post-industrial era in the same way.

While the modernists followed the path of the Enlightenment — putting their faith in the rational — the Aquarius generation fled the industrialized cities and the Eisenhower suburbs for a return to an imagined, romantic utopia of nature and community. Vernacular architecture became a talisman of truth and goodness, and Mies’ Germanic “Less Is More” was replaced with Venturi’s Italianate “Less Is a Bore.” Lang’s Metropolis became Fellini’s La Dolce Vita. The house was no longer (if indeed it ever was) seen as a Corbusian “machine for living,” but as a cultural icon designed to bring us in touch with our regional and historical roots.

In their time, both the modernist and the post-modernist polemics had considerable power and tremendous value. On this side of the Atlantic, however, the architectural manifestation of each polemic was shorn of its political imperative. What The International Style did for modernism was to strip it of its socio-political roots and give American architects and their patrons a cookbook — if not for modernity, at least for the image of modernity. Given corporate America’s appetite for efficiency and image, the steel and glass of Mies and his imitators became the coin of the corporate realm. Indeed, if Mies himself ever had a political agenda, and there is evidence that he did not, it was quickly lost in the headiness of America’s post-Second World War power and prosperity.

THE CO-OPTING OF POST-MODERNISM

Sadly, post-modernism suffered the same fate. In the wake of consumerism, commodification, and gentrification, the polemics about humane, vernacular, and historical issues were soon co-opted by the now multi-national corporate megalith. In the hands of even the nation’s most talented practitioners, post-modernism, the consciousness, became post-modernism, the style. The fast-buck, borrowed-money wealth of the 1980s bought a garish and false history, much as the robber barons had bought a meretricious and specious history a century earlier. Yet, despite this, or perhaps because of it, the modernist split between corporate and popular cultures was all but obliterated. Disney, one of the prime movers of post-modernism, has succeeded in making the corporate culture the popular culture. In that venture post-modernism has succeeded beyond even Robert Venturi’s imagination.

This co-opting of popular culture is most poignantly read in the fate of our urban, vernacular working-class housing stock. In the wake of the well-intentioned preservation movement, the consumer fetishes fueled by Madison Avenue, and the steady decline of working-class and labor-union political power, the vernacular of the inner cities has become the gentrified and dandified home of the new rich.

The neo-traditional planning of new pedestrian-oriented communities like Seaside, Florida, which began as a reaction against suburban sprawl, has become the kitsch backdrop for an upper-middle class that never sat on front porches and would not know how to.

Those whom the post-modern polemic was going to help — the disenfranchised, with limited social and political resources — have been relegated to the decaying modernist suburbs. In the same way, the working class that was to benefit from modernism was warehoused in banal high rises while the wealthy furnished their 1950s penthouses with Mies’ Barcelona chairs and Corbu loungers.

Thus, rather than the lessons of the past being revisited and restored, as would have happened if our epoch-heralding had followed the trinity of understanding, analysis, and appreciation, post-modernism has given us nostalgia — in quaint, re-created communities that never were — furnished with buildings as fashion. This misuse of the vernacular as costume prevents us from recognizing the value and adaptability of the vernacular as knowledge. Nostalgia, as Fran Lebowitz has noted, is what you get “when you have no ideas.”

In 1899, Louis Sullivan, a modern prophet who paid dearly for his efforts to tell his contemporaries where they were in the continuum, addressed the Chicago Architectural Club:

Accept my assurance that (the architect) is and imperatively shall be an interpreter of the national life of his time… you are called upon, not to betray, but to express the life of your own day and generation… a fraudulent and surreptitious use of historical documents, however suavely presented, however cleverly plagiarized, however neatly repacked, however shrewdly intrigued, will constitute and will be held to be a betrayal of trust.

In our rush to label and predict and to blame another epoch for our own sins, we lost the passion and the authenticity of what might have become post-modernism. Today, we see it as just another style, “suavely presented” or not. Labeling an era is a daunting responsibility. Labeling your own era may well be the ultimate folly. There is, after all, a reason why children do not name themselves. There is something terrifically self-conscious about naming. And self-consciousness should never be confused with self-awareness.

Robert Zwirn, AIA, is the director of the School of Architecture at Louisiana State University. Before going to Baton Rouge, he was the chair of the Department of Architecture at Miami University in Oxford, Ohio, and worked with I. M. Pei & Partners (now Pei Cobb Freed & Partners) in New York City.

Copyright National Forum: Phi Kappa Phi Journal, Spring 1996.
