Paul Farrington

Once upon a time, schools of architecture displayed plaster casts of Ionic capitals and Renaissance portals for the edification of their students. Visit any school today and you’re likely to encounter, either in one of the corridors or standing outside the building, structures resembling giant three-dimensional jigsaw puzzles made of interlocking pieces of laser-cut plywood. Such constructions, no less iconic than the old plaster casts, are the product of classes in the academy’s current architectural obsession—parametric design.

Google parametric design and the first site that you will find is not a Wikipedia entry but a blog, Rethinking Architecture. The author, a Polish architect named Jaroslaw Ceborski, is rather vague about definitions, but he writes enthusiastically: “It’s quite easy to distinguish something designed using parameters and algorithms from the rest, so it gives us a message, ‘I’m contemporary, I was rethinked.’ ”

Tangled grammar aside, Ceborski captures the preoccupation with using parametric design to create new “contemporary” forms, as evidenced regularly in student projects, and less frequently in the façades of trendy boutiques, edgy condominiums, and upscale department stores. One of the largest built examples is Foreign Office Architects’ cruise ship terminal in Yokohama, Japan, a pier whose sinuous walking surface is said to have been inspired by traditional wave paintings. According to a primer on parametric design by the AIA California Council, this project proves that “complex building forms correlated to a series of imagined or perceived parameters could be organized and constructed on a grand scale with dynamic, real-world results.”

“Imagined or perceived parameters” sounds pretty arbitrary. Indeed, the algorithms that underlie parametric modeling are altered seemingly at will, and can rapidly churn out a variety of forms from among which the designer can choose. Perhaps that’s why parametric design is so popular with students. Renzo Piano, Hon. FAIA, once told Architectural Record, “You know, computers are getting so clever that they seem a bit like those pianos where you push a button and it plays the cha-cha and then a rumba. You may play very badly, but you feel like a great pianist.”
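
To see how little discipline the underlying machinery imposes, consider a minimal sketch of the workflow, written here in Python with invented parameters (no real modeling package is involved): a form is reduced to a handful of numbers, and sweeping those numbers churns out a family of candidate shapes.

    import math

    def facade_profile(amplitude, frequency, taper, points=50):
        """One candidate facade curve generated from three numeric parameters."""
        return [
            (t, amplitude * math.sin(frequency * math.pi * t) * (1 - taper * t))
            for t in (i / (points - 1) for i in range(points))
        ]

    # Sweeping the parameters churns out a family of candidate forms;
    # choosing among them is a matter of eye, not analysis.
    variants = [
        facade_profile(a, f, tp)
        for a in (0.5, 1.0, 2.0)   # amplitude of the undulation
        for f in (1, 3, 5)         # number of waves
        for tp in (0.0, 0.5)       # how strongly the wave dies out
    ]
    print(len(variants), "variants from three parameters")

The point is not the geometry but the procedure: the machine proposes, and the designer merely disposes.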

Even in experienced hands, parametric programs can produce alarmingly undisciplined results. The 2010 Guangzhou Opera House by Zaha Hadid, Hon. FAIA, is a poster child for the caulking industry. The Harvard University historian Antoine Picon, author of Digital Culture in Architecture, observes that “the capacity of the computer to transform almost every formal choice into a viable constructive assemblage reinforces the possibilities offered to the architect to play with forms without worrying about their structural implications too much.” As Picon also points out, the disadvantage of this play, apart from elevated construction costs—and caulking issues—is that the morphological forms produced are oblivious to the past. This gives parametrically designed buildings an up-to-the-minute quality. Although they look sci-fi futuristic, they are also curiously one-dimensional, for nothing ages faster than yesterday’s vision of the future. Just ask Jules Verne.

A view from above of the four-story lobby in Zaha Hadid’s Guangzhou Opera House in China, whose sinuous shapes were computer-generated. (Photo: Virgile Simon Bertrand)

Not all parametrically designed buildings are “architecture rethinked.” In the hands of Nicholas Grimshaw, AIA, and Norman Foster, Hon. FAIA, computational tools are used in the service of mainstream Modernism, as with the curved structure of Grimshaw’s Waterloo International Terminal in London, or Foster’s undulating courtyard roof of the American Art Museum and National Portrait Gallery in Washington, D.C.

The spherical geometry of the ArtScience Museum at Moshe Safdie, FAIA’s Marina Bay Sands in Singapore is based on a series of spiraling and converging arcs. The first parametric studies were done in the graphics software Maya, according to Safdie principal Jaron Lubin, Assoc. AIA. “The team built the model such that one could adjust isolated geometric parameters to test different design options very quickly.” Later, the architects shifted to Rhino in order to share 3D information with structural engineers at the global design firm Arup, who pushed it into GenerativeComponents, a parametric program that integrates with Building Information Modeling.
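
The workflow Lubin describes can be suggested with a toy sketch, in Python rather than Maya or Rhino, and with invented numbers: each arc is generated from a few named parameters, and adjusting one of them in isolation regenerates the whole family.

    import math

    def spiral_arc(start_radius, growth, turns, samples=100):
        """Points along a logarithmic spiral arc: r = r0 * e^(growth * theta)."""
        pts = []
        for i in range(samples):
            theta = 2 * math.pi * turns * i / (samples - 1)
            r = start_radius * math.exp(growth * theta)
            pts.append((r * math.cos(theta), r * math.sin(theta)))
        return pts

    # Adjust one isolated parameter, say the growth rate, and the
    # entire family of arcs regenerates to test the new design option.
    family = [spiral_arc(start_radius=1 + 0.3 * n, growth=0.12, turns=1.5)
              for n in range(10)]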

Then there is Patrik Schumacher, who has promoted what he (awkwardly) calls “parametricism,” not merely as a useful tool, but as the enabler of an entirely new kind of architecture, a new aesthetic. Parametricism means no more axes, no more regularity, no more symmetry—nothing that smacks of the great architecture of the past. “Avoid repetition, avoid straight lines, avoid right angles, avoid corners, avoid simple repetition of elements,” he advises in the defining manifesto he wrote for the 2008 Venice Architecture Biennale. “Hybridize, morph, deterritorialize, deform … consider all forms to be parametrically malleable.” Stated that way, parametricism sounds as if it has more to do with taste than with problem-solving.

Schumacher describes parametricism as a deliberate response to an increasingly heterogeneous society. “The task is to develop an architectural and urban repertoire that is geared up to create complex, polycentric urban fields, which are densely layered and continuously differentiated,” he writes.

That society has become more fragmented and heterogeneous is unarguable, but the conclusion that a fragmented public wants—or needs—a fragmented architecture strikes me as idiosyncratic. What characterizes modern society is not confusion, but a profusion of choices—in movies, music, entertainment, information, food, and dress. No wonder we have such a wide range of building designs: traditional as well as avant-garde, familiar as well as unusual, Cartesian as well as morphological. Parametricism may be one answer—although exactly to what question remains unclear—but it’s certainly not the answer.

Christopher Alexander’s book, Notes on the Synthesis of Form, includes this diagram, intended to illustrate how design problems can have a series of linked variables that operate in independent subsystems.

Is the most effective use of parametric software simply to generate unusual forms? Architects have been deliberating on how best to use the computer ever since Ivan Sutherland invented Sketchpad (the ancestor of CAD) in 1963. Two years later, a seminal meeting on “Architecture and the Computer” took place at the Boston Architectural Center. In attendance were such luminaries as Walter Gropius, Yale’s Serge Chermayeff, the structural engineer William LeMessurier, and Marvin Minsky, the co-founder of MIT’s artificial intelligence lab. The architects imagined that computation would take over repetitive operations in the design process, but Minsky (correctly) predicted that the computer held much more in store. “We can use a computer to execute a procedure that is not just more tedious,” he said, “but more complicated than anything we can ask humans, including ourselves, to do.”

Complexity was precisely the concern of Christopher Alexander, an architect who that same year published Notes on the Synthesis of Form, a small book with an ambitious message. “My main task has been to show that there is a deep and important underlying structural correspondence between the pattern of a problem and the process of designing a physical form which answers that problem,” Alexander proclaimed. His thesis was that any design problem could be rationally broken down into overlapping subsets of functional requirements, and that these sets had a hierarchical relationship. He gave a kettle as an example, and listed 21 specific requirements that governed its design: “It must not be hard to pick up when it is hot,” “It must not corrode in steamy kitchens,” “It must not be hard to fill with water,” and so on.

Alexander’s requirements, or “misfit variables,” as he called them, follow the dictionary definition of a parameter—“a measurable factor forming one of a set that defines a system, or sets the conditions of its operation”—but his approach was parametric in a different sense than Schumacher’s. Alexander didn’t want simply to create more complex forms, he wanted to unravel the complexity of design problems.

In an appendix to the book, Alexander outlined a mathematical model that mapped the requirements of design problems. It was natural that he would turn to computation, since his dual degree from Cambridge was in mathematics as well as architecture. He and Marvin Manheim, an engineer specializing in information technology, wrote an IBM 7090 program that was published as an MIT research report titled “HIDECS 2: a computer program for the hierarchical decomposition of a set which has an associated linear graph.”
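
The flavor of that decomposition is easy to suggest in modern terms. The toy sketch below, in Python with invented kettle-style requirements, builds the interaction graph and splits it into independent subproblems; HIDECS 2 itself went further, roughly speaking, recursively partitioning each subset so as to cut as few interaction links as possible, producing Alexander’s hierarchy.

    from collections import defaultdict

    # Invented kettle-style requirements; an edge links two requirements
    # that interact and therefore ought to be solved together.
    edges = [
        ("easy to pick up when hot", "handle stays cool"),
        ("handle stays cool", "body sheds heat quickly"),
        ("does not corrode in steam", "body sheds heat quickly"),
        ("easy to fill with water", "lid opens wide"),
    ]

    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)

    def decompose(nodes, graph):
        """Split the requirement set into independent subproblems."""
        seen, parts = set(), []
        for n in nodes:
            if n in seen:
                continue
            stack, part = [n], set()
            while stack:
                cur = stack.pop()
                if cur in part:
                    continue
                part.add(cur)
                stack.extend(graph[cur] - part)
            seen |= part
            parts.append(sorted(part))
        return parts

    for subproblem in decompose(list(graph), graph):
        print(subproblem)

Here the handle, heat, and corrosion requirements cluster into one subproblem, and filling the kettle into another; each can then be designed for more or less independently.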

As a student I devoured Notes on the Synthesis of Form, and a classmate and I got hold of the program, intending to use it in our thesis projects. HIDECS 2 was written in Fortran, and I recall laboriously entering the information onto stacks of punch cards. We couldn’t get the program to run, however. Dismayed, we went back to working the old way, with soft pencils and yellow trace. I was later told—I don’t know if this is true—that HIDECS 2 simply had too many glitches.

Oddly enough, Alexander himself had serious reservations about the use of computers in architecture. He was unable to attend the Boston meeting, but he did contribute an iconoclastic essay to the proceedings. “In the present state of architectural and environmental design, almost no problem has yet been made to exhibit complexity in such a well-defined way that it actually requires the use of a computer,” he wrote. Alexander saw a real danger in architects’ fascination with computing. “The effort to state a problem in such a way that a computer can be used to solve it will distort your view of the problem. It will allow you to consider only those aspects of the problem which can be encoded—and in many cases these are the most trivial and the least relevant aspects.” This could still serve as a warning to the eager parametricists of today.

Since Alexander wrote that, a different application of the computer in architecture has emerged: building simulation. This computational tool models building performance in areas such as structure, energy, daylighting, artificial illumination, and acoustics. I asked Ali Malkawi, director of the T.C. Chan Center for Building Simulation and Energy Studies at the University of Pennsylvania, what role parametric design plays in his field. “In the building-energy-related area, parametric design is currently being used to search for energy-efficient solutions in façade design, optimal window sizing relative to lighting, and other similar applications,” he said. “It’s still very elementary and not widespread. Mostly it’s used by academics in experimental classes, as well as by some consultants.”

In his own 2004 research paper, Malkawi described how a genetic algorithm, which mimics the process of natural evolution, could be combined with computational fluid dynamics to evaluate and optimize different design alternatives with respect to thermal performance and ventilation. However, he cautions that computer-generated designs based on performance targets are still some distance in the future. “Parametric design cannot provide comprehensive solutions due to the fact that the basic physics-based algorithms integration problem is still far from being solved.”
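
The genetic-algorithm half of that idea fits in a few lines. In the sketch below (Python; the fitness function is a cartoon stand-in for anything as heavy as computational fluid dynamics, and every coefficient is invented), a population of candidate designs is evolved by selection, crossover, and mutation.

    import random

    random.seed(42)

    def fitness(genes):
        """Cartoon stand-in for a real evaluation (which would invoke a
        thermal or CFD simulation): reward daylight and fresh air,
        penalize heat loss. All coefficients are invented."""
        window_fraction, vent_area = genes
        daylight = 10 * window_fraction
        heat_loss = 6 * window_fraction ** 2 + 2 * vent_area
        fresh_air = 4 * vent_area ** 0.5
        return daylight + fresh_air - heat_loss

    def evolve(pop_size=30, generations=40, sigma=0.1):
        pop = [[random.random(), random.random()] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]                     # selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                child = [random.choice(g) for g in zip(a, b)]  # crossover
                child = [min(1.0, max(0.0, x + random.gauss(0, sigma)))
                         for x in child]                       # mutation
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("window fraction %.2f, vent area %.2f" % tuple(best))

In real applications of the kind Malkawi describes, each fitness evaluation is a full simulation run, which is exactly why the approach remains slow, elementary, and largely academic.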

What Malkawi means is that current building simulations treat environmental domains such as heating, air conditioning, ventilation, and daylighting separately, rather than as integrated wholes. Moreover, while heat and light are relatively simple to model, phenomena such as natural ventilation—a staple of “green” buildings—have scores of unpredictable, external variables, and have so far resisted precise modeling. Another limitation of today’s building performance simulations is the dearth of what Alexander called “well-defined problems”—that is, a lack of coherent data. It is easy to determine the R-value of a wall, or the reflectivity of a surface, for example, but the dynamic energy performance of an entire building is also governed by its occupants’ behavior: opening and closing windows, turning light switches on and off, raising and lowering blinds, and adjusting thermostats. Research on modeling human behavior is still in its infancy.
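
The asymmetry is easy to demonstrate. In the sketch below (Python, SI units, all numbers invented), the physics of a wall is a one-line formula, while even the crudest stochastic stand-in for occupant behavior dwarfs it and changes the answer with every run.

    import random

    random.seed(0)

    # The well-defined part: steady-state conduction through a wall.
    # SI units: area in m^2, temperature difference in kelvin,
    # R-value in m^2*K/W.
    def heat_loss_watts(area_m2, delta_t_k, r_value):
        return area_m2 * delta_t_k / r_value

    print(heat_loss_watts(20, 15, 3.5))   # about 86 W, every time

    # The ill-defined part: occupants. A crude stochastic stand-in
    # (all numbers invented) opens a window at random, swamping the
    # wall losses and changing the result from run to run.
    loads = []
    for hour in range(24):
        window_open = random.random() < 0.15
        extra = 300 if window_open else 0
        loads.append(heat_loss_watts(20, 15, 3.5) + extra)
    print(sum(loads) / 24)                # "average" load depends on behavior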

Somewhere between the vagaries of parametricism and the analytical precision of building simulation lies the Holy Grail: design informed by data gleaned from how buildings actually perform, and how people actually behave in them. This would require integrating building simulations, creating interaction between different domains, incorporating a myriad of variables, and, above all, devising a dynamic approach that accounts for the unpredictability of human behavior, both over time and between individuals.

Even if the data for such a model were available, the question remains whether the immense difficulty of solving an “ill-defined problem”—for that is what a building is—would not overwhelm the solution, and whether the required computational complexity would be manageable, let alone affordable. Don’t put away the soft pencils and yellow trace just yet.