
While I applaud the basic idea of the University of Minnesota’s new one-year research program, described in the January issue of ARCHITECT, I have a fundamental problem with one aspect of it, or rather with its implications: as far as I can tell, it emphasizes only knowledge that is science- and technology-based. In integrating practice, research, and learning, and offering a shortcut to licensing after a B.Arch., it also points to the ongoing erosion of what I consider one of the foundations not only of our academic system but of American society in general: the notion of a four-year, liberal arts undergraduate degree program.
I am one of those who would have benefited from this new program, which lets students engage in directed research, practice, and community activities in a manner that gains them IDP credit, while letting them take licensing exams along the way, so that they can practice as soon as they graduate. When I graduated with an M.Arch., I taught and worked for a licensed architect for close to two years, but my remaining experience was with an architect who was then not licensed (the husband-and-wife principals took their exams shortly thereafter). When I went to register for my exam, I found out that California had just changed its laws to disallow academic experience. I have never become licensed.
I also teach on occasion at the University of Cincinnati, whose co-op program is famous (or notorious) for giving students oodles of so-called real-life experience even before they graduate. The result is well-prepared students who receive strong reviews from practitioners but who, in my opinion, do not always know what to do with all their technical skills.
One of the things that sets the American education system apart—though many other countries are now copying it—is the notion that many students spend four years winding their way through academic programs in which they try out various forms of knowledge, learn a great deal of stuff that has no practical purpose, and earn degrees in areas that might not be directly related to their eventual jobs. I believe this is good. What they are learning is how to be informed citizens. They come to understand, if the college is any good, where we are today, where we have come from, and where we are going. They read literature, find out about history or economics, dabble in art or astrophysics, and in general receive a broad overview of what our society knows. Then they either go out into the real world and learn by doing, or go on to graduate degrees to acquire specialized knowledge.
Of course that is a romantic picture, and one that is also subject to constant erosion as we try to make our educational system, whose costs have risen out of all proportion to everything else except healthcare, more affordable. Yet I believe that core notion is worth holding onto, and for that reason I have long felt that architecture, as a specialized discipline, should be taught only at the graduate level. Let students build up a strong base of knowledge, analytic abilities, and experiences, and then have them learn how to translate all of that into the world of architecture. Architecture as a way of knowing your world through its physical, human-made structures, on the other hand, is an excellent undergraduate major. It should function like English, history, or any other such broad form of knowledge: as a way of learning about the world you inhabit through a particular discipline.
It is a quaint and perhaps conservative model. It is not efficient, and as the article I reference points out, it means it takes a great deal of time to reach the point where you are allowed to practice. I think that is fine, though I agree that the whole process should be streamlined. I think the larger problem is that most architects do not make as much money as doctors or lawyers when they graduate, which means that the rewards awaiting architecture students at the end of a long march toward licensing are not sufficient to retain many of them. This is part of the reason why architecture is still a lily-white and middle-class profession. Trying to solve that by making the education system shorter and more efficient is putting the cart before the horse. You could make the argument that directed research allows for a different kind of knowledge, one that does combine theory and real-world structure. I remain skeptical. Not everything is data. Some things are just beautiful, strange, or difficult to understand, but fundamental to what makes up our shared culture. That is where architecture should (also) have its foundation.
Aaron Betsky is a regularly featured columnist whose stories appear on this website each week. His views and conclusions are not necessarily those of ARCHITECT magazine or of the American Institute of Architects.