In his 1984 science-fiction novel “Neuromancer,” author William Gibson envisioned a future world manipulated by cybernetically augmented mercenaries. His characters harness the power of digital networks by connecting, via brain implants or head-mounted electrodes, to what he presciently labeled “the matrix.” Gibson’s bold vision helped launch the cyberpunk genre and influenced a variety of dystopian films such as The Matrix and Johnny Mnemonic. Although these bleak depictions of the future seem to have little in common with the realities of contemporary society, brain-computer interfaces are no longer the stuff of fiction. Rather, they’re at work in many applications today.
Specifically, electroencephalography (EEG) devices measure electrical activity in the brain by detecting the small voltage fluctuations produced as current flows between neurons. First tested on animals in the late 19th century, EEG is now routinely used in the study of epilepsy and brain tumors in humans. Recently, companies such as NeuroSky and Emotiv have made the technology accessible to consumers. Fortunately for the user, these devices measure brain activity through the scalp; no implants are required.
The commercial availability of these interfaces marks a shift from the primarily clinical use of EEG technology to its application in a variety of fields, including design. For example, digital artist and musician Andreas Borg empowered users to create collections of graphic patterns with their brain waves via the Alhambra Mandala interface, shown in the video above. In their project Trataka, researchers at the University of Arts and Industrial Design in Linz, Austria, enabled people to modify the intensity of a flame using their minds. A team of National Instruments researchers even demonstrated the ability to control mobile robots using brain waves. And the Cloud Lab at Columbia University’s Graduate School of Architecture, Planning and Preservation (GSAPP), directed by professors Toru Hasegawa and Mark Collins, regularly employs EEG technology in its research into spatial problem solving and environmental mapping, pursuits they call “brain hacking.”
At the 2015 National Conference on the Beginning Design Student held in February at the University of Houston in Texas, Meg Jackson and Michael Gonzales, both professors in the university’s Hines College of Architecture, presented their own compelling forays into the adoption of EEG technology to control architectural interfaces. In their talk, “Prototypes + Craft: A Hybrid Approach to Beginning Interactive Design,” the pair discussed the design and construction of various interactive interfaces by their graduate and undergraduate students. Through what they call the Mind Manipulator project, a small group of those students created a dynamic architectural surface that changes in form based on the brain activity of the person interacting with it. The endeavor was inspired by Gonzales’ graduate work at GSAPP, where he learned to manipulate virtual geometry and physical prototypes using EEG technology. An exploration of Jackson and Gonzales’ recent seminar, which undertook the Mind Manipulator project, reveals insights about the further implementation and future possibilities of EEG in architecture.
First, students developed simple responsive prototypes using an Arduino microcontroller. This hands-on exercise was designed to help them better understand the fundamental concepts of sensing, control, and feedback prior to tackling the more complex strategies of interaction and mechanics. Early prototypes, called puppet models, let students study mechanics and materiality before troubleshooting sensor technologies. The students then created studies for architectural panels inspired by the fluid shapes of concrete cast in fabric molds. The panels comprised stretched fabric with an underlying string-based grid system. By varying the tension in the strings, the students were able to manipulate the panel’s surface geometry, revealing an undulating, CNC-milled pattern.
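Jackson and Gonzales don’t detail the hardware of those first exercises, but a responsive prototype of this kind typically comes down to a short sensing-and-actuation loop on the Arduino. The sketch below is a minimal, hypothetical example only; the photoresistor sensor and the pin assignments are assumptions, not a record of the students’ setup.

    // Hypothetical first prototype: a servo responds to an analog sensor.
    // The sensor (a photoresistor divider on A0) and the pin numbers are
    // assumed; the students' actual hardware is not documented here.
    #include <Servo.h>

    const int SENSOR_PIN = A0;   // analog input
    const int SERVO_PIN  = 9;    // PWM-capable pin for the servo signal
    Servo actuator;

    void setup() {
      actuator.attach(SERVO_PIN);
      Serial.begin(9600);        // feedback: echo readings to the operator
    }

    void loop() {
      int reading = analogRead(SENSOR_PIN);          // sensing: 0 to 1023
      int angle   = map(reading, 0, 1023, 0, 180);   // control: scale to servo range
      actuator.write(angle);                         // actuation
      Serial.println(reading);                       // feedback for calibration
      delay(50);                                     // modest update rate
    }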


The students then needed to transform the system into one that could relay information between the user’s brain and the surrounding physical environment. Jackson and Gonzales saw the tension-based panel as a natural conceptual fit for EEG because the technology can detect the brain’s attentive and meditative states. The students used NeuroSky’s Mindset, which communicates with a computer via Bluetooth. Jackson and Gonzales created a program called Nematode, which allows the Mindset to exchange information with Grasshopper, a plug-in for Rhinoceros. Nematode acquires raw EEG data and translates the information into integers in Grasshopper, which in turn sends signals to servos that the students had connected to the panel.
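Exactly how Grasshopper’s signals reach the servos isn’t spelled out, and the students’ own code isn’t reproduced here. As a rough illustration of the final link in that chain, the sketch below assumes a simple scheme in which the host computer, standing in for the Nematode and Grasshopper pipeline, writes one attention value per line (0 to 100) over USB serial to an Arduino driving a single tensioning servo; the wiring, the angle range, and the line protocol are all assumptions.

    // Illustrative microcontroller-side sketch, not the students' actual code.
    // Assumes the host writes one attention value per line, 0 to 100, over serial.
    #include <Servo.h>

    const int SERVO_PIN   = 9;    // assumed wiring
    const int SLACK_ANGLE = 0;    // fabric fully relaxed
    const int TAUT_ANGLE  = 120;  // maximum pull on the tension strings (assumed)
    Servo tensioner;

    void setup() {
      tensioner.attach(SERVO_PIN);
      Serial.begin(9600);
      tensioner.write(SLACK_ANGLE);
    }

    void loop() {
      if (Serial.available() > 0) {
        String line = Serial.readStringUntil('\n');        // one value per line
        int attention = constrain(line.toInt(), 0, 100);   // clamp to the 0-100 range
        // Higher attention pulls the strings harder, revealing more of the relief.
        int angle = map(attention, 0, 100, SLACK_ANGLE, TAUT_ANGLE);
        tensioner.write(angle);
      }
    }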
“Through a series of interactive brain-training games and physical models, the team was able to calibrate their initial prototype to allow seamless communication between a participant’s mental activity and the servos controlling the tension of the fabric,” Jackson explained. “As a participant enters into an attentive state, servos pull on the fabric revealing the patterned relief. The more attentive a participant is, the greater the resolution of the pattern. Conversely, as participants enter into a meditative state, the fabric is relaxed, returning to its unstressed state.”
During their initial trials with the Mindset, students had to train their minds to attain acutely attentive and meditative states. They learned, for example, that focusing on particular patterns helped them become attentive, though meditative states proved harder to reach. Getting the fabric panel to react physically required a phase of trial-and-error calibration with both mental states. “There is a gradient between mental states,” Jackson said. “It is not as simple as directly relating to the set of numbers. We and the team had to discuss the actual data, its translation into integers, and the subtleness of the transitions of the mental state.” The grid below shows the movement of the fabric from a meditative state to an attentive one.
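That kind of trial-and-error tuning is easier to picture in code. The sketch below shows one plausible calibration layer, smoothing the noisy attention and meditation readings and ignoring small changes so the fabric doesn’t jitter during subtle transitions; the two-value line format, the smoothing factor, and the thresholds are invented for illustration, not the team’s actual translation.

    // Hypothetical calibration layer. The raw attention and meditation integers
    // are noisy and the transition between states is gradual, so the combined
    // drive value is smoothed and small servo moves are suppressed.
    #include <Servo.h>

    const int SERVO_PIN   = 9;
    const int SLACK_ANGLE = 0;
    const int TAUT_ANGLE  = 120;
    const float ALPHA     = 0.1;   // smoothing factor: smaller = slower, steadier response
    const int DEAD_BAND   = 4;     // ignore servo moves smaller than this (degrees)

    Servo tensioner;
    float smoothedDrive = 0.0;     // filtered balance of attention vs. meditation
    int lastAngle = 0;

    void setup() {
      tensioner.attach(SERVO_PIN);
      Serial.begin(9600);
    }

    void loop() {
      if (Serial.available() > 0) {
        // Assumed line format from the host: "<attention> <meditation>", each 0 to 100.
        String line = Serial.readStringUntil('\n');
        int split = line.indexOf(' ');
        if (split <= 0) return;    // malformed line; wait for the next one
        int attention  = constrain(line.substring(0, split).toInt(), 0, 100);
        int meditation = constrain(line.substring(split + 1).toInt(), 0, 100);

        // Positive drive = more attentive than meditative; negative = the reverse.
        float drive = attention - meditation;               // ranges from -100 to 100
        smoothedDrive += ALPHA * (drive - smoothedDrive);   // exponential moving average

        // Map the gradient between states onto the servo's travel,
        // moving only when the change is large enough to matter.
        int angle = map((int)smoothedDrive, -100, 100, SLACK_ANGLE, TAUT_ANGLE);
        if (abs(angle - lastAngle) > DEAD_BAND) {
          tensioner.write(angle);
          lastAngle = angle;
        }
      }
    }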

That the students were able to reshape architectural surfaces by thought alone is surprising indeed, and it raises questions about the future use of EEG technology in the design, construction, and occupation of buildings. For example, architects could test parametric design iterations in physical models, even during client meetings, without needing to touch the models directly. Contractors could manipulate materials and assemblies on a job site, especially in hazardous circumstances, from a safe distance. And building envelopes could even be designed to communicate collective levels of attentive or meditative activity among their occupants, becoming accentuated during the day and relaxed at night.
While such notions may be compelling, albeit still a long way from realization, Jackson and Gonzales demonstrated by far the most inspiring potential of EEG technology in architecture: One student member of the Mind Manipulator team has very limited physical mobility. Wheelchair-bound and barely able to move his hands to control a computer mouse, he found the thought-responsive panel interface to be a profound experience, one that let him manipulate the physical environment at what is, for him, an unprecedented scale and level of control.
Technologies like EEG exhibit obvious potential in universal design, empowering users to overcome physical impairments in their engagement with architecture—for example, opening doors and windows, moving lifts, or powering automated walkways with only their minds. Such a vision suggests that brain-computer interfaces may not result in the dire circumstances predicted by dystopian authors, but instead a more humane future. Indeed, through the Mind Manipulator project, we see that the truth is not only stranger than fiction—it’s also brighter.
Blaine Brownell, AIA, is a regularly featured columnist whose stories appear on this website each month. His views and conclusions are not necessarily those of ARCHITECT magazine or the American Institute of Architects.