Describing how we experience a space moment by moment and step by step is impossible even for the most well-versed of architects and planners. We are simply inundated with more information than we can consciously perceive and process. This spring, the Cloud Lab at Columbia University’s Graduate School of Architecture, Planning, and Preservation (GSAPP) and the Van Alen Institute tackled the challenge of assessing and mapping how people respond to their environment as a part of Van Alen’s Elsewhere series on wellness in the city.
Instead of the typical focus groups and surveys, however, the researchers tracked brainwaves to gauge the mental activities of nearly 100 volunteers who, in response to a call by Van Alen, navigated their way through New York. The cross-section of participants surprised the researchers. “We expected mostly architects and then … in every group there were artists and neuroscientists,” says Cloud Lab co-director Mark Collins.
The researchers, who included Cloud Lab co-director Toru Hasegawa and Columbia University neuroscientist and biomedical engineer David Jangraw, outfitted each participant with a NeuroSky MindWave Mobile headset, a relatively inexpensive brain-computer interface (BCI). Each person also carried a mobile device running a custom app that tracked their location and heading so that their brainwaves could be synced with where they were recorded.
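The article does not describe how the app aligned the two data streams; one plausible approach is to pair each timestamped EEG sample with the nearest GPS fix. The record types and field names below are illustrative assumptions, not the project’s actual formats:

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class GpsFix:
    t: float    # unix timestamp, seconds (assumed format)
    lat: float
    lon: float

def sync_eeg_to_location(eeg_samples, gps_fixes):
    """Pair each (timestamp, value) EEG sample with the nearest GPS fix.

    gps_fixes must be sorted by time, which a location log naturally is.
    """
    times = [f.t for f in gps_fixes]
    paired = []
    for t, value in eeg_samples:
        i = bisect_left(times, t)
        # Compare the fixes on either side of the insertion point and
        # keep whichever is closer in time to the EEG sample.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gps_fixes)]
        best = min(candidates, key=lambda j: abs(gps_fixes[j].t - t))
        paired.append((gps_fixes[best].lat, gps_fixes[best].lon, value))
    return paired
```

In practice a real pipeline would also interpolate between fixes and discard samples recorded during GPS dropouts, but nearest-neighbor matching captures the core idea of georeferencing each brainwave reading.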
After a day of training and experimentation with the BCIs, groups of 10 to 12 participants took predefined walks through a three-square-block area in New York’s DUMBO neighborhood in Brooklyn. The neighborhood was chosen for its “clean block structure,” Collins says, but also for its bisection by the Manhattan Bridge. The routes took participants through different urban environments, including city blocks, road intersections, under the bridge, and other urban infrastructure. “We thought this was a fun environment to set people loose in,” he says.
Using electroencephalography-based (EEG) measurements and the GPS tracking app, the research team collected more than 1 gigabyte of data over 200 walking sessions that, in theory, create a snapshot of a day in the life of the neighborhood’s mental states. Data recorded by the BCIs included the strength of the alpha, beta, and gamma waves, which are associated with different forms of brain activity. Alpha waves, for example, become prominent when one’s eyes are closed and the mind is relaxed. Complex algorithms developed by the device maker, NeuroSky, translated the brainwaves into two simple indicators: meditation and attention.
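NeuroSky’s algorithms are proprietary, so the sketch below only illustrates the underlying idea: comparing how much of the signal’s power falls into each standard EEG frequency band. The band boundaries are conventional values, and the simple dictionary spectrum is an assumption for illustration:

```python
# Conventional EEG band boundaries in Hz (approximate; definitions vary).
BANDS = {"alpha": (8.0, 13.0), "beta": (13.0, 30.0), "gamma": (30.0, 50.0)}

def relative_band_power(spectrum):
    """spectrum: dict mapping frequency (Hz) -> power at that frequency.

    Returns each band's share of the total power across the three bands,
    a crude stand-in for 'how alpha-dominated is this moment'.
    """
    totals = {name: 0.0 for name in BANDS}
    for freq, power in spectrum.items():
        for name, (lo, hi) in BANDS.items():
            if lo <= freq < hi:
                totals[name] += power
    grand = sum(totals.values()) or 1.0  # avoid division by zero
    return {name: p / grand for name, p in totals.items()}
```

A reading dominated by alpha power would loosely correspond to the relaxed, “meditative” end of the scale described below, while strong beta and gamma shares align with focused attention.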
Presenting the data in a manner that retained its spatial qualities required the researchers to develop their own software for visualization. At a public follow-up presentation in May, the team presented the simplified data on a 3D map of DUMBO. Areas in cyan indicate places in which participants were in a more meditative and relaxed state, while areas in red indicate places where participants had a more focused or heightened sense of awareness.
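The team’s visualization software is not public; as a hedged sketch, the cyan-to-red coloring could be produced by interpolating between the two states. The 0–100 score range follows NeuroSky’s eSense convention, but the blending itself is an illustrative assumption:

```python
def state_color(attention, meditation):
    """Blend from cyan (fully meditative) to red (fully attentive).

    attention, meditation: 0-100 scores, per consumer-BCI convention.
    Returns an (r, g, b) tuple with 0-255 channels.
    """
    total = attention + meditation
    a = attention / total if total else 0.5  # attention's share of the mix
    # Cyan is (0, 255, 255) and red is (255, 0, 0); interpolate per channel.
    r = round(255 * a)
    g = round(255 * (1 - a))
    b = round(255 * (1 - a))
    return (r, g, b)
```

Each synced EEG sample could then be plotted at its GPS coordinates in this color, producing the kind of red-and-cyan map the team showed in May.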
“We saw a lot of interesting things,” Collins says. Crosshairs of red were common at road intersections, where people would “come up for air” from their meditative states and direct their attention to navigating the crosswalk. Underneath the Manhattan Bridge overpass and alongside a power transmission station were other common red zones. In the midspan of blocks, people would often enter a meditative state; Collins notes that while the city was planned centuries ago, these stretches now afford ad hoc “cellphone time,” during which everyday pedestrians can become lost in conversation. “For us, the purpose is not to say that this is a good block, this is a bad block,” Collins says. “But you [begin] to understand what kind of mental processing we are bringing to navigating a city.”
Collins acknowledges that interpreting the brainwave data required some subjectivity and that measurements are not always consistent. Jostling, talking, or even the amount of sweat on one’s brow could affect the headset’s readings, and one person’s tense brainwaves could be another person’s neutral state. The data is currently being evaluated by the project’s neuroscience collaborators, who are more accustomed to processing large amounts of data gathered from a small number of subjects in a controlled laboratory setting than small volumes of data from a large number of participants released “out in the wild,” Collins says. If the data and the code used to interpret it are validated, the researchers hope to release both, with the data anonymized, to the public.
Deploying BCIs to large groups of people outside the laboratory has great potential in architecture and planning, Collins says. Next, using research-grade equipment (BCI technology is advancing so quickly that the devices used in this project were already obsolete by its end), he hopes to track the P300 waveform. The P300 essentially serves as a tag for when one’s brain seeks and recognizes something at the subconscious level, triggering an “aha” or “oh-no” moment, Collins says. A vehicle equipped with a BCI, for example, could track your input signals and “be ready to take control or respond to you in some novel way.” Architects and planners could employ the technology during post-occupancy walkthroughs or preliminary design presentations.
Architects, planners, and researchers could begin mapping where “space is actually responding to the availability of the data in real time,” Collins says. “It’s more than just traffic lights. As more of the city becomes mutable or programmable, the benefits start to become more obvious.”