In 2009, Wired magazine editors Gary Wolf and Kevin Kelly coined the term “the quantified self” to describe the emerging trend in which people record their own biometric and other data—a phenomenon they called “self-knowledge through numbers.” Health-monitoring devices for humans have since proliferated, but architecture has also gotten into the game.
In 2016, the International Facility Management Association’s Erik Jaspers and Eric Teicholz called the environmental tracking phenomenon “the quantified building.” “Imagine if every asset of your building—fans, doors, furniture, coffee makers, windows—contained a tiny embedded sensor that gathered simple data to determine the actual behavior of those assets,” they wrote in that year’s March/April issue of Facility Management Journal. “Imagine having systems in place that could capture this data and use it in real-time to adjust behavior and signal human intervention as needed. Imagine being able to analyze the accumulation of this data over time to assess structural improvements and optimize operations.”
Sensors have been rapidly proliferating in buildings for functions such as resource optimization (think occupancy-based lighting), problem detection (such as water leak alerts), and physical security. Early installations performed one or two functions and recorded limited information; their capabilities have since expanded considerably. The more types of environmental data collected, and the finer the spatial and temporal resolution of that data, the more comprehensive a building lifelog one can construct.
Boston-based L&M Instruments envisions such a holistic record, offering one of the most sophisticated multisensor building monitoring solutions today. The Iris 8-in-1 Environmental Monitor tracks temperature, relative humidity, illuminance (lux), color spectrum, correlated color temperature, flicker, circadian rhythm, and motion. No other device on the market combines these sensing technologies, according to L&M co-founder John Waszak. With its broad capabilities, the Iris sensor is representative of the next chapter in quantified buildings.
L&M’s sensor technology has monitored UVA and UVB light exposure in museum collections for years. Subsequent requests by architects led to the addition of more monitoring capabilities, including color temperature, humidity, ambient temperature, and melanopic lux—a measure of the biological influence of light on humans (though some lighting experts dispute the metric’s credibility). L&M also incorporated a motion sensor, because the presence and movement of people influence humidity and temperature.
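For readers curious how such a figure is derived in principle, the sketch below weights a measured spectral power distribution by the melanopic action spectrum and scales the result to a lux-like value. The sensitivity curve and the conversion constant are left as inputs rather than hard-coded, since the standardized values are tabulated in CIE S 026; this is an illustration of the calculation, not L&M’s implementation.

```python
import numpy as np

def melanopic_weighted_irradiance(wavelengths_nm, spectral_irradiance_w_m2_nm, s_mel):
    """Integrate a measured spectrum weighted by the melanopic action spectrum.

    wavelengths_nm            -- sampled wavelengths (e.g., 380-780 nm)
    spectral_irradiance_w_m2_nm -- measured spectral irradiance at each wavelength
    s_mel                     -- melanopic sensitivity at each wavelength
                                 (placeholder; tabulated in CIE S 026)
    """
    return np.trapz(spectral_irradiance_w_m2_nm * s_mel, wavelengths_nm)

def melanopic_lux(wavelengths_nm, spectral_irradiance_w_m2_nm, s_mel, lux_per_w_m2):
    """Convert the weighted irradiance to a melanopic-lux figure.

    lux_per_w_m2 is the conversion constant defined by the chosen convention
    (e.g., CIE melanopic equivalent daylight illuminance); it is supplied by
    the caller rather than assumed here.
    """
    weighted = melanopic_weighted_irradiance(
        wavelengths_nm, spectral_irradiance_w_m2_nm, s_mel)
    return weighted * lux_per_w_m2
```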
The combination of these sensors resulted in the Iris. The module interfaces with Microsoft’s Azure cloud, which offers “highly reliable connectivity as well as scalability,” Waszak says. The cloud connectivity allows online access to data, charts, and alerts from anywhere. Setting up the Iris sensor to start collecting data is straightforward, says Chengde Wu, an associate professor at the Iowa State University College of Design and a former research fellow at the University of North Carolina at Charlotte School of Architecture, which I currently direct.
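L&M has not published the details of the Iris firmware, but the general pattern for pushing readings to the cloud is well established. The sketch below uses Microsoft’s azure-iot-device Python SDK to send periodic telemetry to an Azure IoT Hub; the connection string, field names, and sampling interval are hypothetical.

```python
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

# Hypothetical connection string for a registered IoT Hub device identity.
CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=iris-01;SharedAccessKey=<key>"

def read_sensors():
    """Placeholder for polling the hardware; the field names are illustrative."""
    return {
        "temperature_c": 21.4,
        "relative_humidity_pct": 43.0,
        "illuminance_lux": 512.0,
        "cct_k": 4100,
        "motion": False,
    }

def main():
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        while True:
            payload = read_sensors()
            payload["timestamp"] = time.time()
            # Each reading is sent as a JSON telemetry message to the hub,
            # where dashboards and alert rules can pick it up.
            client.send_message(Message(json.dumps(payload)))
            time.sleep(60)  # one reading per minute
    finally:
        client.shutdown()

if __name__ == "__main__":
    main()
```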
Wu tested the Iris platform, provided courtesy of L&M, as part of his research in UNC Charlotte’s Integrated Design Research Lab. “[S]ending data to the cloud was convenient for me to check the operation of the Iris without physically accessing the sensor,” he says. But he sees the sensor’s primary benefits as its ability to monitor and log spectral light intensity and melanopic lux, as well as its integrated design.
The multisensor design does have limitations, however. “Depending on the value of what is being protected by monitoring the environment, a basic data logger might fit the bill,” Waszak says. “These devices are smaller, run battery operated for months, and are more easily mounted around an environment, particularly in small spaces.”
Still, Wu wishes the Iris included more sensors, namely ones to monitor carbon dioxide and volatile organic compound (VOC) levels. “These sensors could provide data to ventilate spaces based on occupancy rates dynamically,” he says.
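The control logic Wu describes can be sketched in a few lines: map a carbon dioxide reading to an outdoor-air damper position between a minimum and a maximum setting. The thresholds and the linear mapping below are illustrative assumptions, not values drawn from the Iris or from Wu’s research.

```python
# Sketch of demand-controlled ventilation: translate a CO2 reading into an
# outdoor-air damper position. All numbers here are illustrative.

CO2_LOW_PPM = 600    # near outdoor-air levels; damper stays at its minimum
CO2_HIGH_PPM = 1000  # common comfort guideline; damper opens fully

def damper_position(co2_ppm, min_open=0.2, max_open=1.0):
    """Linearly interpolate the damper opening between the two CO2 setpoints."""
    if co2_ppm <= CO2_LOW_PPM:
        return min_open
    if co2_ppm >= CO2_HIGH_PPM:
        return max_open
    fraction = (co2_ppm - CO2_LOW_PPM) / (CO2_HIGH_PPM - CO2_LOW_PPM)
    return min_open + fraction * (max_open - min_open)

# Example: a partially occupied meeting room reading 800 ppm
# gets a damper opened to 60 percent.
print(damper_position(800))  # 0.6
```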
L&M has developed UV-C monitoring technology in its Apollo sensor line and could incorporate UV-C detection in future versions of the Iris. UV-C illumination is increasingly used to kill pathogens such as the SARS-CoV-2 virus, which causes COVID-19, but it must be controlled to ensure it does not pose a health risk to occupants. A combination Apollo–Iris sensor “would provide an always-on, IoT (internet of things) solution for monitoring UV-C dose in and throughout a building space,” Waszak says. Furthermore, the motion detection function of the Iris can monitor occupancy rates and manage alerts for an interior space.
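A dose-monitoring loop of the kind Waszak describes might look like the sketch below, which integrates measured UV-C irradiance over time and raises an alert when a configured exposure limit is reached. The reading function and the limit are placeholders; a real deployment would take its limit from the applicable occupational exposure guideline rather than from this sketch.

```python
import time

# Sketch of cumulative UV-C dose monitoring. The sensor reading and the
# exposure limit below are placeholders, not L&M specifications.

DOSE_LIMIT_MJ_PER_CM2 = 3.0  # placeholder daily exposure limit
SAMPLE_INTERVAL_S = 1.0

def read_uvc_irradiance_mw_per_cm2():
    """Placeholder for the UV-C irradiance reading at the monitored location."""
    return 0.0005

def monitor_dose():
    cumulative_dose = 0.0  # mJ/cm^2 accumulated since the counter was reset
    while True:
        irradiance = read_uvc_irradiance_mw_per_cm2()
        # mW/cm^2 multiplied by seconds yields mJ/cm^2.
        cumulative_dose += irradiance * SAMPLE_INTERVAL_S
        if cumulative_dose >= DOSE_LIMIT_MJ_PER_CM2:
            print("ALERT: UV-C dose limit reached for this space")
            break
        time.sleep(SAMPLE_INTERVAL_S)
```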
The Iris 8-in-1 Environmental Monitor offers a glimpse into the full-spectrum future of the quantified built environment. According to Wu, sensors are the cornerstones of smart buildings. “As sensors become more and more affordable, accessible, and accurate, I envision that smart buildings will become the new standard in the near future,” he says. “The future is more, more, more,” Waszak says. “More sensing, more analytics, more optimization.”
The potential of sensors goes beyond improving indoor environmental quality and conserving energy. Facility managers might wish to know the cost to operate a building per day, per occupant. “Look for security cameras to add headcount tracking to understand how many people are in the office,” Waszak says, or to better understand and manage HVAC costs based on occupancy—which he describes as “the HVAC equivalent of motion-driven lighting.”
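The cost-per-occupant figure itself is simple arithmetic once headcount data exists, as in this sketch with invented numbers: divide a day’s operating cost by the average occupancy derived from camera headcounts.

```python
# Illustrative arithmetic for cost per occupant per day; all figures invented.
daily_operating_cost_usd = 4_200.0  # energy, cleaning, maintenance, and so on
hourly_headcounts = [12, 85, 140, 160, 150, 95, 30]  # from camera headcounts

average_occupancy = sum(hourly_headcounts) / len(hourly_headcounts)  # 96 people
cost_per_occupant_per_day = daily_operating_cost_usd / average_occupancy

print(f"{cost_per_occupant_per_day:.2f} USD per occupant per day")  # 43.75
```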
The ability to monitor airflow, occupancy, and germicidal light in real time might provide the necessary reassurance for more people to return confidently to workplaces, schools, and other common indoor spaces during the pandemic.
The views and conclusions from this author are not necessarily those of ARCHITECT magazine or of The American Institute of Architects.