October 21, 2015—the future date that Marty McFly traveled to in Back to the Future II—was almost a month ago.
What did Back to the Future II predict and what did it get wrong? While the flying cars and ubiquitous fax machines didn’t quite turn out, fingerprint sensors and video chats definitely did. But one scene isn’t too far from our current future: Marty walks through the town square, bewildered by what his town looks like 30 years in the future and he comes across Jaws 19 on the marquee of the movie theater (the same theater from 1955 and 1985). The movie is playing, and a holographic shark leaps from the building to eat Marty, who is thoroughly scared.
How close are we to this reality, where buildings and people interact through immersive, sensorial environmental features?
This technology, spanning light and sound components as well as interactive hardware and software, increasingly appears in installations, exhibitions, branded environments, entertainment venues, and elsewhere. Like the Jaws 19 shark, it brings spaces closer to us through physical and sensorial interactions. These spaces rely less on traditional architectural effect and more on actively evolving engagement with space, through lights, sounds, smells, touchscreens, interactive content embedded in buildings, and even the integration of social media.
Could interactive technologies and the lessons of immersive spaces begin to offer new ways for architecture to operate in culture writ large? As this technology-architecture combination evolves, will it offer new forms of collectivity through design?
Starting December 1 in New York City, the 2015 holiday season will kick off with an installation at the Brookfield Place Winter Garden—an Archigram fantasy applied to a classic piece of '80s tropical historicism. Cesar Pelli designed the Crystal Palace–like space, and Rockwell Lab is installing Luminaries, an interactive light sculpture composed of 650 suspended cubes that will float among the palm trees. The LED-filled cubes, or "luminoids," as they are called in-house, will be mostly ambient until visitors take control. Visitors "make a wish" by interacting with touch sensors embedded in three Corian wishing stations, sending pulses of color change through the installation above. The Corian touchscreens embed technology in a traditional surface material, augmenting the physical experience into an interactive one. Designers can also control the cubes individually to program a sequence, making for a more choreographed performance.
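The mechanics of a "wish" rippling outward from a touch station can be sketched in a few lines of code. The sketch below is hypothetical (the cube count, ambient color, and function names are all assumptions, not Rockwell Lab's actual software), but it shows the basic idea: a touch event seeds a pulse, and each animation frame lights the cubes the pulse front has reached.

```python
import colorsys

def propagate_pulse(num_cubes, origin, step):
    """Return one animation frame: an RGB color per cube.

    A 'wish' at cube `origin` sends a hue pulse outward; cubes
    farther from the origin light up at later steps. Hypothetical
    sketch, not Rockwell Lab's actual control code.
    """
    ambient = (10, 10, 10)  # dim resting state
    frame = []
    for i in range(num_cubes):
        if abs(i - origin) == step:  # pulse front reaches this cube
            r, g, b = colorsys.hsv_to_rgb((step / num_cubes) % 1.0, 1.0, 1.0)
            frame.append((int(r * 255), int(g * 255), int(b * 255)))
        else:
            frame.append(ambient)
    return frame

# One frame of an 8-cube strand, pulse started at cube 3, two steps in:
print(propagate_pulse(8, 3, 2))
```

Rendering successive `step` values in sequence would animate the pulse traveling away from the wishing station, and summing frames from several origins would let multiple visitors' wishes overlap, which is how the installation invites people to watch and build patterns together.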
The installation is designed to be downtown's version of a holiday spectacle in the vein of Rockefeller Center's tree-lighting, but Rockwell Lab hopes it will be an ongoing gathering place through the holidays. As users send their "wishes" through the cubes, they will interact with the environment and also with each other—they can watch others control the panels and work together to create new patterns. "Rockwell Lab is about using interior space to bring people together and ask, 'How do people connect in space?'" said Rockwell Lab studio leader Melissa Hoffman.
Rockwell Lab has four architects, three strategists, and over 20 tech people who are working to blur the physical and digital in a myriad of situations. “In our projects, content lives in a space. We think of it often as live, digital wallpaper. It is architectural.”
Inside the Rockwell Lab at Union Square, New York, small-scale prototypes are scattered around a studio-like space. It is an ongoing physical experiment with mock-ups and prototypes littering the area, from color-changing glass to LED screens flashing GIFs. These experiments linger and offer a glimpse into the lab’s iterative design process. On one table there are tiny projectors that kiss scale models with light; elsewhere sits an Oculus Rift device that allows designers and clients to really see what the experience of their proposed spaces will be like. “We use it internally to understand, but also to have the client understand,” said Hoffman.
Design through Auralization
At Arup's SoundLab, engineers are simulating sound the way the Oculus Rift simulates visuals. The SoundLab is a high-fidelity (literally), spaceship-like space with cutting-edge sound, visualization, and 3-D-modeling technology integrated into a presentation room that would make most corporate executives proud. Its "auralization" system allows users to hear the acoustic qualities of an imagined space in real time, through a 3-D simulation. For example, I was treated to a video tour of Brooklyn's new National Sawdust performance venue. The tour twisted and turned and ended on the balcony, where I could turn my body in real space while the sound stayed fixed in the virtual space: the "real me" was moving, but not the virtual stage. It is the sound equivalent of the Oculus.
While the simulation is a great tool for showing off the new building, it is also very useful on the front end for making design decisions.
The SoundLab was conceived in 2001 as the latest step in the evolution of sound-visualization technology, which had been developing for nearly 50 years. The internal metrics that acoustic engineers used were almost incomprehensible to outsiders. Visualization made it possible to track waves and their reflections, but it still didn't accurately represent the sound of a space to clients. The "auralization" was built using music recorded in an anechoic (reflection-free) chamber at Bell Labs in New Jersey.
This reverberation-free music is then digitally combined with acoustic data, either collected on-site using a "pulse" from an omnidirectional loudspeaker and microphone, or extracted from a 3-D model supplied by the architects. The result is an acoustic virtual reality: a map of how sound travels through the space. These audible acoustic sceneries let designers make architectural decisions based on qualitative factors rather than prescriptive objectives.
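The "digital combination" described above is, at its core, a convolution: every sample of the dry anechoic recording gets a scaled copy of the room's impulse response laid on top of it, so the listener hears the performance "inside" the measured or modeled room. A minimal sketch, with toy numbers standing in for real audio (the signals here are invented for illustration, not Arup's data):

```python
def convolve(dry, impulse_response):
    """Discrete convolution: place a scaled copy of the room's
    impulse response at every sample of the dry signal."""
    out = [0.0] * (len(dry) + len(impulse_response) - 1)
    for i, d in enumerate(dry):
        for j, h in enumerate(impulse_response):
            out[i + j] += d * h
    return out

# A dry (anechoic) click train, and a toy room response: direct
# sound followed by two decaying reflections. In practice the
# response comes from an on-site "pulse" measurement or a 3-D model.
dry = [1.0, 0.0, 0.5]
room = [1.0, 0.0, 0.3, 0.0, 0.1]

wet = convolve(dry, room)
print(wet)  # [1.0, 0.0, 0.8, 0.0, 0.25, 0.0, 0.05]
```

Swapping in a different impulse response, say one ray-traced from an alternate balcony design, re-renders the same performance in the new room, which is what lets clients compare design options by ear rather than by metric.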
Building as Instrument
Brooklyn design studio Bureau V and Arup SoundLab worked closely on National Sawdust, a new music venue in Williamsburg. From the outset, Sawdust was the brainchild of attorney, organist, and philanthropist Kevin Dolan, who had a specific vision for a space that would accommodate a variety of types of live music, without compromising any. In addition, he wanted to make the space a forum for performance, recording, broadcast, and experimentation in composition—a tall task for the design team.
“[Dolan] could say what he liked, but he couldn’t design it or talk about it,” explained Arup’s head of acoustics, Raj Patel. “To get the intimate experience he wanted, we had to have the right reflection patterns.” In the end, SoundLab technology let them fine-tune the performance space of National Sawdust so that clarity, loudness, intimacy, reverberation, envelopment, and timbre could be adjusted by the artists in sound check. Dolan got what he wanted, which is a space that can be altered for both acoustic and electric performance without physically transforming the visuals in the space.
The vision plays out as an intimate venue that has the ability to shift for different exceptional sound qualities, but does not change its appearance; the architecture is the acoustics. “At the SoundLab, we see the relationship of architecture and acoustic engineering as seamless,” said Patel. “For National Sawdust, we really wanted to think of the venue as an instrument that could be tuned like any other.”
National Sawdust is located in an old sawdust factory, where a large, brick industrial structure sits over the L subway line. The performance space is acoustically isolated from the outside to keep all vibrations out and minimize background noise. Arup helped develop a box-in-a-box construction suspended on spring isolators so that there is no shared structure between the two boxes. The rectangular space was modeled in the SoundLab to achieve ideal proportions for the types of performances Dolan requested; Dolan was in the SoundLab facility at Arup's Financial District office from the start.
The interior box is a large, steel structure with inset laser-cut aluminum panels covered in fabric—a system designed to let sound through to the CMU wall behind. "It was important that we did not lose sound as it passed through the first layer," said Matthew Mahon, a senior consultant on acoustics and audiovisual at Arup. In between the two walls is a curtain system that can be tuned for specific performances. For instance, jazz typically requires less reverberation and a drier sound, while chamber music usually requires more sound reflection and thus a brighter, enveloping sound.
“As the audience passes through the familiar, rough, post-industrial exterior, the space reveals a pristine, jewel-like volume formed by a sculptural composite skin of patterned, perforated metal and fabric,” said Bureau V principal Peter Zuspan. “This acoustically transparent, but visually translucent skin unifies the space by collapsing the variable acoustic systems, vibration isolation system, and audio-video infrastructure into one scenographic element, thus eliding technology and aesthetics into a seamless experience for a wide variety of repertoire.”
Each wall of the interior box has a curtain that can be adjusted by hand with a pulley system during sound check, fulfilling what Patel described as the “venue-as-instrument” concept. The upper balcony also serves to increase the reflection and reverberation at the top corners of the performance space. A second series of six curtains wraps the back of the balcony space, allowing for even more control. SoundLab designed several settings that can be used depending on the type of music.
Bureau V and SoundLab were able to convey the qualitative experience of being in different parts of the space. On paper, two different design schemes might look very similar, but in actuality, they could have very different experiential properties.
Small nuances in the shape of the room, the materials, and the proportions of the room or variations such as balconies make for huge differences in sound. The auralization helped Arup and Bureau V create a space that can morph into whatever the artist needs.
Future Cities Lab
If National Sawdust is a classical instrument that can be tuned, then Future Cities Lab is an open mike night mixed with a drum machine circle. Working in similar realms as SoundLab and Rockwell Lab—experimental, immersive environments that are produced by augmenting traditional architecture with interactive technology—San Francisco–based Future Cities Lab is led by design principals Jason Kelly Johnson and Nataly Gattegno. Their work integrates physical computing into architecture and represents an even more experimental realm of interactive architectural design and fabrication; as a result, it is also at a much smaller scale than Rockwell and Arup.
For Bitly's New York office, Future Cities Lab was commissioned to build a light sculpture that would reflect the company by visualizing its data. The design was conceived in collaboration with Bitly, whose programmers built a small piece of code (an API connection) linking data derived from Bitly's shortened URLs directly to FCL's responsive installation. Each day, millions of web links are channeled through Bitly via Twitter; FCL set out to visualize this data in space.
FCL built the data visualization piece in the lobby, where the immense data set feeds LEDs inside folded, laser-cut, translucent paper diffusers. It is a living sculpture that changes in real time, so the CEO can look at the sculpture and see what is happening—when, where, and how much data is being produced. Twenty-four rows and five columns represent the 24 hours in a day and five high-traffic locations. "It is a really advanced data scape of the company's inner workings," said Johnson. "A lot of our work is interconnected like that with the internet. We give expression to sets of data that are nested in the internet. We are not as interested in freezing forms in architecture, we are interested in letting data begin to animate and inform and become a poetic element in a building or a surface. The data is always evolving."
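The 24-by-5 mapping described above, hours as rows, locations as columns, is straightforward to sketch. The function below is a hypothetical reconstruction (the scaling rule, the sample data, and all names are assumptions, not FCL's actual pipeline): it normalizes each hour's click count against the busiest hour and emits an LED brightness per cell.

```python
def to_led_grid(counts, max_brightness=255):
    """Map hourly click counts for several locations onto a
    24-row brightness grid (one row per hour of the day, one
    column per location). `counts` maps location -> 24 hourly
    totals. Hypothetical sketch, not FCL's actual code.
    """
    locations = sorted(counts)
    # Scale linearly against the single busiest hour overall.
    peak = max(max(hours) for hours in counts.values()) or 1
    grid = []
    for hour in range(24):
        row = [round(counts[loc][hour] / peak * max_brightness)
               for loc in locations]
        grid.append(row)
    return grid

# Two toy locations: one with a midday spike, one with steady traffic.
sample = {
    "NYC": [10 * h if h < 12 else 10 * (24 - h) for h in range(24)],
    "SF":  [5] * 24,
}
grid = to_led_grid(sample)
print(len(grid), len(grid[0]))  # prints: 24 2
```

Re-running the mapping every few seconds as new link-click totals arrive is what makes the sculpture "live": the grid geometry stays fixed while the brightness values animate with the data.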
“We are interested in architecture that is responsive, changing and shifting, and has an ongoing relationship with people and technology in a broader context. We see that attitude in every other allied profession around us,” Johnson said. “We see it in the automotive industry, fashion, music, video. We are interested in taking.”
He also cites Superstudio’s Continuous Monument as one of many radical ’60s and ’70s technocratic projects from which their responsive building systems have taken inspiration. In the famous set of collages, a horizontal, totalizing, gridded white architecture tears across the horizon, with nomads plugged into the landscape. “We like these utopian ideas, like, ‘How can interior public space and civic space be altered by these technologies?’”
For Future Cities Lab, the best spaces are where architecture, landscape, and interaction design are starting to fuse. “When architecture begins to engage with tech, it becomes more potent,” Johnson said. “Our cities are being handed over to engineers who don’t always understand what architecture can bring. We want to bring that ethic that has been in architecture for thousands of years, and use that as a beginning point for new ways of working.”
Their project "Murmur Wall"—on view now at Yerba Buena Center for the Arts in San Francisco—might well be a glimpse into the future of integrated design. The steel and acrylic tubing structure sits outside the Yerba Buena Center, collecting data from nearby users. The data is displayed as light pulses that resolve into text on digital displays as they pass through, showing in real time what people are posting on social media. It is a public installation that the designers think of the way one might have thought of a monument or a fountain in ancient Rome. Their goal is to make the city more transparent by letting data participate in the public realm rather than hide, nested, in a phone. An iOS app also allows visitors to post messages directly.
These immersive environments are evolving from simple interior spaces that envelop and engage those on the inside into larger, more complex architectural projects that alter the ways in which we relate to buildings, and ultimately to each other. Rockwell is using media-rich architecture to enhance the experience of its spaces and to make temporary, content-rich places. For Arup, the National Sawdust project showcased the firm's ability to use technology to make design decisions for a more sensorial experience, as well as to convey that experience. Future Cities Lab is attempting to connect the spaces we encounter in the everyday even further, bringing them to life with new technologies. For all of them, immersive and interactive experience is a way forward in connecting us to architecture and the world around us.