
History's Mysteries

Luisa Caldas uses AR to let DS+R's BAMPFA tell its own story

Augmented Time tells the history and present of the Berkeley Art Museum and Pacific Film Archive through AR. (Stephanie Claudino Daffara/BAMPFA)

Luisa Caldas is a professor of architecture at the University of California, Berkeley, where she leads the XR Lab, which focuses on using augmented reality (AR), virtual reality, and other extended reality tools as part of architectural practice. Recently, Caldas created the Augmented Time exhibition at the Berkeley Art Museum and Pacific Film Archive (BAMPFA), housed in a 2016 Diller Scofidio + Renfro-designed building in Berkeley, California. The exhibition used iPad-based augmented reality and physical artifacts to let the narratives of the building, originally opened in 1940, and of those who built it shine through. AN spoke to Caldas about augmented storytelling, the narrative power of architecture, and what “extended reality” could mean to architects in the future.

Drew Zeiba: What was the initial inspiration behind Augmented Time?

Luisa Caldas: I was intrigued by the potential of AR to tell a story. I wanted to show a number of interwoven realities that I saw happening in this particular piece of architecture. The building was the Berkeley Printing Press, which was later abandoned and covered in graffiti, before becoming a museum designed by Diller Scofidio + Renfro. So I saw the potential for a timeline kind of storytelling that would be engaging because the building itself was to become its own storyteller. You could embed all this multimodal digital information that was captured in so many places and have it all converge on the building itself.

The other motivation was to show the workers who actually built the building. I wanted to make visible those faces and those stories that, as an architect who has built buildings, I know are there. Often, all these dramas, all this magic of putting something together, completely fades away or is told as the work of the architect. The people who build it kind of disappear.

I’m really interested in how this powerful new technology can tell invisible or forgotten stories, not just serve as a tool.

I think one of the things this project touches on is how AR could shape the way we think about built history, not only framing discussions of the history of a building, but even questioning what “preservation” and site-specificity mean in a post-digital age.

Totally, because a lot of the preliminary work that architects do on sites has to do with precedent, has to do with history, has to ask “What is there? How did it come to be there?” We architects always tend to do that research, but it just becomes another invisibility, unless there is a very clear reference in the building design to site context or historical context. It shapes our first conceptual stages, our first approaches to the site, to the building, to the program, but then it usually just vanishes. I enjoy asking how the process captures or preserves or ignores or incorporates or shows that history, that resonance of the site. For me, that was very fascinating: how to embody that inquiry in this AR experience.

It also shows the potential for AR as a tool for experiencing buildings and the built world as things that don’t just exist in a single moment, but unfold over time.

Exactly, which is such a part of human narratives, isn’t it? And it’s so many times built by layering things over one another. So, being able to peel those layers away, to turn the skin into a derma. You know, a skin is a surface, but a derma is a layered reality. That was also the idea: peeling the visible surface away and revealing those layers. 

Caldas collaborated with computer scientists and UI/UX designers to create the custom AR solution. (Oliver Moldow/BAMPFA)

Can you tell me a little more about the technical aspects of the project and the process of realizing it? 

I lead a virtual and augmented reality lab, so there was initially a discussion: “Should we have AR headsets or should we have handheld devices?” And headsets were, at the time at least and even today, not really up to what we wanted to do. Also, I like the more democratic access to the experience that the handheld device provides. We developed the app for iPads, but we can have the app for a smartphone, so anyone can access AR, like you do with popular Snapchat filters. This is a project that had to be done in augmented reality, not virtual reality, because it had to relate to the physical artifact of the building.

There was a lot of interaction with the museum about visitor access, about how to make invisible things appear in a museum. When you get to a museum you expect to see things, right? And here, what you wanted to view was not immediately visible. You have to get these devices and you have to understand where to go. That led us to a lot of research on what is called user interface and user experience (UI/UX). We had to invent this new way of showing an exhibition, and to understand how people related to the content and to the technology, so we did two or three previews where we opened the exhibit and were there watching what people did and how they used it in a fluid, public event.

Of course, I had a lot of students coming up to try it in the lab, but tech-savvy students use it very differently from, say, seniors or kids. We saw all these people using the technology and we learned from it, and we kept refining the UI/UX. We had to create everything from scratch, really; there wasn’t a precedent. We basically invented it.

In terms of the technical solution, we decided to go for the Apple platform. As Apple was releasing more of its technology, we were constantly adapting to what was being made possible, to create more and more ambitious projects.

Computer science at Berkeley is excellent, so I had a large team of computer scientists, architects, and also UI/UX designers, and the level of integration was very high. We met every week. Everyone was bringing ideas to the table; everybody was super excited. So there was a close integration between the creative side and the technical side. The technologists and computer scientists could come up with a really creative solution, or the architects or designers could suggest something to the computer scientists that they were not expecting. I think the team was very committed and we knew we were breaking new ground, so it was a lot of fun.

After closing at the museum, BAMPFA AR — Augmented Time reopened in the Room 108 gallery of Wurster Hall at UC Berkeley, where it will be on display until January 30. It will later travel to other locations around the country.

For more on the latest in AEC technology and for information about the upcoming TECH+ conference, visit https://techplusexpo.com/events/la/
