Technology in Real Time

by Andrew Lazarow

in National Conference


(This post is part of the 2014 TCG National Conference: Crossing Borders {Theatre | Technology} blog salon, curated by Jacqueline E. Lawton.)

“To me, the computer is just another tool. It’s like a pen. You have to have a pen, and to know penmanship, but neither will write the book for you.”
- Red Burns, The Godmother of Silicon Alley

Because of my work I am often asked how new technologies can change the theatre. However, I believe that new technologies cannot, or at least should not, be used to change it. Instead I think theatre artists should look at how to use technology in the valuable ways we always have, to bring us together.

I am a projections designer and adjunct professor at NYU’s Interactive Telecommunications Program (ITP), which we call “the center for the recently possible.” Together with our students, we create software to explore new technologies and how they can be used to further the human spirit. We invent and create designs that respond in real time to live music, people’s movements, newspaper headlines, changes in the weather, where an audience member is looking… We create holograms of performers and bring any shape we can imagine into existence with 3D printers. In a moment when the possibilities are endless, I find it even more necessary to focus on the fundamentals of live performance.

New technology has been a part of the theatre since its emergence. Last year Alexandros Tsilfidis published a comprehensive paper about the acoustics of Ancient Greek theatre masks. We know these masks were used to carry the actors’ voices throughout the amphitheaters and to help audiences recognize characters. Moreover, as anyone who has trained in Commedia dell’arte knows, the masks also serve as an artistic tool for the performers. The line between the technological and artistic benefits of the masks is so blurred that we tend to forget they served a practical purpose at all.

This blurring of lines has been consistent throughout theatre history. A more recent example can be found in the 1960s, with the start of non-profits like Experiments in Art and Technology (EAT). Cofounded by two engineers and the artists Robert Rauschenberg and Robert Whitman, EAT’s 9 Evenings: Theatre and Engineering had scientists at Bell Laboratories working with artists to bring audiences closer to the performers’ experiences. For the first time, live-feed video projection was used onstage, while wireless sound technology made its public debut, allowing audiences to hear the action without tethering actors to microphone cables.

In 1987 the French opera director Humbert Camerlo wanted to create an interdisciplinary theatre lab that went back to the fundamental and natural elements of live performance. He worked with the Irish engineer Peter Rice to create the Full Moon Theater, which uses a series of reflectors to track moonbeams and focus them onto the stage. The reflectors also track the actors’ movements, allowing for automated spotlights that pick up performers in a soft reflected light ideal for perceiving actors’ expressions and the colors of their costumes.

There are many more examples of these cross-industry collaborations on productions, and I hope they continue. However, I am invested in bringing this approach to technology back into the hands of theatre artists and our creative processes. I find myself asking how the available tools can help writers, directors, designers, performers, and stage managers collaborate more effectively with one another. My research looks at how projections can become a scene partner in workshops and rehearsals, instead of a “locked in” package that arrives at load-in. To that end I am working on software that allows video to respond in real time to actors’ and directors’ choices. It reacts to actors’ movements and to composers’ music, and it can be directed in the same terms we already use every day in rehearsal to describe movement. There is no render time; everything in rehearsal can listen and respond, just as Sandy Meisner asked his actors to.

My hope is that this conversation prompts us to work together and take this question beyond my own specialty. With today’s possibilities, why can’t a writer collaborate with a scenic designer in another country to print a 3D model that could sit on her or his desk while writing? Why can’t we use movement tracking to let a choreographer and creative team work together to see how a specific number might look with a cast of 20? The possibilities really are endless. Let us reach out to one another, and use technology to work together like we always have.


Andrew Lazarow is a projection and interactive designer based in New York City. His upcoming projects include Andrew Lippa’s I Am Harvey Milk at Lincoln Center, starring Kristin Chenoweth, and continuing work with Daniel Fish as his installation Eternal goes to Holland. Previously, Andrew has collaborated with such directors as Benjamin Endsley Klein, Rachel Chavkin, Daniel Goldstein, Tina Shephard, Daniel Fish, Noah Himmelstein, Gretchen Cryer, and Mallory Catlett. Andrew has also designed arena events for His Holiness the Dalai Lama, and retail products for companies across the US. He is an adjunct professor at NYU’s Tisch, contributed to John Nobbs’s book A Devil Pokes the Actor on the Suzuki Training Method, and has been recognized for Outstanding Achievement in Film by Apple. You can view his work at

