About the Project
Beats Exposed is an interactive performance experience that breaks down the barrier between audience and performer. By exposing the body’s vital signs, the performer invites the audience to see beyond the polished act and into the extreme physical and personal effort behind it.
Beats Exposed is built to be used in performance on or off stage. It is lightweight and battery powered, so it can run in a variety of settings.
The current iteration of the project is performed with an aerialist. It exposes the exertion in an art form that is extremely demanding, yet typically meant to appear effortless.
The performer wears a Polar pulse sensor and a Moteino wireless transceiver while performing. The transceiver communicates wirelessly with a second Moteino transceiver connected to a computer, which passes the pulse over serial to a P5 program that generates both audio and visualizations.
In this experience, the audience hears the sound of a heartbeat timed with the performer’s pulse. The visualization, also reacting to the pulse, projects from the ceiling onto the performer, the surrounding area, and any audience members who have come in close.
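The beat-reactive quality of the visuals can be approximated with a simple decay envelope. This is a hypothetical sketch, not the project’s actual P5 code: each beat kicks a value up to 1.0, and it eases back toward zero until the next beat, so projected shapes appear to throb with the heart.

```javascript
// Hedged sketch: a beat-driven pulse envelope for projected visuals.
// msSinceBeat is the time elapsed since the last detected heartbeat;
// decayMs controls how quickly the swell fades (both names are ours).
function pulseEnvelope(msSinceBeat, decayMs = 300) {
  // 1.0 at the instant of a beat, settling toward 0.0 as time passes.
  return Math.exp(-msSinceBeat / decayMs);
}

// In a p5-style draw loop, one might scale an ellipse by the envelope:
// const r = baseRadius * (1 + 0.4 * pulseEnvelope(millis() - lastBeatMs));
```

An exponential decay is one simple choice; any easing curve that peaks on the beat and relaxes between beats would give the same throbbing effect.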
The resulting experience is intimate, personal and engaging.
Here’s a video of it working in a loft in Brooklyn:
In this video, taken directly after the performance, you can see the pulse of the performer beating rapidly through the projection on the floor:
This video cycles through the P5 projections:
Beats Exposed is meant to be used in aerial performance, but not necessarily in a theater or on stage. It is meant to be able to run anywhere, as long as a rig is available.
Beats Exposed is created around the concept of connectivity. The project is designed to appeal to a wide range of people; no background in technology or performance is necessary. It explores how revealing a performer’s vitals can deepen the viewer’s experience. That connection can occur on a conscious or subconscious level, making the piece accessible to people of all ages and experiences. On the surface, it is a visually stimulating experience of adept physicality encased in projected light and designs that move with the sound of the performer’s heartbeat. Exploring deeper reveals the technology that drives the project; like the heartbeat itself, it is hidden away, inside a small wearable, radio waves, and code.
Our method for measuring the performer’s heartbeat changed drastically over the course of our testing. The first version used sticky electrodes attached to the chest, connected to an EKG front end that measures the minute voltages generated by the heart. We ran into issues whenever the performer did anything other than sit still, as body movement generated a lot of noise. We then tried a different EKG front-end circuit, similar to those found in exercise equipment, but again noise became a huge issue whenever the performer moved.
After ample research, we learned that it is necessary to measure body movement alongside the heartbeat in order to subtract the movement noise from the electrode signal. We found a product, the Polar T34 pulse monitor, that does this processing automatically. By using it as our sensor, we were able to get a very accurate heartbeat pulse that was free from noise.
Our final human interface works by receiving a pulse signal from the Polar monitor on every heartbeat, which is then transmitted wirelessly (over a 915MHz serial bridge) by a Moteino worn by the performer. This 915MHz signal is received by a second Moteino that sends the pulse along to P5 via serial. To make sure that P5 sees every pulse, each heartbeat pulse is held for about 320ms.
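The receiving logic can be sketched in plain JavaScript (a hypothetical simplification of the P5-side serial handler; the function and variable names are ours, not from the project code). Since each pulse is held high for roughly 320ms, the sketch only needs to detect the rising edge of the signal to register a single beat, and can derive BPM from the time between beats.

```javascript
// Hedged sketch: rising-edge beat detection and a BPM estimate.
// Assumes the serial bridge delivers a stream of 0/1 levels, where each
// heartbeat holds the line high for ~320ms as described above.
function makeBeatDetector() {
  let lastLevel = 0;     // previous sample, to detect edges
  let lastBeatMs = null; // timestamp of the last detected beat
  let bpm = 0;
  return {
    // Call for every serial sample; returns true only on a new beat.
    sample(level, nowMs) {
      const isBeat = level === 1 && lastLevel === 0; // rising edge only
      if (isBeat) {
        if (lastBeatMs !== null) {
          bpm = 60000 / (nowMs - lastBeatMs); // ms between beats -> BPM
        }
        lastBeatMs = nowMs;
      }
      lastLevel = level;
      return isBeat;
    },
    bpm: () => bpm,
  };
}
```

For example, two rising edges 1000ms apart would read as 60 BPM; counting only rising edges means the long 320ms pulse is never mistaken for multiple beats.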
The code behind this project is written in Arduino and P5. It is available on GitHub here.
Our original proposal included a pulse sensor and a breath/stretch sensor built into a wearable, plus a portable lighting rig and projection screen.
It took three iterations over about one month of work to get the pulse sensor working as we intended. We decided to focus on one part of the project at a time, rather than building everything at once. Because of this, we have not yet successfully built the breath input and output or the portable lighting structure.
Based on feedback, we switched from projecting behind the performer to projecting on the floor.
Thanks to the great community at ITP, we had a lot of help with the technical aspects of our project. We would like to thank ITP resident J. H. Moon for his help with our P5 code, and ITP Professor Benedetta Piantella for helping us implement the wireless serial bridge to transmit our sensors to our visuals. Many second year students were also happy to support us and answer our questions including Thea Rae, Justin Peake, and Joe Mango.