In addition to its more famous Nobel Prize-winning legacy, Bell Labs has been known since its early days for its forays into experimental art and music, and into the technology to support it.
Working together, Bell Labs and the New Jersey Youth Symphony (NJYS) participated in a free livestream concert from Saint Elizabeth University, in Morristown, on May 15.
NJTechWeekly.com interviewed several of the participants in this experiment to discuss how the collaboration came about, how it proceeded and what they thought of the outcome.
According to the former head of Nokia Bell Labs’ Experiments in Art and Technology (E.A.T.) program, Domhnaill Hernon, the idea for the collaboration came out of an interaction with a Stevens Institute of Technology artist, Lainie Fefferman. Hernon asked her, “What can’t you do today that you would love to do artistically?” Fefferman said that she wanted to make sound happen on phones indoors, with the sounds changing as the artist and the audience move through spaces. “We decided to use the phone in an interesting way to connect people in the audience to the artist.”
Bell Labs brought two interns on board from Stevens Institute of Technology and MIT, tasked with making this technologically “impossible” goal a reality, Hernon noted. “Lo and behold, they were able to make it happen.”
The Technology Behind the Performance
Danielle McPhatter, Nokia Bell Labs creative technologist, elaborated on the process, noting that the project used C4C [Compositions for the Collective] technology that she had developed, as well as web interfaces primarily created by Madeline Wong, the Bell Labs intern from MIT.
“I had the intention of being able to send audio, or signals, or some sort of sensory experience, in this case music,” said McPhatter. “Everyone in the audience had mobile devices, and throughout the performance they would experience how the audio samples or signals would change over time based on the artist’s or performer’s intention.” She added that the task not only involved sending these signals, but also sending them in real time while streaming all of the audio live. The interns also had to make the software as modular as possible, so it could scale for other types of performances in the future.
The technology has been used in other settings, McPhatter said, but the big technological hurdle to overcome for the NJYS concert was how to make the communication more bidirectional. When the technology was first developed, “the experience was a bit more passive, and it was from the performer or the composer to the audience; but in this case, we really wanted to make that bidirectional, to give the performers a way to reach out to the audience but also give the audience the capability to talk back to the performers,” she said.
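The bidirectional routing McPhatter describes, with performers able to push cues out to every audience phone and audience members able to talk back, could be sketched roughly as below. This is an illustrative toy, not Bell Labs' actual C4C code: in the real system the routing would run over live network connections to each phone's browser, whereas here plain callbacks stand in for connected clients so the fan-out logic is easy to follow.

```python
class Relay:
    """Toy sketch of bidirectional performer/audience message routing."""

    def __init__(self):
        self.performers = []  # callbacks standing in for performer clients
        self.audience = []    # callbacks standing in for audience phones

    def join(self, role, callback):
        """Register a client under the given role ('performer' or 'audience')."""
        group = self.performers if role == "performer" else self.audience
        group.append(callback)

    def send(self, role, message):
        """Performer messages fan out to the audience, and vice versa."""
        targets = self.audience if role == "performer" else self.performers
        for deliver in targets:
            deliver(message)


# Usage: a performer cue reaches every audience member in real time,
# and audience feedback flows back the other way.
relay = Relay()
phone_inbox, stage_inbox = [], []
relay.join("audience", phone_inbox.append)
relay.join("performer", stage_inbox.append)
relay.send("performer", {"cue": "crescendo"})
relay.send("audience", {"feedback": "applause"})
```

The two-way design is the key difference from the earlier, broadcast-only version of the technology that McPhatter mentions.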
“So that’s when we enhanced a lot of the technology that already existed. It was like a blank canvas that the students were able to add on their own little sprinklings of ideas. We included enhancements using lights that were audio-visually reactive, or were tuned in to the performers’ or their audience members’ haptics,” she said. “Haptics are vibrations or tactile sensations from some of the frequencies that were going on in the space. We also needed to include biofeedback to translate things like heart rate from the audience members to the lights.”
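The biofeedback step McPhatter describes, translating audience heart rates into light behavior, amounts to a mapping from beats per minute to a color. A minimal sketch might look like the following; the thresholds, palette, and function names are illustrative assumptions, not the values used in the actual performance.

```python
def heart_rate_to_color(bpm, low=60, high=120):
    """Map a heart rate (beats per minute) to an RGB color.

    A calm, low heart rate maps toward blue; an excited, high one
    toward red. The range and palette here are guesses for
    illustration only.
    """
    # Clamp bpm into the expected range, then normalize to 0..1.
    t = (min(max(bpm, low), high) - low) / (high - low)
    return (int(255 * t), 0, int(255 * (1 - t)))


def audience_light_color(readings):
    """Blend many audience heart-rate readings into one light color."""
    return heart_rate_to_color(sum(readings) / len(readings))
```

A resting audience (60 bpm) would light the stage blue, a fully excited one (120 bpm) red, with mixed readings blending in between.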
The Symphony’s Perspective
The NJYS artistic director and principal conductor, Helen H. Cha-Pyo, noted that she had become acquainted with the E.A.T. program because the organization’s rehearsal space is nearby. “Our home is in New Providence, literally, a couple of blocks away from the physical location of Nokia Bell Labs Research Center in Murray Hill.” Cha-Pyo said that she knew someone on the NJYS board, who connected her to Hernon because that board member thought Hernon would “have some wonderful ideas for NJYS.”
From the symphony’s perspective, none of this would have been possible without the partnership with the E.A.T. program, which helped NJYS secure a grant from the American Orchestras’ Futures Fund and allowed Cha-Pyo to pursue this highly experimental path. The orchestra had first partnered with an African-American composer from New Jersey last year, but that work was stymied by the pandemic. They then turned to Patricio Molina, a Syrian-Chilean pianist and composer, and Mesia Austin, an African-American woman percussionist and composer, to write COVID-friendly pieces for small ensembles.
Only six students were chosen for this project, working directly with E.A.T. creative technologists McPhatter and Ethan Edwards to implement the new technology at rehearsals. They were: trombonist Jimmy Chen, a senior at Bridgewater-Raritan High School; percussionist Abhinav Datla, a senior at Sayreville War Memorial High School; violinist Samantha Liu, a junior at Ridge High School; jazz bassist Ryoma Takenaga, a junior at the Academy for Information Technology; bassoonist Samhita Tatavarty, a junior at Ridge High School; and violinist Brian Zhang, a senior at The Academy for Mathematics, Science, and Engineering. They had the opportunity to combine their passions for technology, innovation and music in weekly sessions with the researchers.
The Student Experience
We spoke to Tatavarty about her experiences. She noted that she comes from a family of tech professionals, but that she leans more toward the arts. When she learned that she had been selected for this experiment, she was excited. “I think we all came in with the same idea. This is something we’ve never done before and it’s something we’re all really interested in. Each of us came up with our own individual ideas for it. I worked in the biofeedback area, connecting heart rate monitors to audience members and to different lights set up on different stages for the performers.” The monitors would trigger the lights to change color according to the change in heart rates. “This way the conductor could feel and visually see the way the audience felt during the concert, listening to the music.”
Cha-Pyo noted that the orchestra played their pieces first without any enhancements, and then with the sensory input. The organization then asked for audience feedback on how the attendees liked the experience. NJYS is currently compiling the data. Figuring out how to improve the experience for attendees will be a big part of the project, she said.
According to McPhatter, one of the unexpected outcomes of the musical experiment was that it gave Nokia Bell Labs engineers freedom from constraints. “It was really great to interact with the six students who were completely unbounded and unjaded by convention,” she said. “We work with a lot of engineers, and many are jaded by the constraints of the technology. They work within the limitations and don’t really think outside of the box. So, it was nice to have these uninhibited minds interacting with the technology, giving us their ideas. And these were extremely ambitious ideas, to create this interactive, immersive, concert experience. We were excited to bring those ideas to life as well.”