FOREST
FOREST is the performative outcome of an NSF-funded project aimed at enhancing trust between humans and robots through sound and gesture. As part of the project, a deep learning network was trained to generate emotion-carrying sounds to accompany robotic gestures, and a rule-based AI system was developed to generate emotional, human-inspired gestures for non-anthropomorphic robots. The award-winning performance aims to build trustful connections between humans and robotic musicians and dancers, which could inspire novel creative and artistic ideas for both machines and humans.

Students: Amit Rogel, Richard Savery, Jumbo Qian, Mohammad Jafari, Jocelyn Kavanagh, Michael Verma, Rose Sun, Qinying Lei, Nitin Hugar
Collaborators: Ivan Pulinkala, Dash Smith, Jason Barnes, Christina Massad, Darvensky Louis, Ellie Olszeski, Beech Crosby, Bailey Harbaugh