Bridging the intersections between art, computer science, and human behavior

By Mena Davidson

Content warning: mention of sexual assault.

Imagine, for a moment, that you are standing in the dark, surrounded by glowing orbs of white light. Several disjointed voices call out around you, forming a disconcerting sea of meaningless words and sounds. As you approach one of the floating spheres and put your ear to it, a narrative becomes clear, outshining the babble of voices around you: a single voice sharing a personal experience of sexual assault. This is a glimpse into the experience of Dr. Hannen Wolfe’s interactive art installation, Cacophonic Choir, created to emphasize the way that stories from sexual assault survivors are distorted by the media.

To design this art piece, Hannen trained neural networks to generate text at different levels of clarity, depending on a person’s distance from the sculptural “agents” generating the words and stories. A neural network is a type of machine learning algorithm, or artificial intelligence, that can be trained on input data to produce a predictive output. As Hannen explains, “For Cacophonic Choir, the input data was text from [over 500] sexual assault survivor testimonials. You could input the current word, the previous X amount of words, and then have the machine learning algorithm generate what the next word would be.” When the generated word matches the actual next word in the training text, the pattern of connections that produced it is strengthened.
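
As a rough illustration of that next-word idea, here is a minimal sketch of a neural network trained to predict the next word from the previous few, written in Python with PyTorch. The tiny example corpus, window size, and model dimensions are placeholders for illustration only, not the survivor testimonials or the actual architecture behind Cacophonic Choir.

```python
# Minimal next-word prediction sketch (illustrative only).
import torch
import torch.nn as nn

# Toy corpus standing in for the real training text.
corpus = "as you approach one of the floating spheres a narrative becomes clear".split()
vocab = sorted(set(corpus))
word_to_idx = {w: i for i, w in enumerate(vocab)}

window = 3  # "the previous X amount of words"

# Build (context, next word) training pairs from the corpus.
contexts, targets = [], []
for i in range(len(corpus) - window):
    contexts.append([word_to_idx[w] for w in corpus[i:i + window]])
    targets.append(word_to_idx[corpus[i + window]])
contexts = torch.tensor(contexts)
targets = torch.tensor(targets)

class NextWordModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        emb = self.embed(x)               # (batch, window, embed_dim)
        _, (hidden, _) = self.rnn(emb)    # final hidden state summarizes the context
        return self.out(hidden[-1])       # a score for every word in the vocabulary

model = NextWordModel(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: when the predicted word matches the actual next word, the loss
# shrinks and the weights that produced that prediction are reinforced.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(contexts), targets)
    loss.backward()
    optimizer.step()

# Generation: repeatedly feed the last `window` words back in to extend the text.
model.eval()
seed = corpus[:window]
with torch.no_grad():
    for _ in range(5):
        x = torch.tensor([[word_to_idx[w] for w in seed[-window:]]])
        next_idx = model(x).argmax(dim=-1).item()
        seed.append(vocab[next_idx])
print(" ".join(seed))
```

In the installation, the clarity of the generated text is then tied to a listener’s distance from each agent, a mapping beyond the scope of this sketch.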

As an Assistant Professor of Computer Science at Colby College, Hannen specializes in creating these interactive art exhibits driven by computer algorithms. The work requires extensive problem-solving and collaboration across fields, from fine arts to robotics to psychology. One current collaboration involves an artist who drilled holes in bricks to create artwork; Hannen is exploring how to turn a brick into a sensor that modifies the audio component of the piece. Another collaboration featured an interactive furry robot that makes emotive sounds, such as purring, when touched. A third used neural networks to generate specific reference images for printmakers: for example, an A-frame cabin along a river, with chairs out front, against a backdrop of fall foliage.
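
The common thread in several of these collaborations is a simple loop: a physical sensor reading drives an audible response. The sketch below shows that general pattern in Python; the sensor-reading and sound-playback functions are hypothetical stand-ins, not Hannen’s actual hardware or code.

```python
# Hedged sketch of a sensor-driven audio loop (illustrative only).
import random
import time

def read_touch_sensor() -> float:
    """Hypothetical stand-in for a capacitance or pressure reading from the
    instrumented brick or the robot's fur (0.0 = untouched, 1.0 = firm touch)."""
    return random.random()

def play_sound(name: str) -> None:
    """Hypothetical stand-in for triggering an audio clip."""
    print(f"playing: {name}")

TOUCH_THRESHOLD = 0.8  # illustrative sensitivity value

for _ in range(20):             # an installation would run this loop continuously
    if read_touch_sensor() > TOUCH_THRESHOLD:
        play_sound("purr.wav")  # emotive response to touch
    time.sleep(0.1)
```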

The collaborative nature of these projects reflects Hannen’s own path from undergraduate art major to computer science professor. An undergraduate class on building interactive art installations prompted a career realization that stuck with them: “Wow, I can actually combine my interests in mathematics and art to make something that’s really meaningful.” Their unusual combination of media arts and computer science degrees made them a natural fit for the Colby College Computer Science program, where fellow professors apply computer science to subjects such as neuroscience, ethics, and computational biology.

Hannen is also excited to design research that examines societal and personal understandings of gender. They have blurred the lines between traditional expectations of feminine and masculine gender signifiers, developing a robot that approaches people and hits on them using pickup lines. As they note, “She [the robot] has all of these very female gender signifiers, combined with these very masculine aspects of dating culture, so it’s an interesting project.” In another installation, a variety of interactive objects will produce audio clips of nonbinary artists explaining how these items affirm or inform their gender.

Hannen’s research is ultimately tied together by the study of communication and emotion through technology. They are interested in understanding how humans interact with technology and respond to emotive sound in complex ways. These projects can collect data on human behavior, using interactive sensors, overhead cameras, and surveys to study patterns of human response to the stimuli generated by the art exhibits. Although the data recorded from these interactions may be noisy, presenting analysis challenges, their depth and context may lead to surprising new insights and important applications. Hannen points out, “When we interact with technology, we don’t interact with it in super controlled settings, we just interact with it in our daily lives.”

Whether leaning in close to hear the story a glowing agent has to tell, tickling a furry robot until it purrs, or cringing away from an overly flirtatious robot, a person’s interplay and connection with a piece of technology can tell us a lot about both human behavior and artificial intelligence design. 

As artificial intelligence becomes a more central part of our society, researchers like Hannen, who integrate studies of computer science with complex human experiences like art, emotion, and gender, may help us see our diverse experiences and behavior reflected in these everyday interactions with technology.

You can check out more of Hannen’s work at their website: http://projectiveplanes.com/
