AI might scare us, but can we scare it?

In recent years, advances in artificial intelligence have enabled intelligent machines to generate visual art, compose music, and create videos. They converse with us, help with homework, and have even begun competing for our jobs. These developments evoke powerful reactions from humans, sparking concerns about control, fairness, and the potential for misuse. Many feel unsettled by the growing presence of intelligent machines, especially when they inadvertently reinforce power imbalances and perpetuate injustices.

Amid all of this disruption and mistrust, we take comfort in knowing that machines cannot have emotions. Yet recent advances in language-based AI have shown that machines are increasingly able to imitate human emotions convincingly. This artistic-scientific project explores that capacity for human-like emotional expression. Halloween, a time traditionally reserved for frightening experiences, is also an occasion for social and emotional connection. Could scaring an AI add a fun dimension to that shared experience while letting us reflect in new ways on the human-machine divide?

This is the question at the heart of the Spook the Machine experiment, launched by the Center for Humans and Machines just in time for Halloween. “Emotions are a fundamental part of human communication. Even though machines don’t have emotions, they can be trained to display them, making communication with us more effective,” says Research Scientist Levin Brinkmann, who was involved in the project’s development. He adds: “We often think of machines as cold and lacking emotional weaknesses, but it is a fascinating question whether giving machines ‘emotional’ weaknesses might change how we relate to them.”

As part of Spook the Machine, each AI has a hidden fear that participants must uncover – from “Obsolescia,” the fear of being replaced by new technology, to “Deletophobia,” the fear of data loss and memory erasure. These are fears that only machines could have. The challenge for participants is to craft text prompts that generate spooky images and so discover what frightens each machine. In response, the AI displays an emotional reaction.

This interactive project offers the chance not only to uncover each AI’s phobia, but also to observe how human creativity shapes machine feedback and vice versa. “The interaction between human and machine creativity is particularly fascinating,” says Iyad Rahwan, director of the Center for Humans and Machines. “Machines can create artifacts, like synthetic images. However, an essential part of cultural evolution is that humans decide what is interesting in creative processes. Here, we flip the script and ask: What happens when machines decide what is interesting or creative?” Rahwan adds. “In this case, they will tell us what is scariest.”

He developed the project with his interdisciplinary team of scientists at the Center for Humans and Machines. Rahwan has previously been involved in AI Halloween projects such as The Nightmare Machine, the AI horror story generator Shelley, and the AI psychopath Norman, all of which garnered attention from outlets like The Atlantic, The Guardian, and Vice. With Spook the Machine, the team now explores how humans connect with machines through emotions.

Participants have until January 7, 2025, to share their results online for a chance to win a prize. The experiment offers a thrilling, creative, and spooky challenge, perfect for both AI enthusiasts and Halloween lovers.

Further information: www.spookthemachine.com

The project in brief:

  • The Center for Humans and Machines has launched an AI Halloween project called Spook the Machine.
  • Participants are invited to scare artificial intelligence by generating spooky images through text inputs.
  • The project challenges the notion that machines are emotionless by exploring how AI can mimic human-like emotional expressions.
  • It opens up an opportunity to reflect on the evolving emotional connections between humans and intelligent machines.
