
DreamFREQ: Brainwave-Powered Soundscapes

IoT Wearable Tech

Overview

DreamFREQ translates EEG brainwave data into real-time auditory soundscapes, offering a musical representation of the user's mental state. Developed during a residency with On Air Fest, the project uses a Muse EEG headband to capture brainwave activity, focusing on delta waves associated with creativity and the early stages of sleep. The brainwave data is streamed through Mind Monitor and synthesized in TouchDesigner, creating a dynamic audio experience that changes in real time with the user's brain activity.

During the residency, DreamFREQ was demonstrated live at the Wythe Hotel in Brooklyn, where participants could see their brainwave activity visualized and hear it translated into sound. The work culminated in a 15-minute composition that mimics the daily cycle of waking, sleeping, and dreaming. DreamFREQ fuses neuroscience, music technology, and interactive art, offering a glimpse into the future of human-computer interaction and creative expression.
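For readers curious about the plumbing, the sketch below shows one way the receiving side of such a pipeline can look: Mind Monitor streams band-power values over OSC, and a small Python listener picks them up before they are routed into the synthesis environment (TouchDesigner has its own OSC input operators; plain Python is used here only to keep the example self-contained). The python-osc library, the UDP port, and the /muse/elements/delta_absolute address are assumptions for illustration and may not match the project's actual configuration.

    # Minimal sketch of receiving Mind Monitor's OSC stream in Python.
    # Assumptions: the python-osc library, UDP port 5000, and the
    # /muse/elements/delta_absolute address; verify against Mind Monitor's
    # streaming settings on your own network.
    from pythonosc import dispatcher, osc_server

    def on_delta(address, *values):
        # Mind Monitor reports one value per EEG sensor; average them into
        # a single delta-band level that can drive the soundscape.
        delta_level = sum(values) / len(values)
        print(f"{address}: {delta_level:.3f}")

    disp = dispatcher.Dispatcher()
    disp.map("/muse/elements/delta_absolute", on_delta)  # assumed address

    server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 5000), disp)
    server.serve_forever()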

Challenge

Translating complex brainwave data into meaningful, aesthetically pleasing sound in real time posed significant technical and creative challenges. The goal was to build a system that was both accurate and intuitive, letting users directly experience the connection between their brain activity and the resulting sound.
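One concrete aspect of that challenge is that raw band-power readings are noisy and vary from wearer to wearer, so they typically need smoothing and normalization before they can drive sound parameters convincingly. The sketch below shows one common approach, an exponential moving average with adaptive range normalization; the technique and constants are illustrative, not the project's actual method.

    # Hedged sketch: smooth a noisy band-power stream with an exponential
    # moving average, then normalize against the observed range so the
    # output adapts to each wearer. Constants are illustrative only.
    class BandSmoother:
        def __init__(self, alpha=0.1):
            self.alpha = alpha       # smoothing factor: lower = smoother, slower
            self.value = None        # current smoothed level
            self.lo = float("inf")   # running minimum observed
            self.hi = float("-inf")  # running maximum observed

        def update(self, raw):
            # Exponential moving average of the incoming band power.
            if self.value is None:
                self.value = raw
            else:
                self.value = self.alpha * raw + (1 - self.alpha) * self.value
            # Track the observed range so the control signal stays in 0..1.
            self.lo = min(self.lo, self.value)
            self.hi = max(self.hi, self.value)
            span = self.hi - self.lo
            return 0.0 if span == 0 else (self.value - self.lo) / span

    # Example: map the normalized delta level to a hypothetical filter cutoff.
    smoother = BandSmoother()
    cutoff_hz = 200 + smoother.update(0.85) * 1800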

Outcomes

  • Successfully demonstrated live at On Air Fest, engaging dozens of participants
  • Created a 15-minute composition representing the cycle of waking, sleeping, and dreaming
  • Developed a real-time synthesizer using Touch Designer and Mind Monitor
  • Generated meaningful conversations about the future of brainwave-based art and technology
  • Plans to expand the project by mapping all five brainwave bands (delta, theta, alpha, beta, and gamma) to different audio filters, as sketched below
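To make the planned expansion concrete, the sketch below routes each of the five EEG bands to its own audio-filter parameter. The band-to-parameter assignments and value ranges are hypothetical placeholders; the project's eventual mapping may differ.

    # Illustrative sketch of a five-band mapping; targets and ranges are
    # hypothetical placeholders, not the project's actual design.
    BAND_TO_FILTER = {
        "delta": {"target": "lowpass_cutoff_hz",  "lo": 100.0,  "hi": 800.0},
        "theta": {"target": "reverb_mix",         "lo": 0.1,    "hi": 0.9},
        "alpha": {"target": "bandpass_center_hz", "lo": 300.0,  "hi": 2000.0},
        "beta":  {"target": "highpass_cutoff_hz", "lo": 1000.0, "hi": 6000.0},
        "gamma": {"target": "delay_feedback",     "lo": 0.0,    "hi": 0.7},
    }

    def bands_to_params(levels):
        """Convert normalized band levels (0..1) into filter parameter values."""
        params = {}
        for band, level in levels.items():
            spec = BAND_TO_FILTER[band]
            params[spec["target"]] = spec["lo"] + level * (spec["hi"] - spec["lo"])
        return params

    # Example frame of normalized band levels.
    print(bands_to_params({"delta": 0.8, "theta": 0.4, "alpha": 0.6,
                           "beta": 0.2, "gamma": 0.1}))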