Neural Scores: Mapping Music's Emotional Impact with EEG and Machine Learning
Dissertation (Sound Design BA)
Overview
Have you ever wondered what your emotions might look like while listening to your favourite song? Neural Scores is a project I developed that explores this question by turning brainwave patterns into visual art. Using the Muse headband (a consumer EEG device) and machine learning, the system creates real-time visualisations that capture the emotional journey of listening to music.
At its heart is a convolutional neural network I trained to analyse brainwave patterns and map them onto Russell's circumplex model of emotion - a way of understanding our feelings through their valence (how positive or negative they are) and arousal (how intense they are). These emotional states then drive the visualisation you see above. The circular pattern builds counterclockwise throughout the song and is drawn from the actual EEG spectrogram data. As emotions shift, both the colours forming this pattern and the dynamic background adapt - blues might represent calm states, while reds and yellows could show more intense emotions. Each visualisation becomes a unique emotional fingerprint of that moment with that music.
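To make that mapping a little more concrete, here is a minimal, illustrative sketch (in PyTorch) of the kind of pipeline described above: a small CNN takes an EEG spectrogram window and predicts a valence/arousal point, which is then turned into a colour by treating its angle on the circumplex as hue and its distance from neutral as saturation. The layer sizes, window shape, and colour mapping below are placeholder assumptions rather than the exact ones used in the project - the technical explanation page has the real details.

```python
import colorsys
import math

import torch
import torch.nn as nn


class ValenceArousalCNN(nn.Module):
    """Small CNN mapping a single-channel EEG spectrogram window to
    a (valence, arousal) point on Russell's circumplex (illustrative only)."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 2), nn.Tanh(),  # both outputs constrained to [-1, 1]
        )

    def forward(self, spectrogram):
        return self.head(self.features(spectrogram))


def circumplex_to_rgb(valence, arousal):
    """Turn a circumplex point into a colour: hue follows the angle around
    the circle, saturation grows with distance from the neutral centre."""
    hue = (math.atan2(arousal, valence) / (2 * math.pi)) % 1.0
    saturation = min(math.hypot(valence, arousal), 1.0)
    return colorsys.hsv_to_rgb(hue, saturation, 1.0)


# Example: one 64x64 spectrogram window -> predicted emotion -> RGB colour
model = ValenceArousalCNN()
window = torch.randn(1, 1, 64, 64)  # (batch, channel, frequency bins, time frames)
valence, arousal = model(window)[0].tolist()
print(circumplex_to_rgb(valence, arousal))
```

In the real system this colour would update continuously as new EEG windows arrive, feeding both the counterclockwise pattern and the background of the visualisation.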
This is just a quick overview of Neural Scores though - I've written up much more detail about the technical implementation, research process, and findings on the project website. Here are some quick links, but feel free to explore at your own pace - or visit the (Covid-friendly) virtual gallery! :)
Quick Links
Neural Scores — Homepage
Neural Scores — Video Showcase
Neural Scores — Technical Explanation