Following the HEADSPACE – Brain-Controlled Music event (earlier this month in NYC), the Emotiv EPOC neuroheadset made another debut as a ‘musical instrument’ when renowned cellist Katinka Kleijn performed “Intelligence in the Human-Machine” – a duet for brainwaves and cello composed by Daniel R. Dehaan in collaboration with Ryan Ingebritsen. This (free!) concert took place at the Chicago Cultural Center’s Yates Gallery on Sunday, January 13.
“Intelligence in the Human-Machine” goes a few steps beyond earlier brainwave compositions. First, it is one of the first performances in which brainwaves accompany a conventional instrument – the cello. Second, it is the first duet for brainwave and instrument in which a single performer controls both instruments. According to the composer, “[The audience] will be hearing not only Katinka trying to express [things] musically, but also the result of her brain state in trying to achieve that.”
To understand the nature of the composition, it helps to understand the nature of the brain signals recorded by the EPOC headset. The headset’s 14 electrodes record everything from signals that reflect emotional state, to signals that control muscle movement, to mechanical noise. Additionally, these signals sit at very low frequencies – below the range of human hearing. Consequently, a great deal of sound processing is required to make the signal ‘musical.’ For this reason, the composer pre-recorded sets of brainwave patterns while he, Ingebritsen, and Kleijn performed a range of activities, such as discussing the composition and thinking about potential directions for the project.
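To give a sense of the kind of processing involved, EEG analysis typically reduces the raw electrode signal to power in a few conventional frequency bands (alpha, beta, and so on), which can then be used as control values. The sketch below is a minimal illustration of that idea, assuming the EPOC’s 128 Hz sampling rate; it is not the team’s actual signal chain.

```python
import numpy as np

FS = 128  # Emotiv EPOC sampling rate in Hz (assumed here)

def band_power(signal, lo, hi, fs=FS):
    """Mean spectral power of `signal` between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

# One second of synthetic "EEG": a 10 Hz (alpha-band) oscillation plus noise.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)

alpha = band_power(eeg, 8, 13)   # band often linked to relaxed, meditative states
beta = band_power(eeg, 13, 30)   # band often linked to active concentration
print(alpha > beta)
```

Note that even the strongest of these bands lies far below 20 Hz, which is why such band-power values are used to *control* sound parameters rather than being heard directly.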
These brainwave patterns accompany a set of 20 musical gestures performed on the cello (long tones or pizzicato passages, for example, indicated by two- or three-bar snippets of conventional notation), along with 100 words, each of which asks Kleijn to “find” something, whether focus or randomness or balance or life, within whatever gesture she is playing.
On stage, Kleijn used a pair of foot pedals to progress through the score: one pedal advances to the next gesture, the other to the next word. The order of presentation is determined in real time by a computer program written by Dehaan, and is unknown to Kleijn. The headset feeds the mental activity each task generates into a computer, which controls the pre-recorded brainwave sounds: mental states the team calls “meditation,” “excitement,” “engagement,” and “frustration” are mapped onto playback speed, pitch, and other parameters.
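The mapping step described above can be sketched in a few lines. The parameter names and ranges below are assumptions chosen for illustration, not the team’s actual patch; the point is simply that each normalized mental-state metric (0 to 1) drives one playback parameter.

```python
# Hypothetical mapping from four mental-state metrics (each 0..1) onto
# playback parameters for the pre-recorded brainwave sounds.
def map_state_to_playback(meditation, excitement, engagement, frustration):
    """Return playback parameters derived from four mental-state metrics."""
    return {
        # excitement speeds playback up, from 0.5x at rest to 2.0x at maximum
        "speed": 0.5 + 1.5 * excitement,
        # engagement raises pitch by up to one octave (value in semitones)
        "pitch_shift": 12.0 * engagement,
        # frustration could drive a distortion amount or filter cutoff
        "distortion": frustration,
        # meditation sets the overall level of the brainwave layer
        "volume": 0.2 + 0.8 * meditation,
    }

params = map_state_to_playback(meditation=0.9, excitement=0.2,
                               engagement=0.5, frustration=0.1)
print(params["speed"])        # 0.8
print(params["pitch_shift"])  # 6.0
```

In a live setting the metrics would be re-estimated continuously from the headset, so the playback parameters drift with the performer’s state rather than jumping between fixed values.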
Kleijn said that practicing meditation has helped her sharpen her control over these mental states. “One thing we’ve discovered that’s really interesting,” she said, “is that if I play something extremely difficult, the ‘meditation’ line goes up. I think it could be that the music was so difficult that I could only think of one thing, playing the music.”
In the future, this application of EEG technology could lead to a new class of music in which the mental state of the performer is integrated into the music in real time. For this to happen, however, a great deal of progress in real-time signal processing is still needed. The approach could also prove valuable in the clinical world, in the field of music therapy.
Information provided by: chicagoreader.com