Neuroscience

Articles and news from the latest research reports.

The BCMI-MIdAS (Brain-Computer Music Interface for Monitoring and Inducing Affective States) project

The central purpose of the project is to develop technology for building innovative intelligent systems that monitor a listener's affective state and induce specific affective states through music, automatically and adaptively. This is a highly interdisciplinary project, addressing technical challenges at the interface between science, technology and performing arts/music, and incorporating computer-generated music and machine learning.
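The closed loop implied here — estimate the listener's state from EEG, then adapt musical parameters to steer it toward a target — can be sketched minimally. The proportional gain, the tempo range, and the assumption that arousal tracks tempo are all illustrative stand-ins, not the project's actual design:

```python
def adapt_tempo(tempo_bpm, measured_arousal, target_arousal, gain=20.0):
    """One step of a proportional controller: nudge tempo toward the
    target arousal level (gain and clamp range are hypothetical)."""
    tempo = tempo_bpm + gain * (target_arousal - measured_arousal)
    return max(60.0, min(180.0, tempo))  # clamp to a plausible musical range

# Toy simulation: pretend the listener's arousal tracks tempo linearly
# (a stand-in for a real EEG-derived estimate).
tempo, target = 120.0, 0.2
for _ in range(10):
    measured = (tempo - 60.0) / 120.0 - 0.5  # fake arousal in [-0.5, 0.5]
    tempo = adapt_tempo(tempo, measured, target)
```

With this toy response model, the loop settles geometrically toward the tempo whose (simulated) arousal matches the target; a real system would replace the fake estimate with an on-line EEG classifier.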

Research questions

  • How can music change affective states and what are the specific musical traits (i.e., the parameters of a piece of music) that elicit such states?
  • How can we control such traits in a piece of music in order to induce specific affective states in a participant?
  • How can we effectively detect information about affective states induced by music in the EEG signal, going beyond EEG asymmetry and characterising information contained in synchronisation patterns?
  • How can we use the EEG to monitor the affective state induced by music on-line (i.e., in “real-time”)?
  • How can we produce a generative music system capable of generating music embodying musical traits aimed at inducing specific affective states, observable in the EEG of the participant?
  • How can we build an intelligent adaptive system for monitoring and inducing affective states through music on-line?
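Two of the EEG features mentioned above — frontal alpha asymmetry and phase synchronisation between channels — have standard textbook formulations. Below is a minimal numpy-only sketch; the channel roles (e.g. F3/F4), band limits and sampling rate are illustrative, and a real pipeline would add artifact rejection, windowing and epoching:

```python
import numpy as np

def band_power(x, fs, lo=8.0, hi=13.0):
    """Total spectral power of x in the [lo, hi] Hz band (alpha by default)."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    return psd[(freqs >= lo) & (freqs <= hi)].sum()

def alpha_asymmetry(left, right, fs):
    """Frontal alpha asymmetry: log alpha power of the right channel
    (e.g. F4) minus log alpha power of the left channel (e.g. F3)."""
    return np.log(band_power(right, fs)) - np.log(band_power(left, fs))

def analytic_signal(x):
    """FFT-based Hilbert transform (numpy-only stand-in for scipy.signal.hilbert)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def plv(x, y):
    """Phase-locking value between two channels: 1.0 when their phase
    difference is constant over time, near 0 when phases are unrelated."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))
```

For two clean 10 Hz sinusoids a quarter-cycle apart, `plv` returns approximately 1, since their phase difference is constant; synchronisation-based features like this are what "going beyond EEG asymmetry" points at.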

(Source: cmr.soc.plymouth.ac.uk)
