The research promises to help us understand both how the mind reacts to music and how future brain interfaces could be developed to help people who can't communicate in the usual way – those with locked-in syndrome, for example.
Further down the line we could even be composing songs using our thoughts, according to the international team of researchers behind the study, though that kind of sci-fi concept is still some time away.
“Although it could be a long way to achieve these advanced reconstruction approaches, we present here as a first step in this direction an approach to identification of novel music pieces,” write the researchers in their published paper.
The experiments relied on an encoding-decoding model: a computer system monitored the brain activity patterns evoked by particular songs – which parts of the brain lit up, and when – and then tried to identify the right song again from the fMRI data alone.
Six volunteers were played 40 pieces of music spanning classical, rock, pop, jazz, and other genres. Software hooked up to the fMRI scanner was trained to measure brain activity against musical features including tonality, dynamics, rhythm, and timbre.
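The encoding-decoding idea can be sketched in a few lines. The toy below is a minimal illustration, not the study's actual pipeline: it assumes simulated data (the dimensions, the ridge regression, and the correlation-based matching are all illustrative stand-ins), learns a linear map from "brain activity" to musical features, and then identifies a held-out song by matching its predicted features against every candidate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, not taken from the study
n_voxels, n_songs, n_feats = 200, 40, 4  # e.g. tonality, dynamics, rhythm, timbre

# Simulated stand-in for fMRI data: each song has a feature vector, and the
# brain response is a noisy linear function of those features
song_features = rng.standard_normal((n_songs, n_feats))
true_weights = rng.standard_normal((n_feats, n_voxels))
brain_data = song_features @ true_weights + 0.1 * rng.standard_normal((n_songs, n_voxels))

def fit_decoder(X, Y, alpha=1.0):
    """Ridge regression mapping brain activity X to musical features Y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)

def identify(scan, candidates, W):
    """Predict features from one scan; return the index of the best-matching song."""
    pred = scan @ W
    scores = [np.corrcoef(pred, c)[0, 1] for c in candidates]
    return int(np.argmax(scores))

# Leave-one-out test: hold out each song, train on the rest, try to identify it
correct = 0
for i in range(n_songs):
    mask = np.arange(n_songs) != i
    W = fit_decoder(brain_data[mask], song_features[mask])
    correct += identify(brain_data[i], song_features, W) == i

print(f"Identified {correct}/{n_songs} held-out songs")
```

On clean simulated data like this, identification is far above the 1-in-40 chance level; the real study faced the much harder problem of noisy, slow fMRI signals.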