A naturalistic approach to studying temporal processing during music performance

Publication Year
2021

Type
Journal Article

Abstract
In recent years, the push to embrace naturalistic stimuli over artificial designs has enriched what we know about the neural underpinnings of human attention, memory, and communication in real life. Previous work using natural stories scrambled at the word, sentence, and paragraph levels has revealed a hierarchy of brain regions that organize natural acoustic input at these different timescales. While this approach has advanced our understanding of language processing, far fewer studies to date have explored the neural underpinnings of music perception, let alone music production, in naturalistic settings. In our novel paradigm, we asked expert pianists to play musical pieces, scrambled at different timescales (measure, phrase, section), on a non-ferromagnetic piano keyboard inside the fMRI scanner. This dataset provides unprecedented access to expert musicians’ brains from their first exposure to a novel piece through the course of learning to play it. We found distinct patterns of tuning to musical timescales across several clusters of brain regions (e.g., sensory/motor, parietal, and frontal/memory). We also found that musical predictability affects functional connectivity between auditory, motor, and higher-order regions during performance. Finally, we applied several machine learning analyses to understand how the brain dynamically represents acoustic and musical features.
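
To make the scrambling manipulation concrete, here is a minimal sketch, not code from the study: it assumes a symbolic piece represented as an ordered list of measures, with illustrative phrase and section lengths (phrase_len, section_len) that are not the study's actual parameters. It shows how a piece could be reordered at the measure, phrase, or section timescale while leaving the material within each segment intact.

```python
import random

def scramble_piece(measures, timescale, phrase_len=4, section_len=16, seed=0):
    """Return a copy of `measures` with segments shuffled at the given timescale.

    timescale: "measure", "phrase", or "section" (the three levels named in the abstract).
    phrase_len and section_len are illustrative values, not the study's parameters.
    """
    seg_len = {"measure": 1, "phrase": phrase_len, "section": section_len}[timescale]
    # Cut the piece into consecutive segments of seg_len measures each.
    segments = [measures[i:i + seg_len] for i in range(0, len(measures), seg_len)]
    # Shuffle the order of the segments, keeping each segment's internal order.
    rng = random.Random(seed)
    rng.shuffle(segments)
    # Flatten back into a single sequence of measures.
    return [m for seg in segments for m in seg]

# Example: a 32-measure piece scrambled at the phrase level.
piece = [f"m{i:02d}" for i in range(1, 33)]
print(scramble_piece(piece, "phrase"))
```

The logic of such a manipulation is that shuffling larger segments preserves more of the local musical structure, which is what allows the paradigm to dissociate brain regions tuned to short versus long musical timescales.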

Journal
The Journal of the Acoustical Society of America

Volume
150

Issue
4