
Brain Research
Available online at www.sciencedirect.com (ScienceDirect)
www.elsevier.com/locate/brainres

Research Report

Differential parietal and temporal contributions to music perception in improvising and score-dependent musicians, an fMRI study

Robert Harris a,b,c,*, Bauke M. de Jong a,b

a Department of Neurology, University Medical Center Groningen, University of Groningen, Hanzeplein 1, P.O. Box 30.001, 9700 RB Groningen, The Netherlands

b BCN Neuroimaging Center, University of Groningen, A. Deusinglaan 2, 9713 AW Groningen, The Netherlands

c Prince Claus Conservatoire, Hanze University of Applied Sciences, Veemarktstraat 76, 9724 GA Groningen, The Netherlands

ARTICLE INFO


Article history:
Accepted 30 June 2015
Available online 20 July 2015

Keywords:
Music improvisation
Score-dependency
Aural perception
Audiomotor transformation
Parietal cortex
Premotor cortex

ABSTRACT

Using fMRI, cerebral activations were studied in 24 classically-trained keyboard performers and 12 musically unskilled control subjects. Two groups of musicians were recruited: improvising (n = 12) and score-dependent (non-improvising) musicians (n = 12). While listening to both familiar and unfamiliar music, subjects either (covertly) appraised the presented music performance or imagined they were playing the music themselves. We hypothesized that improvising musicians would exhibit enhanced efficiency of audiomotor transformation reflected by stronger ventral premotor activation. Statistical Parametric Mapping revealed that, while virtually 'playing along' with the music, improvising musicians exhibited activation of a right-hemisphere distribution of cerebral areas including posterior-superior parietal and dorsal premotor cortex. Involvement of these right-hemisphere dorsal stream areas suggests that improvising musicians recruited an amodal spatial processing system subserving pitch-to-space transformations to facilitate their virtual motor performance. Score-dependent musicians recruited a primarily left-hemisphere pattern of motor areas together with the posterior part of the right superior temporal sulcus, suggesting a relationship between aural discrimination and symbolic representation. Activations in bilateral auditory cortex were significantly larger for improvising musicians than for score-dependent musicians, suggesting enhanced top-down effects on aural perception. Our results suggest that learning to play a music instrument primarily from notation predisposes musicians toward aural identification and discrimination, while learning by improvisation involves audio-spatial-motor transformations, not only during performance, but also perception.

© 2015 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

""Corresponding author at: Department of Neurology, University Medical Center Groningen, Hanzeplein 1, P.O. Box 30.001, 9700RB Groningen, The Netherlands.

E-mail addresses: r.i.harris@pl.hanze.nl (R. Harris), b.m.de.jong@umcg.nl (B.M. de Jong).

http://dx.doi.org/10.1016/j.brainres.2015.06.050


[Fig. 1 image. Panel columns: MoIm and AtLis vs. baseline sound (p<0.05, FWE, k=8), vs. MUC (p<0.001, uncorr., k=8), vs. IPRO (p<0.001, uncorr., k=8); MoIm-AtLis contrast-estimate plots for familiar (fa) and unfamiliar (un) music.]

Fig. 1 - Cerebral activations related to Motor Imagery and Attentive Listening. The upper panel (A-E) shows MoIm-related increases of regional activation in improvising and score-dependent musicians rendered onto the surface of a standard anatomical brain volume (Montreal Neurological Institute, SPM 2005). The lower panel (A-C) shows AtLis-related increases of regional activation. Statistical significance levels are indicated at the bottom of each column: p<0.05 (FWE) k=8 denotes cluster-level significance, FWE-corrected for the entire brain volume, extent threshold 8 voxels; p<0.001 (uncorr.) k=8 denotes voxel-level significance without brain volume correction. (F) Plots show contrast estimates and 90% confidence intervals for effects of interest at right parietal and auditory foci of activation. MoIm = Motor Imagery, AtLis = Attentive Listening, IPRO = improvising musicians (n=12), SD = score-dependent musicians (n=12), MUC = musically unskilled control subjects (n=12), post.SPC = posterior superior parietal cortex, Audit.ctx = auditory cortex, R = right hemisphere, fa = familiar music, un = unfamiliar music.

1. Introduction

Music has been studied in a variety of disciplines ranging from aesthetics (Reitsma et al., 2014) and musicology (Patel et al., 2006) to kinesiology (Toiviainen et al., 2010), physiology (Jabusch et al., 2005), and psychology (Juslin and Sloboda, 2011). However, music is not only a unique object of study in itself; it also offers a novel perspective on the functional organization of the brain (Zatorre, 2005). Behavioral and neuroscientific studies have used music to study motor control (Shaffer, 1981), mental rotation (Cupchik et al., 2001), imitation (Jones, 2006), plasticity (Münte et al., 2002; Gaser and Schlaug, 2003; D'Ausilio et al., 2006; Jäncke, 2009; Habib and Besson, 2009; Herholz and Zatorre, 2012), cortical organization (Jäncke et al., 2006), and even the evolution of language (Fitch, 2010). A particularly fruitful approach has been the study of audiomotor integration in music performance. The interaction between auditory and motor domains has gained relevance since the discovery of neurons with both sensory and motor properties in area F5 of the macaque (Di Pellegrino et al., 1992). Subsequent identification of a subclass of audiovisuomotor mirror neurons (Kohler et al., 2002) further supported the concept that there is no strict anatomic and functional distinction between perception and action (Friston, 2010). Functional brain imaging has offered evidence of an analogous mirror neuron system in humans (Pulvermüller et al., 2006; Aglioti and Pazzaglia, 2010).

Studies of perceptuomotor integration in music, however, have focused mainly on classically-trained musicians, frequently contrasted with non-musicians (Ohnishi et al., 2001; Baumann et al., 2007; Haslinger et al., 2005; Drost et al., 2005; Bangert et al., 2006; Mutschler et al., 2007; Herholz et al., 2008; Trimarchi and Luzzatti, 2011; Novembre and Keller, 2011; Stewart et al., 2013). This comparison restricts the scope of the study of the neural substrate of musical skill considerably. On a global scale, or in historical perspective, the Western classical performer is a unique category, differing from every other type of musician in the past and present (Nettl and Russell, 1998). While musicians all over the world have learned their art by imitation and improvisation, classical musicians learn to play their instrument from a printed music score (sheet music), right from the beginning of their musical training (Stewart et al., 2003). They are thus de facto score-dependent, meaning they always learn and/or perform their repertoire from sheet music, though commonly committing it to memory and performing by heart. The cultural bias toward visuomotor processing in these studies is evident from the fact that most have investigated activations elicited either by the visual perception of printed notation (Itoh et al., 2001) or the aural perception of memorized music (Parsons et al., 2005).

Various studies have demonstrated that frontoparietal activations could be elicited by aural perception of rehearsed

Table 1 - Regional activations related to Motor Imagery compared to baseline sound.

A. Improvising musicians

                              Left                                        Right
Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
IFG           47     -36    26     2     5.15      14
STG           22     -52   -16     6    15.63    5608        58    -6     4    19.85    3220
vPMC/dPMC   44/6     -50    10    24    10.94    s.c.        54     4    48    15.79    2062
ant. PC       40     -42   -48    48     6.66     579        50   -34    54     6.82     309
post. SPC      7     -30   -66    56     5.70      16        26   -68    54     6.47     228
cerebellum           -28   -58   -26     6.12      58        30   -58   -26     7.41     141

B. Score-dependent musicians

Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
IFG           47     -32    32     2     5.67      36
STG           22     -50   -14     6    15.09    5489        58    -6     4    14.65    3274
dPMC (vPMC)    6     -48     2    48    13.54    s.c.        50     6    54     9.22     580
ant. PC       40     -44   -40    48     7.82    s.c.
post. SPC      7     -24   -68    56     5.60      29
SMA            6                                               2     4    64     9.53     749
cerebellum           -26   -58   -26     6.46     103        30   -58   -26     8.09     308

p<0.05 (FWE-corrected); extent in voxels, k=8.

Task-related cerebral activations in improvising and score-dependent musicians. Coordinates refer to the voxel of maximum activation within a significant cluster. If the cluster includes a second local maximum more than 8.0 mm apart, this region is reported if it is also identified by other comparisons. The indicated regions correspond with Fig. 1. The x, y, z coordinates (in mm) are relative to the middle of the anterior commissure. Left and Right indicate the side of the brain in which activations were identified.

IFG = inferior frontal gyrus; dPMC = dorsal premotor cortex; vPMC = ventral premotor cortex; STG = superior temporal gyrus (auditory cortex); ant.PC = anterior parietal cortex; post.SPC = posterior superior parietal cortex; SMA = supplementary motor area; BA = Brodmann's Area; s.c. = same cluster.

but not unrehearsed music (Bangert and Altenmüller, 2003; Mutschler et al., 2007; D'Ausilio et al., 2006; Lahav et al., 2007). That unrehearsed music did not elicit significant activation is plausible, given that classically-trained musicians are generally unable to play by ear. However, we hypothesized that in improvising musicians, frontoparietal activations would be elicited by unrehearsed pieces as well. This is in line with studies demonstrating that frontoparietal activations can be elicited by unrehearsed speech (Watkins and Paus, 2004; Wilson et al., 2008) or by a familiar style of dance (Calvo-Merino et al., 2005). It is also possible that frontoparietal activation in response to a novel stimulus might differ significantly from activation elicited by well-rehearsed stimuli. Posterior parietal and dorsal premotor cortices have been implicated in novel spatial transformations into motor commands (Johnson et al., 1996; Wise et al., 1997; Sakata et al., 1997). It could be hypothesized that frontoparietal activations in response to unrehearsed music would involve this network.

In the present study, using fMRI, we contrasted cerebral activations in improvising musicians (n = 12) with activations in score-dependent musicians (n=12) under two conditions: 1) Motor Imagery (listening to music and imagining that one is playing it) and 2) Attentive Listening (listening in order to assess the performance). During Motor Imagery, subjects were instructed to imagine playing the music they heard, without overt hand movement. During Attentive Listening, they were instructed to covertly verbalize their assessment, a task designed to stimulate aural discrimination without inducing virtual performance. Our main hypothesis was that,

during virtual keyboard playing, enhanced efficiency of audiomotor transformation in improvising musicians would evoke stronger ventral premotor activation when compared with score-dependent musicians. In order to counter activations caused primarily by prior rehearsal of the music, aural stimuli included not only relatively familiar excerpts subjects had played a number of times previously, but also completely unfamiliar pieces, composed specifically for the experiment. The two groups of musicians were further contrasted with a third group of musically-unskilled control subjects (n =12).

2. Results

Both in improvising and score-dependent musicians, a common distribution of Motor Imagery-related activations (when compared to baseline sound) was observed in the auditory and premotor cortex (PMC) of both hemispheres, the left parietal cortex, and bilateral cerebellum (Fig. 1A; Table 1). Improvising musicians exhibited additional right-hemisphere activations at both posterior-superior and anterior parietal locations which were not seen in score-dependent musicians.

2.1. Motor Imagery

Comparing Motor Imagery in improvising musicians with the same condition in musically unskilled controls revealed extensive bilateral PMC and right parietal activations (p<0.05, cluster-corrected), together with activation in the auditory cortex which only reached this cluster-corrected

Table 2 - Regional activations related to Motor Imagery, comparisons between groups.

A. Improvising musicians vs. musically unskilled controls

                              Left                                        Right
Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-s         6                                              30    -6    70     6.93    1768
dPMC-i         6     -24    -4    54     5.17     267        52     4    48     7.49    s.c.
vPMC        6/44     -50     6    26     6.39     816        54    12    26     8.31    s.c.
STG           42     -58   -14    12     5.26     272
SPC            7                                              26   -68    54     5.83    1767

B. Score-dependent musicians vs. musically unskilled controls

Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-i         6     -48     2    48     7.07     687
STS           22                                              64   -42     6     5.02     344

C. Improvising vs. score-dependent musicians

Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-s         6                                              28    -6    68     4.86     216

p<0.05 (cluster-corrected) following voxel-level p<0.001; extent in voxels, k=8.

Conventions are similar to Table 1. In addition to improvising and score-dependent musicians, the group of musically unskilled controls is included.

dPMC-s = superior part of dorsal premotor cortex; dPMC-i = inferior part of dPMC; STG = superior temporal gyrus (auditory cortex); SPC = superior parietal cortex; STS = superior temporal sulcus; BA = Brodmann's Area; s.c. = same cluster.

Table 3 - Regional activations related to Motor Imagery compared to Attentive Listening.

A. Improvising musicians

                                  Left                                        Right
Region            BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-s             6     -22     0    72     8.54     606        26    -6    68     7.62     604
vPMC               6     -56    10    34     6.88     253
ant. PC           40     -40   -34    42     6.50     235        42   -30    40     6.10     212
post. SPC          7     -20   -70    56     6.57      98        24   -68    54     7.44     182
precuneus         19     -26   -78    44     5.34      14
lat. occip. sulc. 19     -26   -68    30     5.1        8

B. Score-dependent musicians

Region            BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-s             6     -24     2    64     7.41     526        28    -4    56     5.41      21
vPMC            6/44     -54     8    30     6.51     202
SMA                6                                               8    -2    58     5.21      42
ant. PC           40     -44   -38    48     7.16     546
post. SPC          7     -24   -68    56     6.50      80        18   -60    56     5.40      31

p<0.05 (FWE-corrected); extent in voxels, k=8.

Conventions are similar to Table 1.

dPMC-s = superior part of dorsal premotor cortex; vPMC = ventral premotor cortex; ant.PC = anterior parietal cortex; post.SPC = posterior superior parietal cortex; lat.occip.sulc. = lateral occipital sulcus; SMA = supplementary motor area; BA = Brodmann's Area.

significance level in the left hemisphere (Fig. 1B; Table 2 A). Within the PMC, a general division of three main foci could be discerned. Aside from the ventral PMC (vPMC), the dorsal PMC (dPMC) could be split in superior and inferior segments. The right parietal activation was characterized by a posterior-superior maximum adjacent to anterior parietal activation, consisting of two centers along the intraparietal sulcus also identified in the contrast of Motor Imagery with baseline in improvising musicians. In score-dependent musicians, the comparison with musically unskilled controls resulted in significant Motor Imagery-related activations in the inferior

part of left dPMC and along the right posterior superior temporal sulcus (STS) (Fig. 1B, Table 2B). The patterns of activation that resulted from contrasting each group of musicians with the musically unskilled controls (at p<0.001, uncorrected) further highlight both overlap and differences between the groups (Fig. 1B).

When contrasting improvising musicians directly with score-dependent musicians, significant Motor Imagery-related activation was seen in the superior part of the right dPMC (p<0.05, cluster-corrected) (Table 2C) while, at a lower threshold, Motor Imagery-related increases were additionally

Table 4 - Regional activations related to Attentive Listening, comparison between groups.

A. Improvising musicians vs. musically unskilled controls

                              Left                                        Right
Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-i         6     -50    -4    46     4.80     395        52     2    48     6.67     511
dPMC-s         6                                              30    -4    70     4.79    s.c.
vPMC          44                                              52    12    26     6.99     393
SMG           40                                              52   -34    16     4.77     840
ant. PC       40                                              60   -34    42     4.61    s.c.
STG           42     -58   -14    14     5.08     248

B. Score-dependent musicians vs. musically unskilled controls

Region        BA      x     y     z   T-value  extent        x     y     z   T-value  extent
dPMC-i         6     -48     2    48     5.10     491
vPMC           6     -58     8    12     3.98    s.c.        54    14    22     5.18     292
SMG           40     -54   -36    28     4.81     215
STS           22                                              66   -44     8     6.11     773

p<0.05 (cluster-corrected) following voxel-level p<0.001; extent in voxels, k=8.

Conventions are similar to Table 1.

dPMC-s = superior part of dorsal premotor cortex; dPMC-i = inferior part of dPMC; vPMC = ventral premotor cortex; SMG = supramarginal gyrus; ant.PC = anterior parietal cortex; STG = superior temporal gyrus (auditory cortex); STS = superior temporal sulcus; BA = Brodmann's Area; s.c. = same cluster.

observed in right ventral PMC, Supplementary Motor Area (SMA) and predominantly right auditory cortex, further suggesting a right-hemisphere bias (Fig. 1C, upper panel). Plots of activations in the auditory cortex indicated not only that activations in improvising musicians were significantly larger than in score-dependent musicians, but also that, in the right hemisphere, similar magnitudes were seen for score-dependent musicians and musically unskilled controls (Fig. 1F). No significant increases in Motor Imagery-related activations were found when score-dependent musicians were contrasted to improvising musicians, even at a lower threshold: p<0.001 (uncorrected).

Contrasting Motor Imagery with Attentive Listening in both improvising and score-dependent musicians revealed significant activations specifically attributed to Motor Imagery (Fig. 1D). While a shared pattern of left-hemisphere areas was observed in both dorsal and ventral PMC and parietal regions, improvising musicians exhibited more robust activation of right dPMC as well as the posterior-superior and anterior parietal cortex (Table 3). Direct comparison of tasks (Motor Imagery > Attentive Listening) between groups (improvising musicians > score-dependent musicians) underscored the improvisation-specific relation of activations in the right posterior-superior parietal cortex and superior part of dorsal PMC with Motor Imagery (Fig. 1E and F).

2.2. Attentive Listening (aural assessment)

In both improvising and score-dependent musicians, the distribution of Attentive Listening-related activations exhibited a general similarity to the Motor Imagery-related activations (Fig. 1A, lower panel) with one exception: the parietal and most dorsal PMC regions identified in Motor Imagery contributed little or nothing to the pattern of activations seen in Attentive Listening. On the other hand, when contrasted to Attentive Listening in musically unskilled controls,

Fig. 2 - Cerebral activations related to familiar music contrasted with unfamiliar music. (A) At p<0.05 (cluster-corrected), significant left vPMC activation was seen when comparing familiar with unfamiliar music in all musicians. At p<0.001 (uncorrected; k=40), two additional activations were seen in left posterior SPC (A) and left pallidum (B). Plots show the contrast estimates and 90% confidence intervals for the effects of interest at the (x, y, z) coordinates in left vPMC (A), illustrating an increased response to familiar music that is present only in musicians and stronger in MoIm than in AtLis, while the pallidum effect (B) is restricted to MoIm but similarly present in all three groups. vPMC = ventral premotor cortex; other abbreviations are explained in the legend of Fig. 1.

significant parietal activations were seen in improvising musicians, albeit restricted to right anterior parietal regions (Fig. 1B, lower panel; Table 4). While no significant Motor Imagery-related activation increases were seen in score-dependent musicians when contrasted directly with

improvising musicians, a specifically Attentive Listening-related increase of activation (p<0.001, uncorrected) was found along the right STS (Fig. 1C, lower panel; x=66, y=-46, z=8; T=3.75).

Comparison of score-dependent musicians with musically unskilled controls further underscored the dominant involvement of the right STS in score-dependent musicians, not only in Attentive Listening (Table 4B) but also in Motor Imagery (Fig. 1B, upper panel; Table 2B). Comparison of Attentive Listening-related activations in improvising musicians with those in score-dependent musicians indicated a similar right-hemisphere dominance as seen in Motor Imagery (Fig. 1C, lower panel). These activations, however, did not reach corrected cluster-level significance, although the effect, particularly in the inferior part of the dorsal PMC (x=52, y=2, z=48; T=5.35), may be considered robust (p=0.037, cluster-level uncorrected).

2.3. Familiarity

While effects in the regions implicated in Motor Imagery were generally similar for familiar and unfamiliar music, we also found activations elicited more by familiar than by unfamiliar music. Contrasting the familiar conditions (Motor Imagery and Attentive Listening) with the unfamiliar conditions in all musicians together revealed significant activation (p<0.05, cluster-corrected) in the left ventral PMC (x=-48, y=6, z=28; T=4.88), while at p<0.001 (uncorrected; k=40) additional activations were seen in the left posterior-superior parietal cortex and left pallidum (Fig. 2). Plotting these effects demonstrated the following intriguing dissociation: the familiarity effect in the left ventral PMC was associated with both Motor Imagery and Attentive Listening, but only in musicians, while the increase of left pallidum activation (x=-20, y=-2, z=6; T=4.24) was restricted to familiar Motor Imagery, but with a similar magnitude in all three groups, including musically unskilled controls (Fig. 2).

3. Discussion

Both Motor Imagery and Attentive Listening activated bilateral premotor and auditory cortex, implicating a common processing mechanism. The absence of parietal activation during Attentive Listening, however, suggests that the aural discrimination task used here was capable of facilitating preferential processing within the ventral stream, while Motor Imagery facilitated processing within the dorsal stream, albeit to very different extents in the two populations of musicians. The participation of the putative dorsal stream network subserving motor control (Johnson et al., 1996; Sakata et al., 1997; Wise et al., 1997) suggests a role for spatial processing in the aural perception of music in the context of motor imagery and performance.

3.1. Spatial processing

While score-dependent musicians exhibited no significant Motor Imagery-related activation of right parietal cortex, improvising musicians exhibited robust activation, particularly of right posterior-superior parietal cortex. This

activation is especially interesting in the light of the ongoing debate concerning the lateralization of mental rotation and spatial attention (Corballis, 1997; Ditunno and Mann, 1990; Corballis and Sergent, 1989; Jordan et al., 2001). Various EEG, PET, and rTMS studies have demonstrated right-parietal involvement in mental rotation (De Jong et al., 1999; Harris et al., 2000; Johnson et al., 2002; Zacks et al., 2003), particularly in males (Hugdahl et al., 2006). Using rTMS, Harris and Miniussi (2003) found interference with mental rotation when disrupting activity in the right superior parietal lobe (SPL) in a time window of 400-600 ms after stimulus onset. Podzebenko et al. (2002) found right dominance in otherwise bilateral processing. Milivojevic et al. (2009) found significantly faster processing in right parietal cortex than in left, which would suggest functional participation of right parietal cortex in spatially-driven motor processes within the dorsal stream.

Kosslyn (1987) suggested that lateralization of language and spatial attention could be functionally complementary. This proposal has found recent support in a study demonstrating that left-handed individuals with right-hemisphere lateralization for language exhibit complementary left-hemisphere lateralization for spatial attention (Cai et al., 2013). Functions attributed to right SPL include spatial attention (Dehaene et al., 2003), mental rotation (Jeannerod et al., 1995; Beste et al., 2010; Gogos et al., 2010), the discrimination of auditory streams (Cusack, 2005), numerical distance (Pinel et al., 2001), computational approximation (Stanescu-Cosson et al., 2000), degree of luminance (Pinel et al., 2004), and the spatial representation of numbers (Göbel et al., 2006).

3.2. Pitch-to-space transformations

Behavioral studies have recently revived the discussion concerning spatial aspects of pitch. In the original studies, higher pitches were perceived as emanating from spatially higher origins (Pratt, 1930; Roffler and Butler, 1968), while Mudd (1963) found a diagonal correspondence between pitch and real space, higher pitches being up and to the right, and lower pitches down and to the left. Rusconi et al. (2006) essentially replicated that result, finding higher accuracy and faster responses when a response to a low pitch was performed with a spatially lower key and vice versa, but also when a response to a low pitch was performed with a spatially left key and vice versa, irrespective of which hand performed the response. Using a similar paradigm, Lidji et al. (2007) observed a possible effect of musical training. Using a modified Stroop test, Stewart et al. (2004) found vertical-to-horizontal visuo-motor mapping in pianists, while Taylor and Witt (2015) found that pianists responded faster to visual stimuli when the movement towards the stimulus corresponded with the direction of the scale they heard.

More recently, the processing of music permutations has been associated with mental rotation (Cupchik et al., 2001). Musical processing functions have been attributed to right intraparietal sulcus (IPS), including retrograde musical transformations (Zatorre et al., 2010), transposition (Foster and Zatorre, 2010), and the transformation of pitches into spatial coordinates (Brown et al., 2013). Rauschecker (2014) suggests a function for posterior parietal cortex in learning and storing musical sequences.

The observed activations in right anterior and posterior-superior parietal cortex therefore suggest that improvising musicians engage a neural system dedicated not only to spatial attention and mental rotation, but also to pitch-to-space transformations and musical permutations. This system has been shown to be an integral component of the dorsal stream processing network subserving motor control (Johnson et al., 1996; Sakata et al., 1997; Wise et al., 1997). The concurrent activations in right dPMC may thus form a part of this network. Although we did not perform a formal functional connectivity analysis to quantify dynamics of cortico-cortical interactions, the coherent distribution of task-related parietal-premotor activations points towards common functional involvement mediated by a dedicated network. The absence of significant right-hemisphere parietal activations in score-dependent musicians suggests that they are less able to realize the pitch-to-space transformations necessary for an appropriate motor response to the aural perception of music. That does not exclude the possibility that they might exhibit similar right-hemisphere parietal activations to the visual perception of the music score, an hypothesis which has been tested in a number of studies.

Using PET, Sergent et al. (1992) found activation of right posterior-superior parietal cortex while pianists sight-read music (played an unfamiliar piece from notes). Using fMRI, Schön et al. (2002) found right-lateralized activation of posterior-superior parietal cortex and IPS when contrasting the sight-reading of music with the reading of numerical notation, while Stewart et al. (2003) found activation of bilateral posterior-superior parietal cortex after a short training period in which beginners were taught to play piano from notation.

3.3. Top-down effects

The role the putative dorsal network may play in aural processing becomes apparent upon examination of the activations of auditory cortex across groups. In contrast with both score-dependent musicians and musically unskilled controls, improvising musicians exhibited significantly larger activation of bilateral auditory cortex, suggesting not only specialization of auditory cortex for music processing but possibly also enhanced top-down effects on the processing of music in auditory association areas. In fact, no significant differences whatsoever were found between the auditory activations in score-dependent musicians and musically unskilled controls, indicating that score-dependent musicians derived no benefit from expertise-related top-down effects on aural processing.

This is in line with the results of Vuust et al. (2012) who, in their comparison of jazz musicians with classical musicians, found enhanced brain responses (mismatch negativity) not only to pitch, but also to timbre, location, intensity, and rhythm. In a behavioral study, Woody and Lehmann (2010) demonstrated that 'vernacular' (i.e., improvising) musicians outperformed 'formal' (i.e., score-dependent) musicians in aural learning, the latter requiring twice the number of trials to achieve accuracy in vocal reproduction of a melody and almost three times as many trials to achieve accuracy in instrumental reproduction of a melody (by ear).

3.4. Ventral stream processing

Preferential reliance on ventral stream processing in score-dependent musicians is suggested not only by the absence of Motor Imagery-related activation in right parietal cortex, but also by the activation of right STS under both conditions as well as significant Attentive Listening-related activation of right inferior frontal areas, which have been identified as the source of the early right anterior negativity (ERAN) elicited by music-syntactical deviants (Koelsch, 2006). Right posterior STS, an area homologous to the location of activations elicited by the categorical discrimination of phonemes in left STS (Liebenthal et al., 2005), has been shown to be activated during the categorical discrimination of chords (Klein and Zatorre, 2011).

Musicians are trained to use pitch, interval, and chord labels to discriminate not only the aural signal, but also music notation and, in the case of pianists and organists, the keys of the instrument. It could be argued that, in score-dependent musicians, the aural perception of music is channeled primarily through this symbolic system and therefore preferentially processed in left-hemisphere perisylvian areas normally involved in speech. By virtue of their training, improvising musicians process music in this putative left-hemisphere network as well, though not exclusively.

3.5. Bimanual motor imagery

An important feature of the experimental design was the use of two-part harmony designed to elicit bimanual motor imagery. The historical development of the keyboard with low-frequency pitches on the left and high-frequency pitches on the right can be seen as a cultural phenomenon based on the biologically determined pitch-to-space correspondence evident in right SPL (Brown et al., 2013). Similarly, the frequent assignment of the most virtuosic role to the right hand can be viewed as a consequence of left-hemisphere dominance, not only for language, but also for handedness and manual dexterity (Knecht et al., 2000).

It is, however, also tempting to consider the role of the left hand in Western keyboard music, and of the bass line in general, in the light of right-hemisphere dominance for pitch-to-space transformations and its contralateral control of the left hand. The bass line in Western music has achieved a unique status as the literal foundation of the harmonic structure. Adding a bass line to a melody almost unequivocally defines the harmony. The complex spatial demands made on motor control by the harmonic structure are completely different from the demands made by predominantly right-hand motor sequences like scales and passages. The fact that, in keyboard music, this role has been delegated to the left hand could therefore also be a consequence of right-hemisphere dominance for spatial attention. The significantly enhanced activation of right posterior-superior parietal cortex in improvising musicians suggests top-down influences of implicit harmonic knowledge on the perception of music, making it more than plausible that they should be superior in realizing music based on the aural signal alone, without the aid of the score.

3.6. Left hemisphere dominance

It is possible that the results of the present study could be construed as support for the idea that score-dependent musicians process music in the left hemisphere, and improvising musicians in the right hemisphere. However, the only consistent difference in left-hemisphere activations between the two groups was a significantly larger activation of left auditory cortex in improvising musicians. The substantial right-lateralized parietal and premotor activations revealed in improvising musicians should therefore be seen against the background of the left-hemisphere activations shared by improvising and score-dependent musicians alike.

The significantly larger activations of improvising musicians in the right hemisphere, in comparison with musically unskilled controls (Harris and de Jong, 2014), might also seem to conflict with the general view that expert musicians process music in the left hemisphere and non-musicians in the right (Fabbro et al., 1990; Evers et al., 1999; Itoh et al., 2001; Bhattacharya and Petsche, 2005; Ellis et al., 2012; Ellis et al., 2013). The results of the present study suggest, however, that right-hemisphere dominance in improvising musicians is associated with their unique ability to perform audio-spatial-motor transformations on music they hear, transformations that are associated with a dedicated right-hemisphere dorsal network of parietal and premotor areas.

What the results do not suggest, however, is that left-hemisphere processing of music is unique to score-dependent musicians. On the contrary, the extent of Motor Imagery-related activation of left parietal cortex was significantly larger in improvising musicians and, as far as premotor activation in the left hemisphere is concerned, there was no significant difference in extent or location. Therefore, the left-hemisphere dominance consistently found in (score-dependent) classical musicians in the past apparently points to important processing mechanisms shared by improvising and score-dependent musicians alike. The original argument attempting to account for left-hemisphere processing in expert musicians was that trained musicians listened analytically while naive listeners listened holistically (Bever and Chiarello, 1974). The present finding that professional, conservatory-trained, improvising musicians exhibit significantly larger activations in the right hemisphere than non-musicians places caveats on that argument. A more salient proposition is perhaps the parallel with left-hemisphere dominance in tool manipulation (Choi et al., 2001; Johnson-Frey et al., 2005; Lewis, 2006). From that perspective, music instruments could essentially be seen as tools, extending the range of human musical possibilities beyond the voice (Leman, 2008). The idea of a functional relation between the voice and a music instrument would be consistent with the common left-hemisphere dominance in manual dexterity and language processing (Ambrose, 2010; Pulvermüller and Fadiga, 2010).

3.7. Familiarity

No significant difference was found between the two groups of musicians in activations elicited by familiar and unfamiliar music, suggesting that both improvising and score-dependent musicians process familiar music in a similar

manner. The finding that familiar music was specifically associated with activation of a left-hemisphere distribution of parietal and ventral premotor areas, together with basal ganglia activation (Shadmehr and Krakauer, 2008), suggests that the recognition of rehearsed music (Lohse et al., 2014) involves top-down recruitment of the motor repertoire involved in playing the pieces. As in language and tool-use (Lewis, 2006), these activations appear to be left-lateralized, suggesting furthermore that the putative left-lateralized auditory mirror neuron system could be involved in the recognition of familiar auditory sequences (Aziz-Zadeh et al., 2004; Gazzola et al., 2006).

4. Conclusions

The results of this study suggest that during Motor Imagery-elicited aural processing, expert musicians share a left-lateralized network of cerebral areas dedicated to manual dexterity and music notation, but that improvising musicians additionally recruit a right-lateralized cerebral network dedicated to spatially-driven motor control. This network functions as an amodal processing system subserving pitch-to-real-space transformations. It can be argued that enhanced bilateral activation of auditory association areas in improvising musicians during both Motor Imagery and Attentive Listening is an effect of top-down influences on aural perception primarily due to the participation of this right-lateralized dorsal-stream network.

5. Experimental methods

The study was approved by the Medical Ethics Committee of the University Medical Center Groningen, Groningen, The Netherlands. All subjects gave written informed consent in accordance with the Declaration of Helsinki (2008), prior to participation.

5.1. Subjects

The two groups of professional musicians were included on the basis of their self-reported ability or inability to improvise, a criterion which we consider to be a strong index of (non) score-dependency. There was, however, no difference between groups with regard to their ability to read music notation, all participants having studied classical piano or pipe organ at the conservatory. The improvising musicians (mean age 43.3 y (±14.5); 27-68 years; 12 male, 11 right-handed) were recruited, with the exception of one pianist, from the population of church organists, who in the Netherlands have largely maintained the practice of improvisation that was prevalent in the eighteenth century, but has virtually disappeared among classically-trained pianists. The population of improvising church organists in the Netherlands is largely male, making it necessary to recruit predominantly male pianists in order to avoid a gender confound. Twelve score-dependent pianists (mean age 37.1 y (±11.5); 21-59 years; 10 right-handed; 11 male) reported that they were not improvising performers. Musically unskilled control

subjects (mean age 43.7 y (±9.6); 26-63 years; 12 male; 11 right-handed) reported being unable to play a music instrument. None of the subjects had neurological, ophthalmologic, or upper extremity disorders.

5.2. Experimental paradigm

Subjects performed one of two mental tasks while listening to polyphonic excerpts consisting of two voices of equal rhythmic and melodic salience. They were instructed either to imagine playing the presented music (Motor Imagery) on a keyboard instrument without overt movement or to give an ongoing commentary on the performance (Attentive Listening) without overt vocalization. The latter was designed to distract attention from the hands, thus enhancing motor-specific aspects of auditory-motor transformation in Motor Imagery, when contrasted to Attentive Listening. Activations attributed to covert vocalization during Attentive Listening could be expected to be similar in musicians and non-musicians.

Twenty-four of the 48 music excerpts were completely unfamiliar, having been composed specifically for the experiment by the researchers. The 24 'familiar' excerpts were selected mainly from the 18th century repertoire. Two weeks prior to scanning, sheet music of the 'familiar' excerpts was given to both groups of musicians while musically unskilled controls received a Compact Disc. To ensure familiarity, musicians were instructed to play through these pieces daily, while non-musicians were instructed to listen to the CD daily, keeping track of the number of times they did so. Prior to scanning, subjects were requested to rate the level of acquaintance with the 24 'familiar' pieces on a three-point scale (3 = good, 2 = moderate, 1 = poor). Improvising musicians played through each piece on average 5.2 times (±3.5); score-dependent musicians played through each piece on average 5.1 times (±1.6); while the mean number of times musically unskilled controls had listened to the CD of 'familiar' music excerpts in the weeks prior to scanning was 13.6 (±7.8). The resulting mean reported familiarity with these stimuli was 2.3 (±0.83) in improvising musicians, 2.5 (±0.35) in score-dependent musicians, and 2.2 (±0.67) in musically unskilled controls.

To avoid activations evoked just by the sound of one's own instrument, music excerpts were recorded on brass instruments, the bass voice on trombone or euphonium, the treble voice on trumpet or cornet. Students of the Prince Claus Conservatoire recorded these pieces in the sound studio of the School of Performing Arts, Hanze University of Applied Sciences, Groningen. Minor mistakes in interpretation, timing, and intonation were not edited out, allowing room for critical assessment of performance in the second task (Attentive Listening). Recordings were edited to uniform 26 s lengths in the studio, including a 2 s fade-out, and then normalized (max. amplitude -12 dB, Mazzoni normalization using Audacity) and saved in a Waveform audio file format (WAV). For a baseline condition we used a recording of natural sound (waves of the sea), edited to 14 s length including a 2 s fade-out and saved as non-normalized WAV audio file. Finally, oral commands were recorded and saved as WAV audio files.
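The stimulus editing described above (uniform excerpt length with a 2 s fade-out, followed by peak normalization to -12 dB) was performed in Audacity. The following is a minimal Python sketch of those two operations, shown only to make the procedure concrete; the file names are hypothetical placeholders and the normalization is a plain peak normalization, not Audacity's exact Mazzoni implementation.

```python
# Illustrative sketch, not the authors' actual pipeline (which used Audacity):
# apply a 2 s linear fade-out and peak-normalize a WAV excerpt to -12 dB full scale.
import numpy as np
import soundfile as sf

def prepare_excerpt(in_path, out_path, fade_s=2.0, peak_db=-12.0):
    data, sr = sf.read(in_path)              # data: (n_samples,) or (n_samples, n_channels)
    n_fade = int(fade_s * sr)
    fade = np.linspace(1.0, 0.0, n_fade)     # linear 2 s fade-out
    if data.ndim == 2:
        fade = fade[:, None]                 # broadcast over stereo channels
    data[-n_fade:] = data[-n_fade:] * fade
    target = 10.0 ** (peak_db / 20.0)        # -12 dB relative to full scale (~0.25)
    data = data * (target / np.max(np.abs(data)))
    sf.write(out_path, data, sr)

prepare_excerpt("excerpt_raw.wav", "excerpt_26s.wav")   # placeholder file names
```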

Prior to scanning, an oral instruction was given concerning the two tasks. During the acquisition of MR images, each music

excerpt was presented once, embedded in a 48 s cycle containing one of two short (three-syllable) oral commands indicating the task, either Motor Imagery or Attentive Listening, followed by the music excerpt and the baseline sound bite (waves of the sea). The timing was as follows: 2 s command, 2 s silence, 26 s music presentation, 2 s silence, 14 s baseline sound (waves of the sea), and 2 s silence. Four cycles were grouped together in one block, containing all four experimental conditions in random order: (1) Motor Imagery familiar, (2) Motor Imagery unfamiliar, (3) Attentive Listening familiar, and (4) Attentive Listening unfamiliar. In addition, the order of both familiar and unfamiliar musical excerpts was randomized for each subject. Twelve blocks were presented in two runs lasting 20 min each, between which a T1-weighted 3D anatomical scan was acquired. After the conclusion of the scan, a debriefing was conducted, inquiring into the performance of the tasks. Musicians reported continuous bimanual imagery during Motor Imagery. During Attentive Listening, covert comments mostly concerned synchronization, intonation, articulation, and style.
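A minimal sketch of the randomization scheme described above (four conditions in random order within each of the 12 blocks, excerpt order randomized per subject) is given below. The condition labels and function name are illustrative, not taken from the paper.

```python
# Sketch of the per-subject presentation schedule: 12 blocks x 4 cycles = 48 cycles,
# one cycle per excerpt (24 familiar, 24 unfamiliar).
import random

CONDITIONS = ["MoIm_familiar", "MoIm_unfamiliar", "AtLis_familiar", "AtLis_unfamiliar"]

def build_session(n_blocks=12, n_excerpts=24, seed=None):
    rng = random.Random(seed)
    familiar = rng.sample(range(n_excerpts), n_excerpts)      # shuffled familiar excerpt indices
    unfamiliar = rng.sample(range(n_excerpts), n_excerpts)    # shuffled unfamiliar excerpt indices
    schedule = []
    for block in range(n_blocks):
        order = CONDITIONS[:]
        rng.shuffle(order)                                     # all four conditions, random order
        for cond in order:
            pool = familiar if cond.endswith("_familiar") else unfamiliar
            schedule.append((block, cond, pool.pop()))         # assign the next excerpt
    return schedule

cycles = build_session()
assert len(cycles) == 48
```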

Subjects were placed supine in the bore of a 3 T MR system (Philips Intera, Best, The Netherlands), which was equipped with an 8-channel phased-array (SENSE) transmit/receive head coil. Lights were turned off and subjects were instructed to keep their eyes closed and not to move during the scan. Hands were positioned on white cushions, visible to the researchers on a television screen. This allowed monitoring of undesired hand movements, which, however, were not detected during any of the scans.

Sparse sampling of fMRI data started 12 s after the onset of each cycle, lasting 2 s, and was repeated at regular 16 s intervals, meaning that 2 s bursts of scanner noise were audible 8 s after onset of each music excerpt and again during music fade-out and during fade-out of baseline sound. Subjects listened by means of MR-compatible electrodynamic headphones (MR Confon GmbH, Magdeburg, Germany) that were connected to a standard PC with soundcard. The amplitude of the audio reception was attenuated by 5%. Before each scan, a sound-check was conducted to verify proper volume and stereo presentation by the headphones. Stimuli were delivered using Presentation 14.9.
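The within-cycle timing and the sparse-sampling schedule can be reconstructed from the figures given above. The sketch below lays out the event onsets and acquisition times and checks the volume count per run; the even split of the 12 blocks over the two runs is an assumption consistent with the stated run duration and volume count, not an explicit statement in the text.

```python
# Reconstruction of the 48 s cycle and the sparse-sampling acquisition onsets.
CYCLE = [("command", 2), ("silence", 2), ("music", 26),
         ("silence", 2), ("baseline_sound", 14), ("silence", 2)]   # 48 s total

def event_onsets(cycle=CYCLE):
    t, onsets = 0, {}
    for name, dur in cycle:
        onsets.setdefault(name, []).append(t)
        t += dur
    return onsets, t

onsets, cycle_len = event_onsets()
assert cycle_len == 48                               # music starts at 4 s, baseline at 32 s

# 2 s acquisitions start 12 s into each cycle and repeat every 16 s:
acq_onsets = [12 + 16 * i for i in range(3)]         # -> [12, 28, 44]
# 12 s is 8 s after music onset; 28-30 s falls in the music fade-out;
# 44-46 s falls in the baseline-sound fade-out, as described in the text.

cycles_per_run = 6 * 4                               # assumed: 6 blocks x 4 cycles per run
task_volumes = cycles_per_run * len(acq_onsets)      # 72 task volumes
print(task_volumes + 3)                              # 75, matching the stated volumes per run
                                                     # (including 3 initial dummy scans)
```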

The functional imaging session was divided into two twenty-minute runs, each consisting of 75 identical high-resolution T2*-sensitive gradient-echo echo-planar imaging (EPI) volume acquisitions (39 slices; repetition time: 16.0 s; echo time 30 ms; flip angle 90°; matrix 256 × 256 in axial orientation; resolution 3.5 × 3.5 × 3.5 mm). The acquisition volume was positioned in an oblique axial orientation, tilted backward, parallel to the AC-PC line. The first three scans, prior to the presentation of the stimuli, were only used to achieve stable image contrast and to trigger the start of stimulus delivery. These scans were discarded.

5.3. Analysis

Image processing and statistical analysis were conducted with Statistical Parametric Mapping (SPM) version 5 (2005, Wellcome Department of Cognitive Neurology, London, UK; http://www.fil.ion.ucl.ac.uk/spm), running in Matlab (The MathWorks Inc., Natick, MA). The functional imaging volumes were first corrected for motion effects using 3D rigid

body transformations. The anatomical images were coregistered to the functional volumes, and all images were normalized into Montreal Neurological Institute stereotaxic space and moderately smoothed using a Gaussian filter of 8 mm full width at half maximum (FWHM).
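The authors carried out these steps directly in SPM5 under Matlab. Purely as an illustration of the same chain (rigid-body realignment, coregistration, spatial normalization to MNI space, 8 mm FWHM smoothing), the sketch below uses nipype's SPM interfaces; the file names, template path, and output prefixes are hypothetical placeholders, and the steps are shown independently rather than as a connected workflow.

```python
# Illustrative preprocessing sketch (assumes nipype with SPM/Matlab available);
# not the authors' actual batch.
from nipype.interfaces import spm

# Motion correction of both runs by 3D rigid-body realignment
realign = spm.Realign(in_files=["run1_epi.nii", "run2_epi.nii"],
                      register_to_mean=True)
# Coregister the anatomical image to the (mean) functional volume
coreg = spm.Coregister(target="mean_epi.nii", source="anatomy_T1.nii")
# Normalize into MNI space (estimated from the anatomy, applied to the realigned EPIs)
normalize = spm.Normalize(template="T1_template.nii",          # SPM T1 template (placeholder path)
                          source="anatomy_T1.nii",
                          apply_to_files=["rrun1_epi.nii", "rrun2_epi.nii"])
# Smooth with an 8 mm FWHM isotropic Gaussian kernel
smooth = spm.Smooth(in_files=["wrrun1_epi.nii", "wrrun2_epi.nii"], fwhm=[8, 8, 8])

for step in (realign, coreg, normalize, smooth):
    step.run()
```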

Cortical activations were rendered onto the surface of a standard MNI brain. For the projection on brain slices, we used the standard MNI brain as well as the mean of the normalized anatomical images obtained from the studied subjects. For the statistical analysis of regional differences in cerebral activation, all conditions were modeled in a blocked design at subject level. To identify the distributions of activations related to cerebral processing beyond primary auditory processing in conditions one to four, each of these four conditions was contrasted to the baseline interval of natural sound at subject level, after which each contrast was separately analyzed at group level (second level: flexible factorial design; subject, group, condition) using one-sample t-tests. Differences between conditions Motor Imagery (1,2) and Attentive Listening (3,4) within each group, and for each of these conditions between groups, were analyzed by making the specific comparisons at second level. Similarly, the effect of familiarity (1,3) versus novelty (2,4) was assessed. The resulting set of voxel values for the indicated contrasts constituted the associated SPM of the t-statistics (SPM{T}).

Thresholds were initially set at voxel response height p<0.001 (uncorrected) with extent threshold k=8 voxels. As within-group comparisons in particular resulted in regional activations that fused into confluent clusters, an FWE-corrected voxel threshold of p<0.05 (k=8) was applied for these comparisons, demarcating independent clusters of significant activation (p<0.05, volume corrected). For between-group comparisons, clusters resulting from voxel-level analysis at p<0.001 (uncorrected), k=8, were subsequently assessed for statistical significance after brain volume correction (p<0.05). Conditions were assumed to be dependent and equally variant, whereas subjects were assumed to be independent and equally variant within each of the three groups. Plotting the condition effects for regional activations related to Motor Imagery and Attentive Listening, respectively, enabled the assessment of possible interdependency with the level of familiarity or novelty.
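To make the condition comparisons described above concrete, the sketch below writes out illustrative contrast-weight vectors over the four modeled conditions (each already contrasted against the baseline sound at subject level). These weights mirror standard SPM T-contrasts for the stated comparisons; they are not copied from the authors' design matrix.

```python
# Illustrative contrast weights over the four conditions
# (MoIm familiar, MoIm unfamiliar, AtLis familiar, AtLis unfamiliar).
import numpy as np

conditions = ["MoIm_fam", "MoIm_unfam", "AtLis_fam", "AtLis_unfam"]

contrasts = {
    "MoIm vs AtLis":            np.array([ 1,  1, -1, -1]) / 2.0,   # task effect
    "AtLis vs MoIm":            np.array([-1, -1,  1,  1]) / 2.0,
    "familiar vs unfamiliar":   np.array([ 1, -1,  1, -1]) / 2.0,   # familiarity effect
    "MoIm (mean of fam/unfam)": np.array([ 1,  1,  0,  0]) / 2.0,   # average MoIm-vs-baseline effect
}

for name, weights in contrasts.items():
    print(f"{name:>24}: {dict(zip(conditions, weights))}")
```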

Acknowledgments

The authors are indebted to Hendrik de Boer, Wilbert Zwier, Jantina Rozema, and Salim Khan for their performances of the music excerpts, Klaas Pot for the recording and editing, Anita Kuipers for the MR scanning, and Dr. Remco Renken for his support with the statistical analysis. Financial support was obtained from the Gratama Foundation (Grant no. 201017), Harlingen, The Netherlands; the Hanze University of Applied Sciences, Groningen, The Netherlands; and Prince Claus Conservatoire and the Research Group Lifelong Learning in Music, Groningen, The Netherlands. Funding sources had no involvement in the study design, collection, analysis or interpretation of data, in the writing of the report, or in the decision to submit for publication.

REFERENCES

Aglioti, S.M., Pazzaglia, M., 2010. Representing actions through their sound. Exp. Brain Res. 206 (2), 141-151.

Ambrose, S.H., 2010. Coevolution of composite tool technology, constructive memory, and language. Curr. Anthropol. 51 (S1), S135-S147.

Aziz-Zadeh, L., Iacoboni, M., Zaidel, E., Wilson, S., Mazziotta, J., 2004. Left hemisphere motor facilitation in response to manual action sounds. Eur. J. Neurosci. 19 (9), 2609-2612.

Bangert, M., Altenmüller, E.O., 2003. Mapping perception to action in piano practice: a longitudinal DC-EEG study. BMC Neurosci. 4 (1), 26.

Bangert, M., Peschel, T., Schlaug, G., Rotte, M., Drescher, D., Hinrichs, H., Heinze, H.-J., Altenmüller, E., 2006. Shared networks for auditory and motor processing in professional pianists: evidence from fMRI conjunction. NeuroImage 30 (3), 917-926.

Baumann, S., Koeneke, S., Schmidt, C.F., Meyer, M., Lutz, K., Jäncke, L., 2007. A network for audio-motor coordination in skilled pianists and non-musicians. Brain Res. 1161, 65-78.

Beste, C., Heil, M., Konrad, C., 2010. Individual differences in ERPs during mental rotation of characters: lateralization, and performance level. Brain Cogn. 72 (2), 238-243.

Bever, T.G., Chiarello, R.J., 1974. Cerebral dominance in musicians and non-musicians. Science 185 (4150), 537-539.

Bhattacharya, J., Petsche, H., 2005. Phase synchrony analysis of EEG during music perception reveals changes in functional connectivity due to musical expertise. Signal Process. 85 (11), 2161-2177.

Brown, R.M., Chen, J.L., Hollinger, A., Penhune, V.B., Palmer, C., Zatorre, R.J., 2013. Repetition suppression in auditory-motor regions to pitch and temporal structure in music. J. Cogn. Neurosci. 25 (2), 313-328.

Cai, Q., Van der Haegen, L., Brysbaert, M., 2013. Complementary hemispheric specialization for language production and visuospatial attention. Proc. Natl. Acad. Sci. 110 (4), E322-E330.

Calvo-Merino, B., Glaser, D.E., Grèzes, J., Passingham, R.E., Haggard, P., 2005. Action observation and acquired motor skills: an fMRI study with expert dancers. Cereb. Cortex 15 (8), 1243-1249.

Choi, S., Na, D.L., Kang, E., Lee, K., Lee, S., Na, D., 2001. Functional magnetic resonance imaging during pantomiming tool-use gestures. Exp. Brain Res. 139 (3), 311-317.

Corballis, M.C., Sergent, J., 1989. Hemispheric specialization for mental rotation. Cortex 25 (1), 15-25.

Corballis, M.C., 1997. Mental rotation and the right hemisphere. Brain Lang. 57 (1), 100-121.

Cupchik, G.C., Phillips, K., Hill, D.S., 2001. Shared processes in spatial rotation and musical permutation. Brain Cogn. 46 (3), 373-382.

Cusack, R., 2005. The intraparietal sulcus and perceptual organization. J. Cogn. Neurosci. 17 (4), 641-651.

D'Ausilio, A., Altenmüller, E., Olivetti Belardinelli, M., Lotze, M., 2006. Cross-modal plasticity of the motor cortex while listening to a rehearsed musical piece. Eur. J. Neurosci. 24 (3), 955-958.

De Jong, B.M., Frackowiak, R.S.J., Willemsen, A.T.M., Paans, A.M.J., 1999. The distribution of cerebral activity related to visuomotor coordination indicating perceptual and executional specialization. Cogn. Brain Res. 8 (1), 45-59.

Dehaene, S., Piazza, M., Pinel, P., Cohen, L., 2003. Three parietal circuits for number processing. Cogn. Neuropsychol. 20 (3-6), 487-506.

Di Pellegrino, G., Fadiga, L., Fogassi, L., Gallese, V., Rizzolatti, G., 1992. Understanding motor events: a neurophysiological study. Exp. Brain Res. 91 (1), 176-180.

Ditunno, P.L., Mann, V.A., 1990. Right hemisphere specialization for mental rotation in normals and brain damaged subjects. Cortex 26 (2), 177-188.

Drost, U.C., Rieger, M., Brass, M., Gunter, T.C., Prinz, W., 2005. When hearing turns into playing: movement induction by auditory stimuli in pianists. Q. J. Exp. Psychol. Sect. A 58 (8), 1376-1389.

Ellis, R.J., Norton, A.C., Overy, K., Winner, E., Alsop, D.C., Schlaug, G., 2012. Differentiating maturational and training influences on fMRI activation during music processing. NeuroImage 60 (3), 1902-1912.

Ellis, R.J., Bruijn, B., Norton, A.C., Winner, E., Schlaug, G., 2013. Training-mediated leftward asymmetries during music processing: a cross-sectional and longitudinal fMRI analysis. NeuroImage 75, 97-107.

Evers, S., Dannert, J., Rodding, D., Rotter, G., Ringelstein, E.B., 1999. The cerebral haemodynamics of music perception: a transcranial Doppler sonography study. Brain 122 (1), 75-85.

Fabbro, F., Brusaferro, A., Bava, A., 1990. Opposite musical-manual interference in young vs. expert musicians. Neuropsychologia 28 (8), 871-877.

Fitch, W.T., 2010. The Evolution of Language. Cambridge University Press, Cambridge.

Foster, N.E., Zatorre, R.J., 2010. A role for the intraparietal sulcus in transforming musical pitch information. Cereb. Cortex 20 (6), 1350-1359.

Friston, K., 2010. The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11 (2), 127-138.

Gaser, C., Schlaug, G., 2003. Brain structures differ between musicians and non-musicians. J. Neurosci. 23 (27), 9240-9245.

Gazzola, V., Aziz-Zadeh, L., Keysers, C., 2006. Empathy and the somatotopic auditory mirror system in humans. Curr. Biol. 16 (18), 1824-1829.

Göbel, S.M., Calabria, M., Farnè, A., Rossetti, Y., 2006. Parietal rTMS distorts the mental number line: simulating 'spatial' neglect in healthy subjects. Neuropsychologia 44 (6), 860-868.

Gogos, A., Gavrilescu, M., Davison, S., Searle, K., Adams, J., Rossell, S.L., Bell, R., Davis, S.R., Egan, G.F., 2010. Greater superior than inferior parietal lobule activation with increasing rotation angle during mental rotation: an fMRI study. Neuropsychologia 48 (2), 529-535.

Habib, M., Besson, M., 2009. What do music training and musical experience teach us about brain plasticity?. Music Percept. 26, 279-285.

Harris, I.M., Egan, G.F., Sonkkila, C., Tochon-Danguy, H.J., Paxinos, G., Watson, J.D., 2000. Selective right parietal lobe activation during mental rotation: a parametric PET study. Brain 123 (1), 65-73.

Harris, I.M., Miniussi, C., 2003. Parietal lobe contribution to mental rotation demonstrated with rTMS. J. Cogn. Neurosci. 15 (3), 315-323.

Harris, R., de Jong, B.M., 2014. Cerebral activations related to audition-driven performance imagery in professional musicians. PLoS ONE 9 (4), e93681.

Haslinger, B., Erhard, P., Altenmüller, E., Schroeder, U., Boecker, H., Ceballos-Baumann, A.O., 2005. Transmodal sensorimotor networks during action observation in professional pianists. J. Cogn. Neurosci. 17 (2), 282-293.

Herholz, S.C., Lappe, C., Knief, A., Pantev, C., 2008. Neural basis of music imagery and the effect of musical expertise. Eur. J. Neurosci. 28 (11), 2352-2360.

Herholz, S.C., Zatorre, R.J., 2012. Musical training as a framework for brain plasticity: behavior, function, and structure. Neuron 76 (3), 486-502.

Hugdahl, K., Thomsen, T., Ersland, L., 2006. Sex differences in visuo-spatial processing: an fMRI study of mental rotation. Neuropsychologia 44 (9), 1575-1583.

Itoh, K., Fujii, Y., Suzuki, K., Nakada, T., 2001. Asymmetry of parietal lobe activation during piano performance: a high field functional magnetic resonance imaging study. Neurosci. Lett. 309 (1), 41-44.

Jabusch, H.C., Zschucke, D., Schmidt, A., Schuele, S., Altenmüller, E., 2005. Focal dystonia in musicians: treatment strategies and long-term outcome in 144 patients. Mov. Disord. 20 (12), 1623-1626.

Jäncke, L., Baumann, S., Koeneke, S., Meyer, M., Laeng, B., Peters, M., Lutz, K., 2006. Neural control of playing a reversed piano: empirical evidence for an unusual cortical organization of musical functions. Neuroreport 17 (4), 447-451.

Jäncke, L., 2009. Music drives brain plasticity. F1000 Biol. Rep. 1, 78, http://dx.doi.org/10.3410/B1-78.

Jeannerod, M., Arbib, M.A., Rizzolatti, G., Sakata, H., 1995. Grasping objects: the cortical mechanisms of visuomotor transformation. Trends Neurosci. 18 (7), 314-320.

Johnson, P.B., Ferraina, S., Bianchi, L., Caminiti, R., 1996. Cortical networks for visual reaching: physiological and anatomical organization of frontal and parietal lobe arm regions. Cereb. Cortex 6 (2), 102-119.

Johnson, B.W., McKenzie, K.J., Hamm, J.P., 2002. Cerebral asymmetry for mental rotation: effects of response hand, handedness and gender. Neuroreport 13 (15), 1929-1932.

Johnson-Frey, S.H., Newman-Norlund, R., Grafton, S.T., 2005. A distributed left hemisphere network active during planning of everyday tool use skills. Cereb. Cortex 15 (6), 681-695.

Jones, S.S., 2006. Exploration or imitation? The effect of music on 4-week-old infants' tongue protrusions. Infant Behav. Dev. 29 (1), 126-130.

Jordan, K., Heinze, H.J., Lutz, K., Kanowski, M., Jäncke, L., 2001. Cortical activations during the mental rotation of different visual objects. NeuroImage 13 (1), 143-152.

Juslin, P.N., Sloboda, J. (Eds.), 2011. Handbook of Music and Emotion: Theory, Research, Applications. Oxford University Press, Oxford.

Klein, M.E., Zatorre, R.J., 2011. A role for the right superior temporal sulcus in categorical perception of musical chords. Neuropsychologia 49 (5), 878-887.

Knecht, S., Dräger, B., Deppe, M., Bobe, L., Lohmann, H., Flöel, A., Ringelstein, E.B., Henningsen, H., 2000. Handedness and hemispheric language dominance in healthy humans. Brain 123 (12), 2512-2518.

Koelsch, S., 2006. Significance of Broca's area and ventral premotor cortex for music-syntactic processing. Cortex 42 (4), 518-520.

Köhler, E., Keysers, C., Umiltà, M.A., Fogassi, L., Gallese, V., Rizzolatti, G., 2002. Hearing sounds, understanding actions: action representation in mirror neurons. Science 297 (5582), 846-848.

Kosslyn, S.M., 1987. Seeing and imagining in the cerebral hemispheres: a computational approach. Psychol. Rev. 94 (2), 148-175.

Lahav, A., Saltzman, E., Schlaug, G., 2007. Action representation of sound: audiomotor recognition network while listening to newly acquired actions. J. Neurosci. 27 (2), 308-314.

Leman, M., 2008. Embodied Music Cognition and Mediation Technology. MIT Press, Cambridge, MA.

Lewis, J.W., 2006. Cortical networks related to human use of tools. Neuroscientist 12 (3), 211-231.

Lidji, P., Kolinsky, R., Lochy, A., Morais, J., 2007. Spatial associations for musical stimuli: a piano in the head? J. Exp. Psychol.: Human Percept. Perform. 33 (5), 1189-1207.

Liebenthal, E., Binder, J.R., Spitzer, S.M., Possing, E.T., Medler, D.A., 2005. Neural substrates of phonemic perception. Cereb. Cortex 15 (10), 1621-1631.

Lohse, K.R., Wadden, K., Boyd, L.A., Hodges, N.J., 2014. Motor skill acquisition across short and long time scales: a meta-analysis of neuroimaging data. Neuropsychologia 59, 130-141.

Milivojevic, B., Hamm, J.P., Corballis, M.C., 2009. Hemispheric dominance for mental rotation: it is a matter of time. Neuroreport 20 (17), 1507-1512.

Mudd, S.A., 1963. Spatial stereotypes of four dimensions of pure tone. J. Exp. Psychol. 66 (4), 347-352.

Münte, T.F., Altenmüller, E., Jäncke, L., 2002. The musician's brain as a model of neuroplasticity. Nat. Rev. Neurosci. 3 (6), 473-478.

Mutschler, I., Schulze-Bonhage, A., Glauche, V., Demandt, E., Speck, O., Ball, T., 2007. A rapid sound-action association effect in human insular cortex. PLoS ONE 2 (2), e259.

Nettl, B., Russell, M. (Eds.), 1998. In the Course of Performance: Studies in the World of Musical Improvisation. University of Chicago Press, Chicago.

Novembre, G., Keller, P.E., 2011. A grammar of action generates predictions in skilled musicians. Conscious. Cogn. 20 (4), 1232-1243.

Ohnishi, T., Matsuda, H., Asada, T., Aruga, M., Hirakata, M., Nishikawa, M., Imabayashi, E., 2001. Functional anatomy of musical perception in musicians. Cereb. Cortex 11 (8), 754-760.

Parsons, L.M., Sergent, J., Hodges, D.A., Fox, P.T., 2005. The brain basis of piano performance. Neuropsychologia 43 (2), 199-215.

Patel, A.D., Iversen, J.R., Rosenberg, J.C., 2006. Comparing the rhythm and melody of speech and music: the case of British English and French. J. Acoust. Soc. Am. 119 (5), 3034-3047.

Pinel, P., Dehaene, S., Rivière, D., LeBihan, D., 2001. Modulation of parietal activation by semantic distance in a number comparison task. NeuroImage 14 (5), 1013-1026.

Pinel, P., Piazza, M., Le Bihan, D., Dehaene, S., 2004. Distributed and overlapping cerebral representations of number, size, and luminance during comparative judgments. Neuron 41 (6), 983-993.

Podzebenko, K., Egan, G.F., Watson, J.D., 2002. Widespread dorsal stream activation during a parametric mental rotation task, revealed with functional magnetic resonance imaging. NeuroImage 15 (3), 547-558.

Pratt, C.C., 1930. The spatial character of high and low tones. J. Exp. Psychol. 13 (3), 278-285.

Pulvermüller, F., Huss, M., Kherif, F., del Prado Martin, F.M., Hauk, O., Shtyrov, Y., 2006. Motor cortex maps articulatory features of speech sounds. Proc. Natl. Acad. Sci. 103 (20), 7865-7870.

Pulvermüller, F., Fadiga, L., 2010. Active perception: sensorimotor circuits as a cortical basis for language. Nat. Rev. Neurosci. 11 (5), 351-360.

Rauschecker, J.P., 2014. Is there a tape recorder in your head? How the brain stores and retrieves musical melodies. Front. Syst. Neurosci. 8, 149, http://dx.doi.org/10.3389/fnsys.2014.00149.

Reitsma, O., van Gerwen, R., de Munck, M. (Eds.), 2014. Muziek Ervaren: Essays over Muziek en Filosofie [Experiencing Music: Essays on Music and Philosophy]. Damon, Budel.

Roffler, S.K., Butler, R.A., 1968. Localization of tonal stimuli in the vertical plane. J. Acoust. Soc. Am. 43 (6), 1260-1266.

Rusconi, E., Kwan, B., Giordano, B.L., Umiltà, C., Butterworth, B., 2006. Spatial representation of pitch height: the SMARC effect. Cognition 99 (2), 113-129.

Sakata, H., Taira, M., Kusunoki, M., Murata, A., Tanaka, Y., 1997. The TINS Lecture: the parietal association cortex in depth perception and visual control of hand action. Trends Neurosci. 20 (8), 350-357.

Schön, D., Anton, J.L., Roth, M., Besson, M., 2002. An fMRI study of music sight-reading. Neuroreport 13 (17), 2285-2289.

Sergent, J., Zuck, E., Terriah, S., MacDonald, B., 1992. Distributed neural network underlying musical sight-reading and keyboard performance. Science 257 (5066), 106-109.

Shadmehr, R., Krakauer, J.W., 2008. A computational neuroanatomy for motor control. Exp. Brain Res. 185 (3), 359-381.

Shaffer, L.H., 1981. Performances of Chopin, Bach, and Bartok: studies in motor programming. Cogn. Psychol. 13 (3), 326-376.

Stanescu-Cosson, R., Pinel, P., van de Moortele, P.F., Le Bihan, D., Cohen, L., Dehaene, S., 2000. Understanding dissociations in dyscalculia: a brain imaging study of the impact of number size on the cerebral networks for exact and approximate calculation. Brain 123 (11), 2240-2255.

Stewart, L., Henson, R., Kampe, K., Walsh, V., Turner, R., Frith, U., 2003. Brain changes after learning to read and play music. NeuroImage 20 (1), 71-83.

Stewart, L., Walsh, V., Frith, U., 2004. Reading music modifies spatial mapping in pianists. Percept. Psychophys. 66 (2), 183-195.

Stewart, L., Verdonschot, R.G., Nasralla, P., Lanipekun, J., 2013. Action-perception coupling in pianists: learned mappings or spatial musical association of response codes (SMARC) effect?. Q. J. Exp. Psychol. 66 (1), 37-50.

Taylor, J.E.T., Witt, J.K., 2015. Listening to music primes space: pianists, but not novices, simulate heard actions. Psychol. Res. 79 (2), 175-182.

Toiviainen, P., Luck, G., Thompson, M.R., 2010. Embodied meter: hierarchical eigenmodes in music-induced movement. Music Percept. 28 (1), 59-70.

Trimarchi, P.D., Luzzatti, C., 2011. Implicit chord processing and motor representation in pianists. Psychol. Res. 75 (2), 122-128.

Vuust, P., Brattico, E., Seppänen, M., Näätänen, R., Tervaniemi, M., 2012. The sound of music: differentiating musicians using a fast, musical multi-feature mismatch negativity paradigm. Neuropsychologia 50 (7), 1432-1443.

Watkins, K., Paus, T., 2004. Modulation of motor excitability during speech perception: the role of Broca's area. J. Cogn. Neurosci. 16 (6), 978-987.

Wilson, S.M., Molnar-Szakacs, I., Iacoboni, M., 2008. Beyond superior temporal cortex: intersubject correlations in narrative speech comprehension. Cereb. Cortex 18 (1), 230-242.

Wise, S.P., Boussaoud, D., Johnson, P.B., Caminiti, R., 1997. Premotor and parietal cortex: corticocortical connectivity and combinatorial computations. Annu. Rev. Neurosci. 20 (1), 25-42.

Woody, R.H., Lehmann, A.C., 2010. Student musicians' ear-playing ability as a function of vernacular music experiences. J. Res. Music Educ. 58 (2), 101-115.

Zacks, J.M., Gilliam, F., Ojemann, J.G., 2003. Selective disturbance of mental rotation by cortical stimulation. Neuropsychologia 41 (12), 1659-1667.

Zatorre, R., 2005. Music, the food of neuroscience?. Nature 434 (7031), 312-315.

Zatorre, R.J., Halpern, A.R., Bouffard, M., 2010. Mental reversal of imagined melodies: a role for the posterior parietal cortex. J. Cogn. Neurosci. 22 (4), 775-789.