The interplay between musical training and speech perception continues to intrigue researchers in language and music alike. Historically, language function has been attributed to brain regions localized predominantly in the left hemisphere, whereas music has been attributed to right-hemisphere-dominant regions. Recent studies demonstrating neural overlap in the processing of speech and music, together with enhanced speech perception and production in musicians, suggest that these regions may be inextricably intertwined. The extent of neural overlap between music and speech remains hotly debated, with surprisingly little empirical research exploring specific neural homologs and analogs. Moreover, despite recognition that shared processes likely exist throughout development and depend on an individual's acoustic experiences, even less research exists on how overlapping neural structures for music and language are shaped by developmental trajectories. Nonetheless, the field is well poised to address key empirical questions, in part because of the recent development of new theories addressing the neural and developmental interaction between music and language processing, in conjunction with the broad availability of sophisticated tools for quantifying brain activity and dynamics.

To understand the overlap of neural structures for language and music processing, research is needed to identify the specific functions of each that influence the other; areas supporting enhanced perception of pitch and onset time have already been targeted. Research is also needed to identify the extent to which this overlap develops in infancy or early childhood and the process by which it affects neural reorganization, plasticity, and trainability in adulthood.

For this research topic, we would like to further explore the relationship between language and music in the brain from two perspectives: 1) understanding the nature of shared neural and cognitive processing for music and language, and 2) understanding the developmental trajectory of these neural systems and how they are influenced by experience. We seek to gather technically diverse original research articles that present new empirical findings relevant to understanding:

1. When, in the brain, acoustic information becomes processed specifically as language or music.
2. The shared and independent neural structures for processing music and language.
3. How acoustic experiences such as musical training influence the overlap of neural structures for language and music.
4. How the overlap of processing regions changes over time due to experiences at any developmental stage.