Description
Organizing complex perceptual input in real time is crucial for our ability to interact with the world around us, and information received in the auditory modality in particular is central to many fundamental aspects of human behavior. In the Language and Music Perception Lab, we take an interdisciplinary approach to studying auditory perception. Our research encompasses cross-modal, music, and speech perception, using a combination of behavioral, cognitive neuroscience, and computational modeling approaches, and focuses on the importance of interactions, including the interaction of bottom-up and top-down processing and interactions within and across modalities. Our current studies employ behavioral testing, qualitative data collection, and electroencephalogram (EEG) neuroimaging to further our understanding of auditory perception and higher cognitive processes.
Language and Music Perception Lab Projects
Comments
Faculty Mentor: Laura Getz