Behavioral and neurophysiological evidence regarding the influence of oculomotor circuitry on auditory spatial tasks


October 5, 2017 - 3:00pm
Northwest Building, Room 425
About the Speaker
Adrian KC Lee
Associate Professor, University of Washington

Spatial cues play an important role in segregating auditory objects in a complex acoustical scene. Spatial attention is often considered to be supramodal; for example, crossmodal spatial cues can enhance the perception of stimuli in another modality when presented at the same location. It is therefore not surprising to find similarities between the auditory and visual spatial attentional networks. An outstanding question, however, is how this supramodal spatial attention network functions when the listener instead attends to a non-spatial acoustic feature, such as pitch. In vision, the coupling between oculomotor circuitry and the attentional network is well studied. Are there behavioral consequences of this tight oculomotor coupling in the context of auditory spatial tasks?

We addressed these questions using three approaches. First, in a series of neuroimaging experiments using combined magneto- and electroencephalography constrained by anatomical MRI data, we explored how different cortical regions are recruited for auditory spatial and non-spatial attention, both during maintenance of attention to a single auditory stream and during switching of attention between streams. Second, using our newly developed sparse-plus-low-rank graphical approach, which enables modeling of structured relationships between time series in a big-data setting, we have begun inferring functional connectivity between cortical regions to tease apart how different cortical nodes coordinate to perform different auditory attentional tasks. Finally, we used psychophysical methods to test whether there are behavioral consequences of the tight coupling between the oculomotor and attentional networks in auditory spatial tasks.

In this talk, I will present findings from our behavioral and neuroimaging experiments in which listeners perform auditory attentional or audiovisual tasks.
If time permits, I will also discuss our ongoing neuroengineering efforts to capitalize on the findings from these neuroimaging experiments to create a next-generation hearing aid that can selectively amplify the speech signal of interest according to the user's intent.