Shasha Li, MD, PhD

Pronouns

She/Her/Hers

Rank

Instructor

Institution

MGH

BWH-MGH Title

Instructor

Department

Radiology

Authors

Kaisu Lankinen, Jyrki Ahveninen, Işıl Uluç, Tori Turpin, Jennifer Fiedler, Qiyuan Tian, Sheraz Khan, Aapo Nummenmaa, Qing-mei Wang, Marziye Eshghi, Jonathan R. Polimeni, Jordan R. Green, Teresa J. Kimberley, Shasha Li*

Role of Articulatory Motor Networks in Perceptual Categorization of Speech Signals: A 7T fMRI Study

My research interests center on developing a precise brain biomarker specific to auditory speech processing, both in healthy human subjects and in auditory communication disorders, using a combination of state-of-the-art multimodal neuroimaging approaches and noninvasive brain stimulation techniques. This work may offer unique insight into the cortical targeting of articulatory motor areas involved in auditory speech processing, which could subsequently be translated into advances in auditory communication rehabilitation. I participate in the Women in Medicine and Science Symposium to show my three sons a mother's strength in pursuing her academic dreams while working in Medicine and Science.

Background

The association between brain regions involved in speech production and those that play a role in the perception of acoustic signals is not yet fully understood. In this study, we compared brain activations that occurred during speech production with activations resulting from perceptual discrimination of vocalized sounds using ultra-high field 7 Tesla functional magnetic resonance imaging (fMRI) at 1 mm isotropic voxel resolution. 

Methods

Twenty subjects (7 men; mean age 29 ± 9.0 years) completed a speech perception task and a speech production task during fMRI at the Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital (MGH). The Institutional Review Board at MGH approved the study, and all subjects gave written informed consent. fMRI blood oxygenation level dependent (BOLD) signal data were obtained using a simultaneous multi-slice (SMS) echo planar imaging (EPI) acquisition (TR 2.8 s). In a phoneme discrimination task, subjects were presented with pairs of /ba/ or /da/ syllables (stimulus onset asynchrony 1 s), randomized as either identical sounds or sounds separated by three intervals along an 8-step continuum between the prototypic /ba/ and /da/ sounds. After each stimulus pair, subjects indicated whether the two syllables were the same or different by pressing one of two buttons with the right index or middle finger. All sounds were presented through MR-compatible, insert-style headphones at a comfortable volume. In a speech-sound production task, subjects were asked to silently produce the lip-rounded vowel /u/ in response to the visual cue "U" or to purse their lips in response to the cue "P". Cues were presented randomly every 2–4 s; 60 trials of each condition, interspersed with fixation trials, were collected over two runs.
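
As a rough illustration of the discrimination-trial structure described above, the following Python sketch builds a randomized trial list of "same" and "different" syllable pairs drawn from an 8-step continuum, with "different" pairs separated by three continuum steps. Trial counts, step indices, and the pairing rule are illustrative assumptions, not the study's actual stimulus script.

    import random

    CONTINUUM_STEPS = list(range(1, 9))  # steps 1 (/ba/) through 8 (/da/)
    STEP_SEPARATION = 3                  # "different" pairs are 3 steps apart
    SOA_S = 1.0                          # onset asynchrony within a pair

    def make_trials(n_same=30, n_diff=30, seed=0):
        """Build a randomized list of (first_step, second_step, label) trials."""
        rng = random.Random(seed)
        trials = [(s, s, "same")
                  for s in (rng.choice(CONTINUUM_STEPS) for _ in range(n_same))]
        for _ in range(n_diff):
            first = rng.choice(CONTINUUM_STEPS[:-STEP_SEPARATION])
            pair = [first, first + STEP_SEPARATION]
            rng.shuffle(pair)  # randomize presentation order within the pair
            trials.append((pair[0], pair[1], "different"))
        rng.shuffle(trials)
        return trials

    if __name__ == "__main__":
        for first, second, label in make_trials()[:5]:
            print(f"step {first}, then step {second} after {SOA_S} s ({label})")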

Results

Preliminary univariate fMRI analyses using a parametric modulation approach within a general linear model (GLM) indicated that BOLD activations related to phoneme category variability in the /ba/–/da/ discrimination task were strongest in the left precentral/premotor and inferior frontal cortical areas. Largely the same regions were also activated in the silent vocalization vs. lip-pursing task. Conversely, BOLD signal changes associated with purely acoustic variability of the sounds were strongest in the bilateral auditory cortices.
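
For readers unfamiliar with parametric modulation, the following Python sketch shows the general form of such a GLM: an unmodulated event regressor captures the main effect of the stimulus pairs, while a second regressor, weighted by a mean-centered modulator (here standing in for phoneme-category distance), captures BOLD variance that scales with category variability. The HRF, event timings, and the stand-in voxel time series are illustrative assumptions; only the TR (2.8 s) comes from the study.

    import numpy as np
    from scipy.stats import gamma

    TR = 2.8          # repetition time from the study's EPI acquisition
    N_SCANS = 200     # illustrative run length
    frame_times = np.arange(N_SCANS) * TR

    def hrf(t, peak=6.0, undershoot=16.0, ratio=1.0 / 6.0):
        """Canonical double-gamma hemodynamic response function."""
        return gamma.pdf(t, peak) - ratio * gamma.pdf(t, undershoot)

    def event_regressor(onsets, weights, frame_times, dt=0.1):
        """Convolve a weighted event train with the HRF, sampled at frame times."""
        grid = np.arange(0.0, frame_times[-1] + TR, dt)
        train = np.zeros_like(grid)
        for onset, w in zip(onsets, weights):
            train[int(onset / dt)] += w
        conv = np.convolve(train, hrf(np.arange(0.0, 32.0, dt)))[: len(grid)]
        return np.interp(frame_times, grid, conv)

    # Illustrative event onsets and a mean-centered parametric modulator
    # standing in for the phoneme-category distance of each /ba/-/da/ pair.
    onsets = np.arange(10.0, N_SCANS * TR - 20.0, 14.0)
    distance = np.random.default_rng(0).integers(0, 4, size=len(onsets))
    modulator = distance - distance.mean()

    X = np.column_stack([
        event_regressor(onsets, np.ones_like(onsets), frame_times),  # main effect
        event_regressor(onsets, modulator, frame_times),             # parametric term
        np.ones(N_SCANS),                                            # intercept
    ])
    y = np.random.default_rng(1).standard_normal(N_SCANS)  # stand-in voxel time series
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("parametric-modulation beta:", beta[1])

Mean-centering the modulator keeps the parametric term orthogonal to the main effect in expectation, so the two betas can be interpreted separately.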

Conclusions

The results support the hypothesis that articulatory-motor networks in the left hemisphere that activate during speech production also activate during perceptual categorization of acoustic signals.

Funding

Research reported in this abstract was supported by the National Institute on Deafness and Other Communication Disorders of the National Institutes of Health under Award Numbers K23DC018022, R01DC017991, R01DC016765, and R01DC016915.