While I was a postdoc in the Mayberry and Halgren labs at UCSD, I investigated the brain networks involved in syntax in American Sign Language (ASL), adapting the same experimental paradigms we had used with spoken and written English. Our fMRI results replicate those of Pallier et al. (2011): activity in left-hemisphere language areas that correlates with syntactic complexity. In addition, we clearly distinguish the systems involved in processing the sensory properties of sign language from those supporting higher-level linguistic combinatorics. Our results indicate that sign and speech differ in the sensory systems used to extract language data from the environment, but that syntactic processing engages the same left-hemisphere language network. This suggests revising the ventral stream of Hickok & Poeppel (2007) to incorporate both speech and sign, with separate sensory processing and shared syntactic/semantic processing. We have also adapted this experimental paradigm to MEG in order to uncover the temporal profile of this activity, informed by parsing models.
THE CRITICAL PERIOD FOR LANGUAGE ACQUISITION
We are using these data as important baselines for investigating the critical (or sensitive) period for language acquisition. Many individuals who are born deaf to hearing parents receive no exposure to rich linguistic input until adolescence, when they begin instruction in ASL. In these individuals, acquisition of syntax beyond basic word order is severely impaired (Mayberry, 2002; Cheng et al., in preparation). We are currently scanning these individuals with MEG and fMRI, using simple, canonical sentence constructions that they do understand well, in order to determine whether their brains recruit the same networks for syntax as native signers of ASL or different ones, which will help us understand the fundamental nature of the critical period.