Overlapping Semantic Representations of Sign and Speech in Novice Sign Language Learners

Abstract

The presence of semantic information in multivariate patterns of neural activity has been explored as a method of measuring knowledge and learning. Using fMRI, we investigated whether novice learners of American Sign Language (ASL) showed overlapping representations of semantic categories for words presented in a well-known (English) or newly learned (ASL) language. We found evidence of neural patterns that were partially shared between sign and speech in novice participants, suggesting that even brief learning can influence neural representations in cross-modal language processing.
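
The abstract summarizes rather than details the analysis, but the general approach it alludes to, decoding semantic categories from multivariate voxel patterns and testing whether a classifier trained on one language transfers to the other, can be illustrated with a small sketch. Everything below (synthetic data, a linear SVM, and all parameter values) is an illustrative assumption, not the study's actual pipeline.

```python
# Hypothetical sketch of cross-language semantic decoding (not the paper's pipeline).
# Train a linear classifier on voxel patterns evoked by English words, then test it on
# ASL trials; above-chance transfer would indicate partially shared semantic representations.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_voxels, n_trials, n_categories = 200, 120, 4

# Synthetic stand-in for trial-wise activity estimates: each semantic category has a
# weak shared pattern that appears (with noise) in both the English and ASL trials.
category_patterns = rng.normal(size=(n_categories, n_voxels))
labels = rng.integers(0, n_categories, size=n_trials)
english_X = category_patterns[labels] + rng.normal(scale=3.0, size=(n_trials, n_voxels))
asl_X = category_patterns[labels] + rng.normal(scale=3.0, size=(n_trials, n_voxels))

# Fit on the well-known language (English), evaluate transfer to the newly learned one (ASL).
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(english_X, labels)
transfer_accuracy = clf.score(asl_X, labels)
print(f"Cross-language decoding accuracy: {transfer_accuracy:.2f} "
      f"(chance = {1 / n_categories:.2f})")
```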

Publication
Proceedings of the 44th Annual Conference of the Cognitive Science Society (CogSci), 2022.