I do behavioural and brain imaging research on how humans perceive and understand language. I study auditory speech, lipreading, text reading and sign language. I'm also interested in "absent" sensory experiences, including inner speech, voluntary imagery and hallucinations.
In my brain imaging work, I use Magnetic Resonance Imaging (MRI) and functional near infrared spectroscopy (fNIRS), with advanced multivariate statistical analysis techniques. I code in R, MATLAB and Python.
I'm interested in these kinds of questions:
- How do we extract linguistic meaning from variable sensory signals?
- How does the brain deal with competing language signals?
- Do people who are prone to psychosis process sounds and images differently?
- Do signed and spoken languages evoke similar meaning concepts?
- How does lipreading support text reading?
Key publications:
Evans, S., Price, C.J., Diedrichsen, J., Gutierrez-Sigut, E. and MacSweeney, M. 2019. Speech and sign share partially overlapping representations. Current Biology.
Evans, S. and McGettigan, C. 2017. Comprehending auditory speech: previous and potential contributions of functional MRI. Language, Cognition and Neuroscience. 32 (7), pp. 829-846.
Evans, S. and Davis, M.H. 2015. Hierarchical organization of auditory and motor representations in speech perception: Evidence from searchlight similarity analysis. Cerebral Cortex. 25 (12), pp. 4772-4788.
*Evans, S., *Kyong, J.S., Rosen, S., Golestani, N., Warren, J.E., McGettigan, C., Mourão-Miranda, J., Wise, R.J.S. and Scott, S.K. 2014. The pathways for intelligible speech: multivariate and univariate perspectives. Cerebral Cortex. 24 (9).