Cognitive Neuroscience Learning Resources
If you’d like to learn more about cognitive neuroscience, please visit this page, where you can find a list of recommended websites, podcasts, books, and social media accounts to follow. This list will be updated regularly. The Brain Space Initiative is a growing community of people interested in neuroimaging, and it is open to all!
Word Learning in Children
To download the preprocessed data and analysis scripts from the Malins et al. (2020) paper in Developmental Science, please visit the study page on the Open Science Framework here.
Individual Differences in Trial-by-Trial Neural Activation Variability
If you’d like to download the AFNI and R scripts used to calculate trial-by-trial variability in fMRI activation in the Malins et al. (2018) paper in The Journal of Neuroscience, you can find the relevant files on the Open Science Framework here. This paper had the honor of being recommended on F1000 Prime, classified as: Good for Teaching, Interesting Hypothesis, New Finding, and Technical Advance.
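The AFNI and R scripts on the Open Science Framework are the authoritative implementation. As a rough illustration of the general idea only (not the paper's exact pipeline), trial-by-trial variability can be summarized as the standard deviation, across trials, of single-trial activation estimates at each voxel. The function name and toy data below are hypothetical:

```python
import numpy as np

def trial_variability(betas):
    """Per-voxel standard deviation of single-trial activation estimates.

    betas: 2D array of shape (n_trials, n_voxels), e.g. single-trial
    beta estimates exported from an fMRI analysis package.
    Returns a 1D array of length n_voxels.
    """
    betas = np.asarray(betas, dtype=float)
    # Sample standard deviation across the trial dimension
    return betas.std(axis=0, ddof=1)

# Toy example: 4 trials, 3 voxels of simulated betas
rng = np.random.default_rng(0)
betas = rng.normal(size=(4, 3))
variability = trial_variability(betas)
print(variability.shape)  # one variability value per voxel
```

Higher values indicate voxels whose trial-level responses fluctuate more from trial to trial, which is the quantity the study relates to individual differences.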
The TRACE-T Computational Model of Mandarin Chinese Spoken Word Processing
When Shuai and Malins (2017) published the TRACE-T model in Behavior Research Methods, it received some attention as featured content on the Psychonomic Society blog. To read the full article, click here.
If you’d like to see a video demo of the TRACE-T model, click here.
The model makes use of the PatPho system for representing Mandarin Chinese phonology as well as the jTRACE platform. If you’d like to download the model to run simulations yourself, you can find the relevant files on the Open Science Framework here.
Examining Bilingual Lexical Access Using the Visual World Paradigm
To access the data and scripts from the Wang, Wang, and Malins (2017) study in Cognition examining lexical access in Mandarin-English bilinguals using the visual world paradigm, you can find the relevant files on the Open Science Framework here. Included in this set of files are R scripts to perform growth curve analyses of eyetracking data.
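The R scripts in the OSF repository are the authoritative analysis code. As a hedged sketch of what growth curve analysis involves (not the study's exact models), the core idea is to regress fixation proportions on orthogonal polynomial time terms, much like R's `poly(time, degree)`. The function names and toy data below are hypothetical:

```python
import numpy as np

def orthogonal_time_terms(time, degree=2):
    """Orthogonal polynomial time terms, analogous to R's poly(time, degree).

    Builds a Vandermonde matrix from the centered time vector and
    orthogonalizes its columns with QR decomposition.
    """
    t = np.asarray(time, dtype=float)
    V = np.vander(t - t.mean(), N=degree + 1, increasing=True)
    Q, _ = np.linalg.qr(V)
    return Q[:, 1:]  # drop the constant column; the intercept is added in the fit

def fit_growth_curve(time, proportion, degree=2):
    """Ordinary least squares fit of fixation proportion on time terms.

    Returns [intercept, linear, ..., degree] coefficients. (The published
    analyses would use mixed-effects models; plain OLS is shown here only
    to illustrate the polynomial-time structure.)
    """
    X = np.column_stack([np.ones(len(time)),
                         orthogonal_time_terms(time, degree)])
    coefs, *_ = np.linalg.lstsq(X, np.asarray(proportion, dtype=float),
                                rcond=None)
    return coefs

# Toy example: fixation proportion rising linearly over 10 time bins
time = np.arange(10)
prop = np.linspace(0.2, 0.8, 10)
coefs = fit_growth_curve(time, prop)
print(coefs.shape)  # intercept + linear + quadratic terms
```

Because the time terms are orthogonal to each other and to the intercept, the intercept captures the mean fixation proportion while the linear and quadratic terms capture the shape of the curve independently.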