What neural networks can teach us about how we learn language
What:
Talk
When:
9:00 AM, Tuesday, June 4, 2024 EDT
(1 hour 30 minutes)
Theme:
Large Language Models & Learning
How can modern neural networks like large language models be useful to the field of language acquisition, and to cognitive science more broadly, if they are not a priori designed to be cognitive models? As natural language understanding and generation have improved by leaps and bounds with models like GPT-4, the question of how such models can inform our understanding of human language acquisition has re-emerged. This talk will address how AI models, as objects of study, can indeed be useful tools for understanding how humans learn language. It will present three approaches to studying human learning behaviour using different types of neural networks and experimental designs, each illustrated through a specific case study.
Understanding how humans learn is an important problem for cognitive science and a window into how our minds work. Moreover, human learning remains in many ways the most efficient and effective algorithm there is for acquiring language, so understanding it can also help us design better AI models in the future.
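To make the connection concrete, the sketch below shows one way a neural network can serve as a research tool in acquisition work: computing word surprisal (how unpredictable a word is in its context) from an off-the-shelf language model, in the spirit of the surprisal predictor in Portelance, Duan, Frank, & Lupyan (2023), listed below. This is a minimal sketch, not that paper's exact setup: the model choice (GPT-2 via Hugging Face transformers) and the example sentence are illustrative assumptions.

```python
# Minimal sketch: per-token surprisal from a pretrained language model.
# Illustrative assumptions: GPT-2 as the model, a toy example sentence.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_surprisals(text: str) -> list[tuple[str, float]]:
    """Surprisal (in bits) of each token given its preceding context."""
    ids = tokenizer(text, return_tensors="pt").input_ids  # (1, seq_len)
    with torch.no_grad():
        logits = model(ids).logits  # (1, seq_len, vocab_size)
    # Log-probability the model assigns to each actual token, given its prefix.
    log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
    next_ids = ids[0, 1:]
    nll = -log_probs[torch.arange(next_ids.size(0)), next_ids]
    bits = (nll / math.log(2)).tolist()  # convert nats to bits
    tokens = tokenizer.convert_ids_to_tokens(next_ids.tolist())
    return list(zip(tokens, bits))

# Words that are harder to predict from context carry higher surprisal;
# scores like these can then be related to children's age of acquisition.
for tok, s in token_surprisals("The dog chased the red ball."):
    print(f"{tok!r}: {s:.2f} bits")
```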
References
Portelance, E., & Jasbi, M. (2023). The roles of neural networks in language acquisition. PsyArXiv:b6978. (Manuscript under review.)
Portelance, E., Duan, Y., Frank, M. C., & Lupyan, G. (2023). Predicting age of acquisition for children's early vocabulary in five languages using language model surprisal. Cognitive Science.
Portelance, E., Frank, M. C., Jurafsky, D., Sordoni, A., & Laroche, R. (2021). The emergence of the shape bias results from communicative efficiency. Proceedings of the 25th Conference on Computational Natural Language Learning (CoNLL).