Kyle Mahowald is an Assistant Professor in the Department of Linguistics at the University of Texas at Austin. His research interests include learning about human language from language models, as well as how information-theoretic accounts of human language can explain observed variation within and across languages. Mahowald has published in computational linguistics (e.g., ACL, EMNLP, NAACL), machine learning (e.g., NeurIPS), and cognitive science (e.g., Trends in Cognitive Sciences, Cognition) venues. He has won an Outstanding Paper Award at EMNLP, as well as the National Science Foundation's CAREER award. His training includes an M.Phil. in linguistics from Oxford, a Ph.D. in cognitive science from MIT, and a postdoc in the Stanford NLP Group.
Talk
Using Language Models for Linguistics | Friday 7 June, 2024
Today’s large language models generate coherent, grammatical text. This makes it easy, perhaps too easy, to see them as “thinking machines” capable of performing tasks that require abstract knowledge and reasoning. I will draw a distinction between formal competence (knowledge of linguistic rules and patterns) and functional competence (understanding and using language in the world). Language models have made huge progress in formal linguistic competence, with important implications for ling...