
Kyle Mahowald

University of Texas at Austin
Participating in 2 sessions

Kyle Mahowald is an Assistant Professor in the Department of Linguistics at the University of Texas at Austin. His research interests include learning about human language from language models, as well as how information-theoretic accounts of human language can explain observed variation within and across languages. Mahowald has published in computational linguistics (e.g., ACL, EMNLP, NAACL), machine learning (e.g., NeurIPS), and cognitive science (e.g., Trends in Cognitive Sciences, Cognition) venues. He has won an Outstanding Paper Award at EMNLP, as well as the National Science Foundation’s CAREER award. His training includes an M.Phil. in linguistics from Oxford, a Ph.D. in cognitive science from MIT, and a postdoc in the Stanford NLP group.

Talk

Using Language Models for Linguistics | June 7

Sessions featuring Kyle Mahowald

Friday, June 7, 2024

Time zone: (GMT-05:00) Eastern Time (US & Canada)
9:00 AM EDT - 5:00 PM EDT | 8 hours

Kyle Mahowald

Speaker

Kaiyu Yang

Speaker

Stephen Wolfram

Speaker
9:00 AM EDT - 10:30 AM EDT | 1 hour 30 minutes
Large Language Models: Applications, Ethics & Risks

Today’s large language models generate coherent, grammatical text. This makes it easy, perhaps too easy, to see them as “thinking machines”, capable of performing tasks that require abstract knowledge and reasoning. I will draw a distinction between formal competence (knowledge of linguistic rules and patterns) and functional competence (understanding and using language in the world). Language models have made huge progress in formal linguistic competence, with important implications for ling...