Researchers examine how multilingual BERT models encode grammatical features
- February 22, 2021
- Richard Futrell, language science, Tech Xplore
-----
"This is a particularly exciting time to be studying computational linguistics," said Richard Futrell, a language scientist at University of California, Irvine and another of the project's senior advisors. "For years, linguists have talked about ideas like 'semantic space," thinking of the meanings of words and phrases as points in some space, but it was all somewhat vague and impressionistic. Now, these theories have been made completely precise: We actually have a model where the meaning of a word is a point in space, and that model really does behave in a way that suggests it understands (some of) human language."
For the full story, please visit https://techxplore.com/news/2021-02-multilingual-bert-encode-grammatical-features.html.
-----
Would you like to get more involved with the social sciences? Email us at communications@socsci.uci.edu to connect.
Related News Items
- Bridging the communication gap between humans and AI
- New UC Irvine Center for Language, Intelligence, and Computation to serve as hub for scientific study of language
- Futrell receives grant for large language model study
- Can AI models show us how people learn? Impossible languages point a way.
- What is language, and how do human constraints shape it?

