Researchers examine how multilingual BERT models encode grammatical features

- February 22, 2021
- Richard Futrell, language science, Tech Xplore
"This is a particularly exciting time to be studying computational linguistics," said Richard Futrell, a language scientist at University of California, Irvine and another of the project's senior advisors. "For years, linguists have talked about ideas like 'semantic space," thinking of the meanings of words and phrases as points in some space, but it was all somewhat vague and impressionistic. Now, these theories have been made completely precise: We actually have a model where the meaning of a word is a point in space, and that model really does behave in a way that suggests it understands (some of) human language."
For the full story, please visit https://techxplore.com/news/2021-02-multilingual-bert-encode-grammatical-features.html.
Related News Items
- A new model explains difficulty in language comprehension
- The evolutionary trait that may have led to human speech
- Losing parts of our voice box may have helped humans evolve to speak
- When was talking invented? A language scientist explains how this unique feature of human beings may have evolved
- Presentations from UCI faculty at AMP and AWC