Researchers examine how multilingual BERT models encode grammatical features
- February 22, 2021
- Richard Futrell, language science, Tech Xplore
"This is a particularly exciting time to be studying computational linguistics," said Richard Futrell, a language scientist at the University of California, Irvine and another of the project's senior advisors. "For years, linguists have talked about ideas like 'semantic space,' thinking of the meanings of words and phrases as points in some space, but it was all somewhat vague and impressionistic. Now, these theories have been made completely precise: We actually have a model where the meaning of a word is a point in space, and that model really does behave in a way that suggests it understands (some of) human language."
For the full story, please visit https://techxplore.com/news/2021-02-multilingual-bert-encode-grammatical-features.html.