Attributes of Language Which Ensure Learnability
The Center for Language Science and Department of Cognitive Sciences present
"Attributes of Language Which Ensure Learnability"
with Sean A. Fulop, Professor, Department of Linguistics; Program in Cognitive Science, Fresno State University
Wednesday, October 16, 2013
Social & Behavioral Sciences Gateway (SBSG), Room 1517
Natural language is learnable. This means that children can acquire the language of their surroundings using an (apparently) innate inductive process, given little more than natural speech in everyday settings. Yet from a formal perspective, most classes of languages rich enough to model human languages would not be learnable in this way—indeed, it is challenging to describe a language class that is both learnable and descriptively adequate. Investigators have been confronting this conundrum at least since Gold (1967) showed that standard language classes, such as the context-free languages, would not be learnable unless restricted in some fashion.
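To make the formal setting concrete, the following is a minimal sketch of Gold's "identification in the limit" paradigm for a toy class of finite languages; the hypothesis names and example strings are purely illustrative and are not drawn from the talk itself. The learner always guesses the earliest hypothesis in its enumeration that is consistent with everything seen so far, and on this finite class its guesses eventually stabilize on the target.

```python
# Toy hypothesis class: an enumerated list of finite languages.
# (Names and contents are illustrative assumptions, not from the talk.)
HYPOTHESES = [
    ("L1", {"a"}),
    ("L2", {"a", "ab"}),
    ("L3", {"a", "ab", "abb"}),
]

def guess(data_seen, hypotheses=HYPOTHESES):
    """Return the first hypothesis whose language contains every string seen.

    This is 'identification by enumeration': on each new datum, revise the
    guess to the earliest hypothesis still consistent with all the data.
    """
    for name, lang in hypotheses:
        if set(data_seen) <= lang:
            return name
    return None  # no hypothesis in the class fits the data

# A positive-only presentation of the target language L3.  After enough data,
# the learner's guess converges to "L3" and never changes again.
presentation = ["a", "ab", "abb", "a", "abb"]
guesses = [guess(presentation[: i + 1]) for i in range(len(presentation))]
print(guesses)  # guesses pass through L1 and L2 before stabilizing on L3
```

Gold's negative result arises when the class also contains an infinite language that includes all the finite ones: positive data alone can then never force the learner to stop over- or under-shooting, which is why restricting the class (or enriching the input) becomes necessary.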
In this presentation, Fulop will highlight some (conjectured) syntactic attributes of natural language that limit the class sufficiently to enable learning to succeed without extravagant helpings of Universal Grammar. In previous work (Fulop 2010, 2011), he outlined a discovery algorithm for formal languages that learns syntactic categories by semantic bootstrapping and distributional analysis (following Pinker 1984). A rich class of languages was shown to be learnable from a finite set of good examples—so long as the learner assumes the target language possesses certain properties, including a fundamental assumption of semantic compositionality. These properties and their utility in natural languages will be the main topic of discussion.
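As a rough illustration of what distributional analysis means here, the sketch below groups words of a tiny toy corpus into candidate categories by the contexts they share. This is a generic textbook version of the technique, not Fulop's actual algorithm, and the corpus is an invented example.

```python
from collections import defaultdict

# A toy corpus (illustrative assumption, not data from the talk).
corpus = [
    "the cat sleeps",
    "the dog sleeps",
    "the cat runs",
    "the dog runs",
]

# Collect the set of (left-neighbor, right-neighbor) contexts for each word,
# padding sentences with start/end markers.
contexts = defaultdict(set)
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]
    for i in range(1, len(words) - 1):
        contexts[words[i]].add((words[i - 1], words[i + 1]))

# Words with identical context sets are grouped into one candidate category,
# the core move of distributional category induction.
categories = defaultdict(set)
for word, ctx in contexts.items():
    categories[frozenset(ctx)].add(word)

groups = sorted(sorted(g) for g in categories.values())
print(groups)  # nouns, verbs, and the determiner fall into separate groups
```

In Fulop's setting, semantic bootstrapping supplies initial category seeds from meaning, and the distributional step extends those categories from the learner's finite set of good examples.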
For further information, please contact Clara Schultheiss, email@example.com.