Workshop: Grammars, Computation & Cognition
A workshop honouring the scientific legacy of Remko Scha
A workshop held on December 6th at the international conference ‘SMART Animals’ at the University of Amsterdam.
The field of computational linguistics has made much progress in developing models of syntactic and semantic parsing. With current models we can compute the constituency and dependency structure of sentences with great accuracy and speed, predict semantic roles and sentiment, or derive representations that allow us to retrieve and infer facts, summarize text, and translate into other languages. But do these technological advances also yield a better understanding of how language is learned and processed by humans? In this workshop we discuss recent developments in using parsing models to analyze empirical data from psycholinguistics and brain imaging, developments in rich parsing models that do justice to the intricate structural properties of natural languages, and unsolved challenges from these domains. Topics include:
Parsing models for trans-context-free and morphologically rich languages
Fitting syntactic parsing models to fMRI and MEG data
Neural models of hierarchical structure and artificial grammar learning
Neural transition-based parsing for context-sensitive formalisms
Speakers:
Reut Tsarfaty, The Open University of Israel
Andreas van Cranenburgh, Heinrich Heine Universität Düsseldorf
Stefan Frank, Radboud University Nijmegen
Raquel Alhama, BCBL San Sebastian
Willem Zuidema, University of Amsterdam