Meaning

One crucial aspect of constructional approaches has so far been absent from the discussion: meaning. Constructional theories hold that the grammatical building blocks are pairings of a signifying form with a signified meaning. Although much work in DOP has focused on grammatical form per se, the model is not incompatible with this view of grammar. In fact, the model places no restriction on the representations it processes, as long as they are well-formed according to some formal criterion. This follows from the claim that DOP is a domain-general learner: as such, it has to be able to detect structure regardless of the topology or content of that structure.

DOP has a long history of attempts to accommodate meaningful representations. Bonnema, Bod, and Scha (1997) can be seen as a first attempt: in this model, the syntactic representations on the tree’s nodes were enriched with logical formulae in the lambda calculus. Later developments were the integration of DOP with Lexical-Functional Grammar (LFG-DOP; Bod and Kaplan 1998) and with Head-Driven Phrase Structure Grammar (HPSG-DOP; Arnold and Linardaki 2007). Building on the insights of these models, we propose an unsupervised variant of Data-Oriented Parsing that incorporates meaning. Because this is the first exploration of an unsupervised learning mechanism applied to meaning-enriched structures, we chose not to use the rich representations of LFG or HPSG, but rather take a very simple and limited formalism to illustrate the principle. We will show how it functions, how a learner may derive productive patterns with it, and what its limitations are.
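To make the idea of meaning-enriched nodes concrete, the following is a minimal toy sketch of a syntactic tree whose nodes carry lambda-style meanings that compose by function application. The class and the string representation of meanings are invented here for illustration; this is not the actual formalism of Bonnema, Bod, and Scha (1997), LFG-DOP, or HPSG-DOP.

```python
# Toy illustration (assumed representation, not the published formalism):
# each node pairs a syntactic label (form) with a meaning term.

class Node:
    def __init__(self, label, meaning, children=()):
        self.label = label        # syntactic category, e.g. "NP", "VP"
        self.meaning = meaning    # a lambda-calculus-style meaning
        self.children = list(children)

# Lexical entries: form-meaning pairings at the leaves.
john = Node("NP", "john")
sleeps = Node("VP", lambda x: f"sleep({x})")

# Composition by function application: the VP meaning applies to
# the NP meaning, yielding the meaning of the sentence node.
s = Node("S", sleeps.meaning(john.meaning), children=[john, sleeps])

print(s.meaning)  # sleep(john)
```

The point of the sketch is only that meanings ride along on the same tree structure DOP already manipulates, so fragment extraction can in principle operate over form and meaning jointly.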

 