Minimalist Parsing
Course Information
| Time       | Mo: 0915-1045 (H1 5.16) |
| Module     | Grammatical Approaches to Cognition (04-046-2026) |
| Instructor | Greg Kobele (GWZ H1 5.11) |
Course Log
- summary
- We noted that in the context of the SMC, derivations are trees (the second arc from a move node is no longer needed). Furthermore, the SpIC and the SMC interact in unexpected ways. We saw how the choice of which 'hole' to expand in the partial derivation tree constituting a particular node in the search space could be made by reference to its position in the derived tree, and that this could be calculated in a simple and monotonic way as the grammatical rules were undone. Finally, we explored how structuring the lexicon as a trie (sharing feature suffixes in feature bundles) allowed for a simple perspective on which operations could be performed next.
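The suffix-sharing idea can be made concrete with a small sketch. The encoding below (node class, feature strings like `=t` and `+wh`, the placeholder item names) is my own illustration, not the course's notation: feature bundles are inserted into the trie reversed, so items whose bundles *end* the same way share a path, and the operations applicable next can be read off a node's children.

```python
# Sketch of an MG lexicon stored as a suffix-sharing trie (encoding mine).
class TrieNode:
    def __init__(self):
        self.children = {}   # feature -> TrieNode
        self.words = []      # lexical items whose whole bundle ends here

def insert(root, word, features):
    node = root
    for f in reversed(features):  # reversed: share suffixes, not prefixes
        node = node.children.setdefault(f, TrieNode())
    node.words.append(word)

# Toy lexicon; '=x' selects an x, '+x' licenses movement, bare symbol = category.
root = TrieNode()
insert(root, 'will',   ['=v', 't'])
insert(root, 'eps(C)', ['=t', 'c'])
insert(root, 'eps(Cwh)', ['=t', '+wh', 'c'])

# Both complementizers share the category suffix 'c':
print(sorted(root.children['c'].children))  # → ['+wh', '=t']
```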
- readings
- summary
We analyzed our top-down search operators in more detail. Whereas bottom-up rules are deterministic, top-down ones are not, along the following dimensions:
- which feature to undelete
- how to divvy up moving elements
- splitting the string component
We introduced some constraints on movement (the Specifier Impenetrability Constraint (SpIC) on Merge and the no-competition condition (SMC)), and discussed how they reduced the aforementioned non-determinism.
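The third choice point above can be sketched directly. Bottom-up, the string-concatenation side of a rule is deterministic; read top-down, we must guess where the conclusion's yield was joined. A minimal illustration (token lists and names are my own):

```python
# Reading the bottom-up concatenation rule backwards is a nondeterministic
# choice point: every cut of the yield into a left and a right part is a
# candidate pair of premises.
def splits(tokens):
    """Yield every way of cutting `tokens` into a left and a right part."""
    for i in range(len(tokens) + 1):
        yield tokens[:i], tokens[i:]

for left, right in splits(['john', 'will', 'laugh']):
    print(left, right)
```

A yield of n tokens gives n+1 candidate splits, which is one reason the constraints that reduce this non-determinism matter.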
- summary
- We discussed search spaces using the example of the cabbage, goat and wolf problem (trying to ferry this mutually destructive set of entities across the river). We also discussed the search space created by reading our inference rules backwards (from conclusion to premises) in minimalist grammars.
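As an illustrative sketch (state encoding and names are mine, not from the course notes), the river-crossing search space can be explored with breadth-first search over states recording which bank each entity is on:

```python
from collections import deque

# State: (farmer, wolf, goat, cabbage), each 0 (start bank) or 1 (goal bank).
START, GOAL = (0, 0, 0, 0), (1, 1, 1, 1)

def safe(state):
    farmer, wolf, goat, cabbage = state
    if wolf == goat and farmer != goat:      # wolf eats goat unattended
        return False
    if goat == cabbage and farmer != goat:   # goat eats cabbage unattended
        return False
    return True

def successors(state):
    farmer = state[0]
    for i in range(4):  # farmer crosses alone (i == 0) or with entity i
        if state[i] == farmer:
            new = list(state)
            new[0] = 1 - farmer
            new[i] = 1 - state[i]
            new = tuple(new)
            if safe(new):
                yield new

def bfs(start=START, goal=GOAL):
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in successors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])

print(len(bfs()) - 1)  # → 7 crossings in a shortest solution
```

The same shape of search (frontier of partial structures, generating successors by undoing rules) is what we get when reading the inference rules backwards.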
- readings
- Harkema (2001) Parsing Minimalist Languages (chapter 5)
- summary
- We motivated the string list data structure discussed previously, by discussing once again how to construct a derived tree on the basis of the derivation. This time, we imposed a constraint on ourselves: no destructive operations! (Once you put something together, you cannot take it apart again - I'm looking at you, movement.) We saw that this motivated a tree list data structure, each tree of which could then be immediately mapped to a string, yielding the original string list structure.
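The final mapping step can be sketched in a few lines. Assuming a toy encoding of trees as nested tuples with words (or the empty string) at the leaves (the encoding is mine, chosen only for illustration), each tree in the tree list is sent to its yield:

```python
# Sketch: map each tree in the tree list to its string (its yield);
# doing this for every tree recovers the string-list representation.
def tree_yield(t):
    if isinstance(t, str):          # a leaf is a word (or empty, '')
        return [t] if t else []
    return [w for child in t for w in tree_yield(child)]

derived = (('which', 'book'), ('will', ('he', ('read', ''))))
print(' '.join(tree_yield(derived)))  # → 'which book will he read'
```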
- next time
- We will talk about conceptualizing parsing as a search problem. For more see
- David Poole & Alan Mackworth Artificial Intelligence, chapter 3
- Richard Korf (1996) Artificial Intelligence Search Algorithms
- summary
- We discussed how to construct a string from a derivation structure, without first constructing a derived tree. We saw that we needed to manipulate, not just a single string, but rather a list thereof.
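A heavily simplified sketch of why a single string does not suffice (the function names, the flattened strings, and the two-place representation are my own simplifications, not the formalism's official rules): an expression pairs the head's string with a list of strings of still-moving parts, which are only concatenated when move applies.

```python
# Expression = (string of the head part, list of strings of pending movers).
def merge(head_str, expr):
    """String side of merge (sketch): the complement's head string is
    concatenated, but its pending movers' strings stay on the list."""
    comp_str, comp_movers = expr
    return (head_str + ' ' + comp_str, comp_movers)

def move(expr):
    """String side of move (sketch): the pending mover (unique, by the SMC)
    lands to the left of the head string."""
    head_str, (mover, *rest) = expr
    return (mover + ' ' + head_str, rest)

# 'which book' has been extracted from the object position of 'read':
vp = ('read', ['which book'])
tp = merge('did john', vp)   # hypothetical higher material, flattened
cp = move(tp)
print(cp)  # → ('which book did john read', [])
```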
- readings
- Stabler & Keenan (2003) Structural Similarity Within and Among Languages, section 2
- summary
We formulated parsing as our attempt to model the way we move from form to meaning. As we know how to move from a syntax tree to a meaning (via a compositional semantics), we can, as a first approximation, break the problem of 'full parsing' into
- parsing - moving from a string to a syntax tree
- interpretation - moving from a syntax tree to a meaning
Legions of semanticists are working on an answer to problem 2, and so we need only address problem 1. This raises the question, however, of what we want the result of parsing to be. There is no consensus, even within a single framework (say, minimalism), on what syntactic structure looks like. There is disagreement as to (among other things):
- labeling
- bare phrase structure, X' theory, arrows, Chomsky labeling, etc
- 'traces'
- traces, copies, multiple dominance
- order
- trees are ordered, trees are not ordered
It would be sad if we chose a side that turned out to be wrong. Instead, we will aim to reconstruct derivations, on the basis of which we can then compute whatever derived structure is currently in vogue.
- readings
- review section 16.7 of Stabler (1998)
No class (Easter Monday)
- summary
- We defined movement, and showed how lexical rules could be implemented as lexical items. We introduced a lexical decomposition operation on lexical items by means of which we can express regularities in feature bundles in terms of lexical items.
- for next time
- 'Next time' is not next Monday (because of Easter), but rather the week after ( ).
- Here are some notes on lexical decomposition, which also include some remarks about morphology (which we will be largely ignoring in this class).
- homework 1
Please do the exercises in the notes on lexical decomposition. In addition, please write up an analysis of the two constructions below (without any fancy morphology, i.e. without the underlined features discussed in the notes - just phrasal movement):
- S V T
- John laugh s
- S Aux V O
- John will praise Mary
Please make sure that your analysis also allows for the sentence John will laugh to be derived, and does not also incorrectly allow ungrammatical sentences to be derived. You are more than welcome to work together, but everyone should write up their analyses independently. Please include in your write up the names of the students you worked with (if any).
- summary
We introduced minimalist grammars as a way of lexicalizing (headed) constructions. The basic idea is to encode information on words themselves about what kinds of expressions they require to their right and/or left. We saw that, with expletive constructions (it will rain), we need to use a different lexical item for will than in other sentences (John will laugh), so as to ensure that rain and it co-occur. Indeed, as we looked at more construction types, we saw that we needed to assign multiple, but often somehow related, types to intuitively the same word.
We introduced (but did not yet formally define) movement as a way of enriching our type system so as to be able to unify the proliferation of homonymy.
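A minimal sketch of the lexicalization idea (the feature notation `=x` for selection and the particular bundles below are my own toy encoding, and subjects and movement are omitted): merge checks the head's first feature against its argument's category, consuming both.

```python
# Sketch: lexical items are (string, feature bundle) pairs; '=x' means
# "select an x", a bare symbol is the item's own category.
def merge(head, arg):
    h_str, h_feats = head
    a_str, a_feats = arg
    sel, *h_rest = h_feats
    cat, *a_rest = a_feats
    assert sel == '=' + cat and not a_rest, "merge inapplicable"
    return (h_str + ' ' + a_str, h_rest)

praise = ('praise', ['=d', 'v'])   # needs a d, yields a v
mary   = ('mary',  ['d'])
will   = ('will',  ['=v', 't'])

vp = merge(praise, mary)   # ('praise mary', ['v'])
tp = merge(will, vp)       # ('will praise mary', ['t'])
print(tp)  # → ('will praise mary', ['t'])
```

The proliferation of homonymy mentioned above shows up here as the need for distinct bundles (distinct entries) for intuitively one word; movement is the enrichment that lets related bundles be unified.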
- for next time
- Please continue reading some of the papers on minimalist grammars.
- summary
We defined parsing as
the process of structuring unstructured input
Characterizing a grammar as a description of
which objects have which structures
we see that we can ask the question of what structures the parser assigns to all of the possible inputs; in other words, what the grammar of the parser is.
This raises the question of whether the linguists' grammar coincides with the parser's grammar.
A 'no' answer raises the question of what exactly the linguists' grammar is for: is it able to perform any explanatory role? If so, what?
We will pursue a 'yes' answer in this class. This is the answer which 'takes linguistics seriously.'
The attached colloquium will delve into the philosophical (?) question of how grammar and parser relate.
- for next time
- We will begin discussing minimalist grammars, the grammar formalism we will use to study the parsing problem. Some useful introductions to this formalism are the following. I would suggest just looking at one of them.
- section 2.1 of this document is very pedagogical, but makes extensive use of head movement, which we will ignore for the most part.
- section 1.1 of Thomas Graf's dissertation makes many connections to the literature, and focusses heavily on derivation trees
- Stabler (2010) Computational perspectives on minimalism is harder going, but situates the grammar formalism within the linguistic enterprise, and explores its flexibility by presenting multiple versions thereof
- Stabler (1998) Remnant movement and complexity is a bare-bones but example-heavy introduction, focussing on remnant movement