Minimalist Parsing

Course Information

Time Mo: 0915-1045 (H1 5.16)
Module Grammatical Approaches to Cognition (04-046-2026)
Instructor Greg Kobele (GWZ H1 5.11)

Course Log

<2019-06-03 Mon>

summary
We noted that in the context of the SMC, derivations are trees (the second arc from a move node is no longer needed). Furthermore, the SpIC and the SMC interact in unexpected ways. We saw how the choice of which 'hole' to expand in the partial derivation tree constituting a particular node in the search space could be made by reference to its position in the derived tree, and that this could be calculated in a simple and monotonic way as the grammatical rules were undone. Finally, we explored how structuring the lexicon as a trie (sharing feature suffixes in feature bundles) allowed for a simple perspective on which operations could be performed next.
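One way to realize the suffix-sharing idea, sketched in Python (the feature bundles and words below are invented placeholders, not the grammar from class): feature bundles are entered into a trie from their last feature inward, so that lexical items whose bundles end the same way share nodes, and following the trie as rules are undone tells us which items remain live options.

  def insert(trie, word, features):
      """Add a lexical item, keying the trie on its features read suffix-first."""
      node = trie
      for f in reversed(features):
          node = node.setdefault(f, {})
      node.setdefault('$items', []).append(word)

  # Hypothetical feature bundles: '=x' selector, '+y' licensor, plain category.
  lexicon = [
      ('will',   ['=v', '+k', 't']),
      ('must',   ['=v', '+k', 't']),
      ('laugh',  ['=d', 'v']),
      ('praise', ['=d', '+k', '=d', 'v']),
  ]

  trie = {}
  for word, feats in lexicon:
      insert(trie, word, feats)

  # 'will' and 'must' share the entire path t -> +k -> =v;
  # 'laugh' and 'praise' share the first two steps of theirs (v, then =d).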
readings

<2019-05-27 Mon>

summary

We analyzed our top-down search operators in more detail. Whereas bottom-up rules are deterministic, top-down ones are not, along the following dimensions:

  1. which feature to undelete
  2. how to divvy up moving elements
  3. splitting the string component

We introduced some constraints on movement (the Specifier Impenetrability Constraint (SpIC) on Merge and the No Competition Condition (SMC)), and discussed how they reduced the aforementioned non-determinism.
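A toy Python sketch of dimensions 2 and 3 (the strings, features, and example are invented, and this is not the rule system from class): undoing a merge step makes every split point of the string and every division of the movers between the two premises a candidate, and an SMC-style check discards candidates in which two movers would compete for the same first feature.

  from itertools import product

  def string_splits(words):
      """All ways to cut a word list into two pieces."""
      return [(words[:i], words[i:]) for i in range(len(words) + 1)]

  def mover_partitions(movers):
      """All ways to divide the moving elements between the two premises."""
      out = []
      for choice in product([0, 1], repeat=len(movers)):
          left  = [m for m, c in zip(movers, choice) if c == 0]
          right = [m for m, c in zip(movers, choice) if c == 1]
          out.append((left, right))
      return out

  def smc_ok(movers):
      """No two movers may be waiting on the same first feature."""
      firsts = [feats[0] for _, feats in movers]
      return len(firsts) == len(set(firsts))

  # A made-up conclusion: the string 'John will laugh' with one mover.
  words  = ['John', 'will', 'laugh']
  movers = [('John', ['-k'])]

  candidates = [(sl, sr, ml, mr)
                for sl, sr in string_splits(words)
                for ml, mr in mover_partitions(movers)
                if smc_ok(ml) and smc_ok(mr)]
  print(len(candidates))  # 8 candidates; with only one mover the SMC prunes nothing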

<2019-05-20 Mon>

summary
We discussed search spaces using the example of the cabbage, goat and wolf problem (trying to ferry this mutually destructive set of entities across the river). We also discussed the search space created by reading our inference rules backwards (from conclusion to premises) in minimalist grammars.
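For concreteness, here is a small breadth-first search over that puzzle's state space in Python (the state representation is my own): a state records what is on the left bank, moves generate neighbouring states, and unsafe states are never entered.

  from collections import deque

  ITEMS = ('farmer', 'wolf', 'goat', 'cabbage')
  START = frozenset(ITEMS)   # everything starts on the left bank
  GOAL  = frozenset()        # everything ends up on the right bank

  def safe(left):
      """Unsafe if wolf+goat or goat+cabbage are left alone without the farmer."""
      for bank in (left, frozenset(ITEMS) - left):
          if 'farmer' not in bank and 'goat' in bank and ({'wolf', 'cabbage'} & bank):
              return False
      return True

  def neighbours(left):
      """The farmer crosses alone or with one item from his own bank."""
      src = left if 'farmer' in left else frozenset(ITEMS) - left
      for passenger in [None] + [x for x in src if x != 'farmer']:
          crossing = {'farmer'} | ({passenger} if passenger else set())
          new_left = left - crossing if 'farmer' in left else left | crossing
          if safe(new_left):
              yield new_left

  def bfs(start, goal):
      parent, queue = {start: None}, deque([start])
      while queue:
          state = queue.popleft()
          if state == goal:                 # reconstruct the sequence of states
              path = []
              while state is not None:
                  path.append(sorted(state))
                  state = parent[state]
              return path[::-1]
          for nxt in neighbours(state):
              if nxt not in parent:
                  parent[nxt] = state
                  queue.append(nxt)

  for left_bank in bfs(START, GOAL):
      print(left_bank)   # the left bank at each step of a shortest solution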
readings

<2019-05-13 Mon>

summary
We motivated the string list data structure discussed previously, by discussing once again how to construct a derived tree on the basis of the derivation. This time, we imposed a constraint on ourselves: no destructive operations! (Once you put something together, you cannot take it apart again - I'm looking at you, movement.) We saw that this motivated a tree list data structure, each tree of which could then be immediately mapped to a string, yielding the original string list structure.
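A tiny Python sketch of that idea (the trees are invented): the expression is a list of trees, one for the material assembled so far and one per phrase still waiting to move; since nothing is ever taken apart, each tree can be read off as its yield at any point, recovering the string list.

  # Trees as nested tuples: (label, child, ..., child); leaves are ('word',).

  def leaves(tree):
      """The yield of a tree, read off left to right."""
      if len(tree) == 1:
          return tree[0]
      return ' '.join(leaves(child) for child in tree[1:])

  # A hypothetical expression: the assembled chunk 'will laugh'
  # plus a still-moving subject 'the student'.
  expression = [
      ('<', ('will',), ('laugh',)),
      ('>', ('the',), ('student',)),
  ]

  print([leaves(t) for t in expression])   # ['will laugh', 'the student']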
next time
We will talk about conceptualizing parsing as a search problem. For more see

<2019-05-06 Mon>

summary
We discussed how to construct a string from a derivation structure, without first constructing a derived tree. We saw that we needed to manipulate, not just a single string, but rather a list thereof.
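To give the flavour in code (a Python sketch only, roughly in the spirit of the Stabler & Keenan paper listed under readings, but deliberately simplified: one string per chain, crude linearization, and invented features): an expression is a list of (string, feature list) pairs, the head first and then the phrases still waiting to move, and merge and move operate directly on such lists.

  # Features: '=x' selects category x, 'x' is a category,
  # '+y' triggers movement, '-y' marks a mover.

  def merge(e1, e2):
      (s1, f1), *m1 = e1
      (s2, f2), *m2 = e2
      assert f1[0] == '=' + f2[0], 'the head of e1 must select the category of e2'
      if len(f2) == 1:                            # e2 is done: its string joins the head
          return [(s1 + ' ' + s2, f1[1:])] + m1 + m2
      return [(s1, f1[1:])] + m1 + [(s2, f2[1:])] + m2   # e2 will still move

  def move(e):
      (s, f), *movers = e
      assert f[0].startswith('+')
      lic = '-' + f[0][1:]
      i = next(j for j, (_, mf) in enumerate(movers) if mf[0] == lic)
      (ms, mf), rest = movers[i], movers[:i] + movers[i + 1:]
      if len(mf) == 1:                            # final landing site: string goes to the left
          return [(ms + ' ' + s, f[1:])] + rest
      return [(s, f[1:]), (ms, mf[1:])] + rest    # still more movement to come

  # A hypothetical derivation of 'John will laugh':
  john  = [('John',  ['d', '-k'])]
  laugh = [('laugh', ['=d', 'v'])]
  will  = [('will',  ['=v', '+k', 't'])]

  vp = merge(laugh, john)      # [('laugh', ['v']), ('John', ['-k'])]
  tp = move(merge(will, vp))   # [('John will laugh', ['t'])]
  print(tp)

The point is just that the value computed at each derivation node is a list of strings (one per still-active chain), not a single string.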
readings
Stabler & Keenan (2003) Structural Similarity Within and Among Languages, section 2

<2019-04-29 Mon>

summary

We formulated parsing as our attempt to model the way we move from form to meaning. As we know how to move from a syntax tree to a meaning (via a compositional semantics), we can, as a first approximation, break the problem of 'full parsing' into

  1. parsing - moving from a string to a syntax tree
  2. interpretation - moving from a syntax tree to a meaning

Legions of semanticists are working on an answer to problem 2, and so we need only address problem 1. This raises the question, however, of what we want the result of parsing to be. There is no consensus, even within a single framework (say, minimalism), about what syntactic structure looks like. There is disagreement about (among other things):

labeling
bare phrase structure, X' theory, arrows, Chomsky labeling, etc
'traces'
traces, copies, multiple dominance
order
trees are ordered, trees are not ordered

It would be sad if we chose a side that turned out to be wrong. Instead, we will aim to reconstruct derivations, on the basis of which we can then compute whatever derived structure is currently in vogue.

readings
review section 16.7 of Stabler (1998)

<2019-04-22 Mon>

No class (Easter Monday)

<2019-04-15 Mon>

summary
We defined movement, and showed how lexical rules could be implemented as lexical items. We introduced a lexical decomposition operation on lexical items by means of which we can express regularities in feature bundles in terms of lexical items.
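A rough Python illustration of what such a decomposition operation could look like (the cut point, the fresh category name, and the choice to make the second item silent are assumptions here, not necessarily the operation from the notes): the feature bundle is split in two, the word keeps the first part capped off by a fresh category, and a new empty item selects that category and carries the shared remainder.

  def decompose(word, features, cut, fresh):
      """Split a lexical item at position `cut`, introducing the new category `fresh`."""
      return [(word, features[:cut] + [fresh]),          # the word, now of category `fresh`
              ('',   ['=' + fresh] + features[cut:])]    # a silent item with the shared suffix

  # Hypothetical items sharing the feature suffix ['+k', 't'] ...
  items = [('will', ['=v', '+k', 't']), ('must', ['=v', '+k', 't'])]

  # ... both yield the same silent item ('', ['=f', '+k', 't']),
  # so the shared suffix needs to be stated only once in the lexicon.
  for word, feats in items:
      print(decompose(word, feats, 1, 'f'))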
for next time
'Next time' is not on <2019-04-22 Mon> (because of Easter), but rather the week after (<2019-04-30 Tue>). Here are some notes on lexical decomposition, which also include some remarks about morphology (which we will be largely ignoring in this class).
homework 1

Please do the exercises in the notes on lexical decomposition. In addition, please write up an analysis of the two constructions below (without any fancy morphology, i.e. the underlined features discussed in the notes - just phrasal movement):

  1. S V T
    • John laugh s
  2. S Aux V O
    • John will praise Mary

Please make sure that your analysis also allows the sentence John will laugh to be derived, and does not incorrectly allow ungrammatical sentences to be derived. You are more than welcome to work together, but everyone should write up their analyses independently. Please include in your write-up the names of the students you worked with (if any).

<2019-04-08 Mon>

summary

We introduced minimalist grammars as a way of lexicalizing (headed) constructions. The basic idea is to encode, on words themselves, information about what kinds of expressions they require to their right and/or left. We saw that, with expletive constructions (it will rain), we need to use a different lexical item for will than the one used in other sentences (John will laugh), so as to ensure that rain and it co-occur. Indeed, as we looked at more construction types, we saw that we needed to assign multiple, but often somehow related, types to intuitively the same word.

We introduced (but did not yet formally define) movement as a way of enriching our type system so as to be able to unify this proliferation of homonyms.
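For instance (with entirely invented feature names, just to show the shape of the problem), the lexicon ends up containing several entries for what is intuitively one and the same word:

  # Hypothetical lexical entries: (word, feature bundle), '=x' meaning
  # 'select an expression of category x to combine with'.
  lexicon = [
      ('laugh', ['v']),                  # an ordinary verb phrase
      ('rain',  ['v0']),                 # a weather verb, given its own category
      ('will',  ['=v', '=d', 't']),      # the 'will' of 'John will laugh'
      ('will',  ['=v0', '=expl', 't']),  # the 'will' of 'it will rain'
      ('it',    ['expl']),               # the expletive subject
      ('John',  ['d']),
  ]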

for next time
Please continue reading some of the papers on minimalist grammars.

<2019-04-01 Mon>

summary

We defined parsing as

the process of structuring unstructured input

Characterizing a grammar as a description of

which objects have which structures

we see that we can ask the question of what structures the parser assigns to all of the possible inputs; in other words, what the grammar of the parser is.

This raises the question of whether the linguists' grammar coincides with the parser's grammar.

A 'no' answer raises the question of what exactly the linguists' grammar is for: is it able to perform any explanatory role? If so, what?

We will pursue a 'yes' answer in this class. This is the answer which 'takes linguistics seriously.'

The attached colloquium will delve into the philosophical (?) question of how grammar and parser relate.

for next time
We will begin discussing minimalist grammars, the grammar formalism we will use to study the parsing problem. Some useful introductions to this formalism are the following. I would suggest just looking at one of them.
  • section 2.1 of this document is very pedagogical, but makes extensive use of head movement, which we will ignore for the most part.
  • section 1.1 of Thomas Graf's dissertation makes many connections to the literature, and focusses heavily on derivation trees
  • Stabler (2010) Computational perspectives on minimalism is harder going, but situates the grammar formalism within the linguistic enterprise, and explores its flexibility by presenting multiple versions thereof
  • Stabler (1998) Remnant movement and complexity is a bare-bones but example-heavy introduction, focussing on remnant movement

Author: Greg Kobele

Created: 2019-06-04 Tue 04:56
