Computerlinguistik

Time Mo: 1115-1245 (HS 20)
Module Computerlinguistik (04-006-1006)
Instructor Greg Kobele (GWZ H1 5.11)
Tutor Max Polter
Tutorium Fr: 0915-1045 (S 104)

Announcements

Tutorium

Course Log

<2019-02-06 Wed>

We continued to discuss Haskell implementation. We also discussed the linguistic relevance of computational linguistics.

a handout (with compiling code) | examples (and code)

<2019-02-04 Mon>

We discussed Haskell implementation issues.

a (slightly revised) handout

<2019-01-30 Wed>

We dug deeper into differences between models regarding their ability to derive languages with crossing dependencies, and identified various empirical data that would decide between our models.

Question

Do copies have a simpler internal syntax than non-copies? (Kanazawa & Salvati, 2010)

I.e., can we enforce nested dependencies inside of copies? If so, the MTT model cannot generate such copies. If not, this is a stunning verification of a deep prediction of the MTT model.

Question

Can you have recursive copying? (Kobele, 2006)

I.e., can you have copies contained in copies (contained in copies)? If so, the MBUTT model cannot generate such copies. If not, this is a verification of a deep prediction of the MBUTT/MTT models.

Question

Can you have non-exact copying?

I.e., can you copy structure without copying words? If so, the non-linear transducer model cannot generate such copies. If not, this is a verification of a deep prediction of the MBUTT/MTT models.

<2019-01-28 Mon>

We spoke about the generative capacity of these various kinds of tree transducers. In particular, we discussed crossing dependencies in natural language (Swiss German, Mandarin Chinese), and identified the copy language as a simplified model of a pattern whose derivation requires some way of enforcing crossing dependencies. We discussed how to derive the copy language using

  • mbutts (tuples of trees)
  • mtts (tree contexts)
  • minimalist grammars (movement)
  • non-linear tree transducers (copying)
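The mbutt strategy can be sketched in Haskell. Everything below is illustrative rather than the course code: the input trees are simplified to unary spines over {a,b}, and tuples of trees to pairs of strings.

```haskell
-- unary input "spines" over the alphabet {a,b}
data In = A In | B In | End

-- mbutt idea: each input subtree denotes a tuple (here: a pair of
-- strings), with both components built in lockstep
pair :: In -> (String, String)
pair (A t) = let (u, v) = pair t in ('a' : u, 'a' : v)
pair (B t) = let (u, v) = pair t in ('b' : u, 'b' : v)
pair End   = ("", "")

-- at the root the two components are concatenated, yielding w ++ w
copyLang :: In -> String
copyLang t = let (u, v) = pair t in u ++ v
```

The crossing dependencies are enforced because the two halves of the output are extended in parallel as the input spine is traversed.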

<2019-01-23 Wed>

We discussed interpreting input subtrees as tree contexts to model the yield function. We considered how these transducer models were different instantiations of the same formalism.

a (preliminary) handout
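The handout has the details; as a rough illustration of the idea (with my own names, and string contexts standing in for tree contexts), interpreting each subtree as a context, i.e. a function awaiting the material to its right, computes the yield in linear time:

```haskell
data Tree = Leaf Char | Branch Tree Tree

-- each subtree denotes a context: a function prepending its leaves
yieldC :: Tree -> (String -> String)
yieldC (Leaf c)     = (c :)
yieldC (Branch l r) = yieldC l . yieldC r

-- plug in the empty string at the very end
yield :: Tree -> String
yield t = yieldC t ""
```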

<2019-01-21 Mon>

We discussed how to model movement using tuples of trees as the interpretations of input subtrees. We introduced a simple form of Minimalist grammar, and showed how to transduce from derivation to derived trees.

background on minimalist grammars: Stabler (1997) Derivational Minimalism

<2019-01-16 Wed>

We discussed tree homomorphisms, and tree transducers. We introduced tree contexts as the translations of nodes.
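A minimal sketch of a tree homomorphism (my example, not from class): each node is translated independently of its context, here by plugging the translated children into a context with the holes swapped.

```haskell
data Tree = Leaf Char | Bin Tree Tree
  deriving (Eq, Show)

-- a homomorphism: the translation of a node depends only on its
-- label, with the translated subtrees filling the holes of a context
mirror :: Tree -> Tree
mirror (Leaf c)  = Leaf c
mirror (Bin l r) = Bin (mirror r) (mirror l)
```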

<2019-01-14 Mon>

We discussed top-down and bottom-up tree automata.

an exceptional reference: TATA
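As a quick illustration (my example; TATA has the formal definitions), a deterministic bottom-up tree automaton computes a state at each node from the states reached at its children:

```haskell
-- trees of boolean formulas
data BExp = T | F | And BExp BExp | Or BExp BExp

-- states are Booleans; the state at a node is a function of the
-- states at its children
state :: BExp -> Bool
state T         = True
state F         = False
state (And l r) = state l && state r
state (Or  l r) = state l || state r

-- accept iff the state reached at the root is final (here: True)
accepts :: BExp -> Bool
accepts = state
```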

<2019-01-09 Wed>

No Class

<2019-01-07 Mon>

No Class

<2018-12-19 Wed>

We described various subclasses of the functions computable in our models.

  • ISL
    • each state is just a letter, representing the last input read
      • all arcs reading a letter go to the appropriately named state
    • generalization to last k inputs read
      • each state is a sequence of (up to) k letters
      • in state (a,b,c) reading d go to (b,c,d)
  • L/R-OSL
    • each state is a letter, representing the last output written
    • can generalize to last k outputs written
  • weakly deterministic
    • composition of a left with a right sequential transducer with the same alphabets
    • definition:
      • no additional letters: one could otherwise use extra letters to encode contextual information
      • no length increase: one could otherwise encode additional letters using multiple old letters!
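The ISL state bookkeeping can be sketched as follows (output side of the transducer omitted; names are mine):

```haskell
-- a k-ISL state is the last (up to) k letters read; reading a new
-- letter forgets the oldest, so in state "abc" reading 'd' we reach "bcd"
step :: Int -> String -> Char -> String
step k st c = drop (length st + 1 - k) (st ++ [c])

-- the sequence of states visited on an input, starting from ""
states :: Int -> String -> [String]
states k = scanl (step k) ""
```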

Sour grapes harmony is a supposedly pathological phenomenon which can be described in OT, but is (presumably) not weakly deterministic.

<2018-12-17 Mon>

We discussed transducer composition using an example from Lomongo. We discussed progressive and regressive processes in natural language, noting that regressive harmony processes could not be described using our model, which operates from left to right. We showed that they could be described by our model, were it to operate from right to left.
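The relation between the two directions can be sketched like this (with a made-up regressive assimilation, not the Lomongo data): running a machine right to left is running it on the reversed input and reversing its output.

```haskell
-- a sequential step: from a state and a letter, a new state and output
type Step q = q -> Char -> (q, String)

-- left-to-right run
runL :: Step q -> q -> String -> String
runL _ _ []       = ""
runL d q (c : cs) = let (q', out) = d q c in out ++ runL d q' cs

-- right-to-left run: same machine, reversed input, reversed output
runR :: Step q -> q -> String -> String
runR d q = reverse . runL d q . reverse

-- toy regressive process: 'n' surfaces as 'm' before 'b'; the state
-- is the previously scanned letter, i.e. the letter to the right
assim :: Char -> Char -> (Char, String)
assim prev 'n' | prev == 'b' = ('n', "m")
assim _    c                 = (c, [c])
```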

<2018-12-12 Wed>

Edit <2019-01-17 Thu>
Here are some pointers to the literature on sequential transducers:
Edit <2019-01-14 Mon>
I have modified the DFST module, and the model language project, so as to implement a distinction between left-to-right and right-to-left transductions at the type level. The only effect this has for you is that you need to create transducers using the functions mkForwardTransducer and mkBackwardTransducer.
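A guess at how such a type-level distinction can work, with the transducer grossly simplified to a per-letter function (the actual DFST module is of course richer):

```haskell
-- phantom types recording the direction of a transduction
data Forward
data Backward

newtype Dfst dir = Dfst (Char -> Char)

mkForwardTransducer :: (Char -> Char) -> Dfst Forward
mkForwardTransducer = Dfst

mkBackwardTransducer :: (Char -> Char) -> Dfst Backward
mkBackwardTransducer = Dfst

runForward :: Dfst Forward -> String -> String
runForward (Dfst f) = map f

-- feeding a Forward transducer to runBackward is now a type error
runBackward :: Dfst Backward -> String -> String
runBackward (Dfst f) = reverse . map f . reverse
```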

We discussed how to formalize transducers. Here is another model language project, this time using transducers.

Dfst module | code

<2018-12-10 Mon>

Special class session run by Max Polter

<2018-12-05 Wed>

We motivated studying processes in various domains of linguistics. We looked at an example from the Bantu language Lomongo, where various processes (gliding of nonlow vowels before vowels, deletion of the first vowel in a sequence of two vowels, and deletion of intervocalic voiced consonants) interact to derive the surface forms.

We discussed adding output to our finite automata, and did some exercises in class.
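An automaton with output can be sketched as follows; the process here is a made-up progressive voicing ('t' becomes 'd' after a vowel), not the Lomongo data:

```haskell
isVowel :: Char -> Bool
isVowel = (`elem` "aeiou")

-- the state records whether the last letter read was a vowel;
-- each transition additionally emits an output letter
step :: Bool -> Char -> (Bool, Char)
step True 't' = (False, 'd')
step _    c   = (isVowel c, c)

transduce :: String -> String
transduce = go False
  where
    go _ []       = []
    go q (c : cs) = let (q', o) = step q c in o : go q' cs
```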

<2018-12-03 Mon>

No Class (Dies Akademicus)

<2018-11-28 Wed>

<2018-11-26 Mon>

We discussed using the Maybe data type to represent partiality, and defined a function which completed a partial DFA.

We discussed the cross-product construction.
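The in-class completion construction can be sketched as follows (machines simplified to triples; the names are mine, not those of the course code):

```haskell
-- a partial DFA: start state, possibly undefined transitions (Maybe),
-- and a predicate picking out the final states
type PDFA q = (q, Char -> q -> Maybe q, q -> Bool)

-- completion: Nothing plays the role of an added, non-final sink state
complete :: PDFA q -> (Maybe q, Char -> Maybe q -> Maybe q, Maybe q -> Bool)
complete (q0, d, fin) = (Just q0, \c mq -> mq >>= d c, maybe False fin)

-- run a (total) machine on a word
run :: (s, Char -> s -> s, s -> Bool) -> String -> Bool
run (q0, d, fin) w = fin (foldl (flip d) q0 w)

-- example: a partial machine accepting exactly "ab"
ex :: PDFA Int
ex = (0, d, (== 2))
  where
    d 'a' 0 = Just 1
    d 'b' 1 = Just 2
    d _   _ = Nothing
```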

  • Homework

    Try to implement functions for

    • completion of a machine
    • the intersection and union of machines

<2018-11-21 Wed>

No Class (Vorlesungsfrei)

<2018-11-19 Mon>

We showed how to implement finite state automata in Haskell.

notes | code

  • Homework
    1. look at the code/notes

<2018-11-14 Wed>

We introduced finite state automata formally.

  • Homework
    1. Read Programming in Haskell: Chapter 5 up through section 5.4
    2. do exercises 1,2,4,5 in section 5.7
    3. Read chapter 1 of the Sipser book, up to page 44

<2018-11-12 Mon>

We introduced finite state automata informally with examples, as machines which listen to one word at a time, possibly changing their memory state in response.

<2018-11-07 Wed>

We discussed how Haskell determines whether expressions are well-typed (and if so, what type they have). We also explored user-defined types, both using the type keyword for defining abbreviations for types we can already express, and using the data keyword, for new data types.
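For instance (my examples, not from class):

```haskell
-- `type` merely abbreviates a type we can already express
type Position = (Int, Int)

-- `data` introduces a genuinely new type with its own constructors
data Direction = North | South | East | West

move :: Direction -> Position -> Position
move North (x, y) = (x, y + 1)
move South (x, y) = (x, y - 1)
move East  (x, y) = (x + 1, y)
move West  (x, y) = (x - 1, y)
```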

  • Homework
    1. Read Programming in Haskell: Chapter 4
    2. do exercises 1,2,3,4,7 in section 4.8

<2018-11-05 Mon>

Today we discussed polymorphism, and type class constraints.
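For instance (my examples):

```haskell
-- parametric polymorphism: one definition for every pair of types a, b
swap' :: (a, b) -> (b, a)
swap' (x, y) = (y, x)

-- a type class constraint: Eq a restricts a to types supporting (==)
member :: Eq a => a -> [a] -> Bool
member _ []       = False
member x (y : ys) = x == y || member x ys
```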

  • Homework
    1. Read Programming in Haskell: finish chapter 3
    2. do exercises 3, 4 and 5 in section 3.11

<2018-10-31 Wed>

No Class (Reformationstag)

<2018-10-29 Mon>

We talked about types in Haskell (Int, Char, String) as well as certain type constructors (->, ,). We defined functions over trees in a recursive manner.
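A small sketch in the same spirit (my examples):

```haskell
-- a binary tree with an Int at each leaf
data Tree = Leaf Int | Branch Tree Tree

-- recursive definitions follow the shape of the type:
-- one clause per constructor
size :: Tree -> Int
size (Leaf _)     = 1
size (Branch l r) = size l + size r

total :: Tree -> Int
total (Leaf n)     = n
total (Branch l r) = total l + total r
```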

<2018-10-24 Wed>

Today we discussed function composition, and observed that functions over a set had a monoidal structure (with respect to composition and the identity function). We also explored two formalizations of trees.

notes
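The monoid observation, concretely (my examples): composition of functions over a set is associative, and the identity function is its unit.

```haskell
double, inc :: Int -> Int
double = (* 2)
inc    = (+ 1)

-- composition applies the right function first: inc, then double
both :: Int -> Int
both = double . inc
```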

  • Homework
    1. Define the yield function, mapping a tree to the sequence of its leaves, read from left to right.
    2. Read Programming in Haskell: Chapter 3, up through 3.7 (page 39)
    3. do exercises 1 and 2 in section 3.11

<2018-10-22 Mon>

Today we went over the homework, and explored an alternative, inductive, definition of strings.

notes
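The inductive definition can be rendered in Haskell (mirroring the built-in list type; names are mine):

```haskell
-- a string is either empty, or a letter followed by a string
data Str = Eps | Cons Char Str

-- functions are then defined by recursion on this construction
len :: Str -> Int
len Eps        = 0
len (Cons _ s) = 1 + len s

cat :: Str -> Str -> Str
cat Eps        t = t
cat (Cons c s) t = Cons c (cat s t)
```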

  • Homework
    1. Show how to define the length function in terms of allOnes (the function mapping each element in a list to 1) and summe (the function summing all of the numbers in a list)
    2. Read Programming in Haskell: Chapter 2.

<2018-10-17 Wed>

Today we went over familiar set theoretic notions, before presenting formal descriptions of strings.

notes

  • Homework
    1. show how to represent the string "0101" (over the alphabet 0 and 1)
    2. concatenate strings "000" and "000"
    3. is this notion of concatenation associative?
    4. Read Learn you a Haskell: Chapter 1, and up to an intro to lists in Chapter 2

<2018-10-15 Mon>

  • Homework
    1. Install Haskell, and start ghci. You should see something like the following.

      GHCi, version 8.4.2: http://www.haskell.org/ghc/  :? for help
      Prelude>
      

      type :q to exit.

    2. Read the excerpt from the Sipser book (chapter 0), and do exercises 0.1 to 0.6. Most important are sections 0.1 to 0.2 (i.e. up to page 17).

Author: Greg Kobele

Created: 2019-04-02 Tue 08:55
