Generation

Course: Mon 09:15-10:45 (H1 5.16)
Module Sprachproduktion (04-046-2023)
Instructor Greg Kobele (H1 5.11)

Course Log

<2017-11-20 Mon>

  • We talked about PDDL
  • Robot vacuum example: split the knowledge into two parts
    1. Planning domain

      General properties of the problem type

      • properties of things
      • what you can do
      (define (domain vacuum)
        (:predicates (clean ?r)
                     (is-in ?r)
                     (adjacent ?r1 ?r2))
        (:action move
                 :parameters (?from ?to)
                 :precondition (and (is-in ?from)
                                    (adjacent ?from ?to))
                 :effect (and (not (is-in ?from))
                              (is-in ?to)))
        (:action suck
                 :parameters (?room)
                 :precondition (is-in ?room)
                 :effect (clean ?room))                 
        )
      
    2. Problem instance

      Description of the problem to solve

      • which things exist
      • what properties they have
      • what we want to achieve
      (define (problem vacuumer)
        (:domain vacuum)
        (:objects left right)
      
        (:init  (is-in left)
                (adjacent left right)
                (adjacent right left))
      
        (:goal  (and (clean left)
                     (clean right)))
      )
      
  • The search space is the set of all possible states of the world; a state is represented as a set of ground atoms (exactly the atoms in the set are true; all others are false)
  • The edges are given by (ground instances of) the actions defined in the planning domain
  • We can search either forwards (from the initial state towards the goal) or backwards (from the goal towards the initial state)
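The forward direction can be made concrete. Below is a minimal Python sketch (illustrative, not from the course materials) of forward breadth-first search over the vacuum example: states are frozensets of ground atoms, and the `actions` generator plays the role of the PDDL `:action` definitions. All names are my own.

```python
# Forward state-space search for the vacuum example.
# States are frozensets of ground atoms; atoms absent from the set are false.
from collections import deque

ROOMS = ["left", "right"]
ADJACENT = {("left", "right"), ("right", "left")}

def actions(state):
    """Yield (action, successor-state) pairs applicable in `state`."""
    for (frm, to) in ADJACENT:
        if ("is-in", frm) in state:  # precondition of move
            yield ("move", frm, to), (state - {("is-in", frm)}) | {("is-in", to)}
    for room in ROOMS:
        if ("is-in", room) in state:  # precondition of suck
            yield ("suck", room), state | {("clean", room)}

def forward_search(init, goal):
    """Breadth-first search from `init` to a state satisfying `goal`."""
    frontier = deque([(init, [])])
    seen = {init}
    while frontier:
        state, plan = frontier.popleft()
        if goal <= state:  # goal atoms are all true in this state
            return plan
        for action, succ in actions(state):
            if succ not in seen:
                seen.add(succ)
                frontier.append((succ, plan + [action]))
    return None

init = frozenset({("is-in", "left")})
goal = frozenset({("clean", "left"), ("clean", "right")})
plan = forward_search(init, goal)
```

Because the search is breadth first, the plan returned is a shortest one: here, suck in the left room, move right, and suck again.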

<2017-11-13 Mon>

  • We discussed search in artificial intelligence
  • We went through the notion of a search space
  • and discussed (uninformed) search strategies
    • depth first
    • breadth first
    • iterative deepening
  • We then talked about informed search
    • best first
    • greedy search
    • admissible heuristics
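Of the uninformed strategies listed, iterative deepening is the easiest to misremember from the description alone: it runs depth-limited depth-first search repeatedly with an increasing bound, combining DFS's small memory footprint with BFS's guarantee of finding a shallowest goal. A minimal Python sketch (the graph is made up for illustration):

```python
# Iterative deepening on an explicit graph (toy example).
GRAPH = {
    "S": ["A", "B"],
    "A": ["C", "D"],
    "B": ["G"],
    "C": [], "D": [], "G": [],
}

def depth_limited(node, goal, limit, path):
    """Depth-first search from `node`, exploring at most `limit` more edges."""
    if node == goal:
        return path
    if limit == 0:
        return None
    for child in GRAPH[node]:
        result = depth_limited(child, goal, limit - 1, path + [child])
        if result is not None:
            return result
    return None

def iterative_deepening(start, goal, max_depth=10):
    """Retry depth-limited search with bounds 0, 1, 2, ... up to max_depth."""
    for limit in range(max_depth + 1):
        result = depth_limited(start, goal, limit, [start])
        if result is not None:
            return result
    return None
```

On the toy graph, `iterative_deepening("S", "G")` finds the two-edge path through B, even though plain DFS would have wandered into the A subtree first.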

<2017-11-06 Mon>

  • We discussed the notion of open propositions, and the distinction between the various kinds of knowledge maintained and updated during conversation.
  • We then went through the generation algorithm of Stone and Doran, which treats sentence generation as referring expression generation.
  • Next week we will begin to talk about the next paper:
    READING (required)
    Koller, A. and M. Stone (2007) Sentence generation as a planning problem

<2017-10-30 Mon>

  • We problematized the question of the input to a generation procedure, from the perspective of whether we are encoding too much information about syntax if we use a particular highly structured meaning representation. Hobbs' flat semantics can be thought of as a response to this worry.
  • We discussed the liberating effect of ontological promiscuity, as well as how the rampant postulation of entities makes ample referents available for anaphors of all types.
  • We discussed how to represent quantification, and quantifier scope
  • We stated a simple syntax semantics interface for TAG with Hobbs' flat semantics
  • Finally, we saw how the problem of generation could be thought of in terms of three increasingly restricted problems:
    1. how to generate a grammatical sentence
    2. how to generate a grammatical sentence, which has the desired meaning
    3. how to do the above, but also where the sentence has the appropriate pragmatic features
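As a toy illustration of the flat-semantics idea (my own example, not taken from Hobbs' paper): a sentence meaning is an unordered conjunction of atomic predications over reified entities and eventualities, rather than a nested formula, which is what makes it a natural input to generation.

```python
# A flat, Hobbs-style meaning for "A man sees a dog" (illustrative encoding):
# e1 is the reified seeing eventuality; m and d are the man and the dog.
flat = {
    ("see'", "e1", "m", "d"),   # e1 is a seeing of d by m
    ("man", "m"),
    ("dog", "d"),
    ("Rexist", "e1"),           # the seeing eventuality really exists
}

# Flatness lets generation treat predications as a coverage problem:
# grammar fragments each contribute some subset, and a derivation is
# complete when the contributed predications exhaust the input meaning.
def covers(predications, contributed):
    """True iff the contributed predications exactly cover the input."""
    return set(contributed) == set(predications)
```

This is only a sketch of the representation; the actual syntax-semantics interface for TAG pairs elementary trees with such predication sets.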
  • For next week read:
    READING (Required)
    Birner, B. and G. Ward (2009) Information Structure and Syntactic Structure

<2017-10-23 Mon>

  • We discussed wh-movement in TAG, and thought about how various familiar constraints (that-trace, wh-islands) could be captured in terms of restrictions on elementary trees.
  • We turned back to raising, and looked at complement selection alternations contingent on voice (in verbs like make), and at asymmetries in raising between copular constructions and small clause constructions. We saw that these could be described naturally using the extended domain of locality offered by TAG's elementary trees.
  • Next week we'll actually end up talking about the Hobbs paper.

<2017-10-16 Mon>

  • We reviewed the basic TAG operations (substitution and adjunction), and noted that linguistic generalizations about individual languages, or across languages, come in the form of restrictions on the form of possible elementary trees.
  • We outlined some of Bob Frank's proposals in this regard (below), and saw how particular analyses of control and of raising were forced by these restrictions on elementary trees.
    EPP
    All TP projections within an elementary tree must have specifiers.
    θ-criterion (part 1)
    All heads in an elementary tree must assign all of their θ-roles in that tree.
    θ-criterion (part 2)
    All non-terminals at leaves must receive a θ-role within their tree.
    Condition on elementary tree minimality (CETM)
    All heads in an elementary tree must belong to the same extended projection.
  • For next week read:
    READING (Required)
    Hobbs, J. (1985) Ontological Promiscuity

<2017-10-09 Mon>

Author: Greg Kobele

Created: 2018-10-16 Tue 01:04
