Semantik

Course Information

Time: Wed 13:15-14:45 (HS 4)
Module: Grammatiktheorie (04-006-1006)
Instructor: Greg Kobele (GWZ H1 5.11)
Exam (Klausur): Tue 16 July, 11:00-12:30, Hörsaal 20 (M2.031)

A sketch of a practice exam (Probeklausur)

Literature

Course Log

<2019-06-26 Wed>

summary

We worked on using the monotonicity inference rules on sentences like:

  • Jeder Hund, der keinen Greifvogel beißt, bellt
    ('Every dog that bites no bird of prey barks')

              jeder  hund  der  keinen  greifvogel  beisst  bellt
      keinen:                               -          -
      jeder:          -     -      -        -          -       +
      net:            -     -      -        +          +       +

    • Greifvogel occurs positively (its two negative environments cancel), so we should be able to replace it with the larger 'Vogel'
    • Hund occurs negatively, so we should be able to replace it with the smaller 'Pudel'
  • Höchstens zwei Katzen, die jeder Hund, der keinen Greifvogel beißt, fangen will, miauen.
    ('At most two cats that every dog that bites no bird of prey wants to catch meow')

               H2  K  die  j  H  der  k  Gv  beisst  (fangen will)  miauen
      keinen:                             -     -
      jeder:               -  -   -   -   -     -          +
      H2:          -   -   -  -   -   -   -     -          -           -
      net:         -   -   -  +   +   +   -     -          -           -

    negative: Katzen, Greifvogel
    positive: Hund

We noted that Aristotelian logic (the syllogistic) can be viewed as a monotonicity calculus:

A: All X are Y
E: No X is Y
I: Some X is Y
O: Some X is not Y / Not all X are Y

A and E give monotonicity information:

All X are Y: X ≤ Y
No X is Y: X ≤ ¬Y and Y ≤ ¬X
Haskell
we began implementing semantic theories in Haskell [html | hs]
Literature
Jan van Eijck's "Natural Logic for Natural Language" and "A monotonicity based proof-theoretic account of syllogistic reasoning"
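As a first concrete step, the A and E forms can be coded up in the style of the Haskell fragment mentioned above. This is a minimal sketch over a toy finite universe; the names (allA, noE, and the example entities) are illustrative assumptions, not part of the course files:

```haskell
-- A toy finite universe of entities (illustrative only).
data E = Rex | Fifi | Tweety deriving (Eq, Show, Enum, Bounded)

universe :: [E]
universe = [minBound .. maxBound]

-- Set denotations as characteristic functions.
type Pred = E -> Bool

-- The subset order: x <= y iff X is a subset of Y.
leq :: Pred -> Pred -> Bool
leq x y = all (\e -> not (x e) || y e) universe

-- A form: All X are Y, i.e. X <= Y.
allA :: Pred -> Pred -> Bool
allA = leq

-- E form: No X is Y, i.e. X and Y are disjoint.
noE :: Pred -> Pred -> Bool
noE x y = all (\e -> not (x e && y e)) universe
```

Note that noE x y holds exactly when leq x (not . y) does, which is the monotonicity reading of E given above.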

<2019-06-19 Wed>

summary

We discussed negative polarity items, observing that they could only appear in the scope of a downward entailing operator:

  • Every student who has ever been to Minsk became rich
  • ∗Some student who has ever been to Minsk became rich
  • No student met any student
  • ∗Every student met any student

(recall the monotonicity profiles: -every+, +some+, and -no-)

We also noted that NPIs can occur in non-downward entailing contexts:

  • No boy met every girl who has ever been to Minsk.
  • \(\not\vdash\) No boy met every tall girl who has ever been to Minsk
  • \(\vdash\) No boy met every person who has ever been to Minsk

Thus, the first argument of every in the above sentences is actually upward entailing! This is surprising, because, in isolation, the first argument of every is downward entailing. The polarity of positions in sentences thus depends on the broader context of that position.

We showed that we can compute the polarity of a position by treating polarities boolean-style: negative polarity acts like the complement operation, positive polarity like the identity operation. Equivalently: negative polarity is like multiplying by -1, positive polarity like multiplying by +1. In the above sentences, the first argument of every is positive because it sits in the scope of exactly two negative environments (the second argument of no and the first argument of every), and a negative times a negative is a positive.

We can use this to perform a general kind of inference:

  1. if N occurs positively in M, and N ≤ N', then M[N] implies M[N']
  2. if N occurs negatively in M, and N ≤ N', then M[N'] implies M[N]
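The sign arithmetic behind these rules can be made explicit in Haskell. A small sketch (the names Polarity and netPolarity are my own, illustrative choices):

```haskell
-- Polarity of an environment: positive (identity) or negative (reversal).
data Polarity = Pos | Neg deriving (Eq, Show)

-- Nesting environments multiplies their signs:
-- two negatives make a positive, just as (-1) * (-1) = 1.
compose :: Polarity -> Polarity -> Polarity
compose p q = if p == q then Pos else Neg

-- The net polarity of a position under a stack of environments,
-- listed from outermost to innermost.
netPolarity :: [Polarity] -> Polarity
netPolarity = foldr compose Pos
```

For the first argument of every under the second argument of no, the environment stack is [Neg, Neg], so netPolarity yields Pos, matching the observation above.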

<2019-06-12 Wed>

summary

We discussed monotonicity:

upward
for all x and y, if x ≤ y then f x ≤ f y
downward
for all x and y, if x ≤ y then f x ≥ f y

Monotonicity only makes sense if both the input and the output domains of a function are ordered. Upward monotonicity is sometimes called respecting the order, and downward monotonicity is reversing the order.

We investigated the monotonicity properties of natural language expressions (the signs to the left and right of a determiner mark the monotonicity of its first and second argument; · means neither):

  - every +        + not every -      · exactly n ·
  + some  +        · most +           + at least n +
  - no    -        · few -            - at most n -

We also noted that proper names behave as though they were upward monotone. This is puzzling, as we treated them as denoting entities (not functions), and so we were unable to make sense of this. We noted that there were two ways to make the equation below work: \[\textit{john} + \textbf{sleep} = \textbf{sleep}(\textbf{j})\]

  1. \(\textit{john} = \textbf{j}\)
  2. \(\textit{john} = \lambda P.P\textbf{j}\)

This second option has the following properties:

  • it denotes in the domain \([[E\rightarrow T]\rightarrow T]\); i.e. it is a GQ
  • it can therefore be negated, conjoined, and disjoined with other GQs
    1. Bill but not John came to the party
    2. I saw either John or some tall girl
  • it is upward monotone
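Option 2 can be checked mechanically over a finite universe by enumerating all predicates. The following is a sketch under that assumption (the names john, upwardMonotone, and the choice of universe are illustrative):

```haskell
import Data.List (subsequences)

type E = Int
type Pred = E -> Bool
type GQ = Pred -> Bool

universe :: [E]
universe = [1 .. 4]

-- Every predicate over the finite universe, as a characteristic function.
allPreds :: [Pred]
allPreds = map (flip elem) (subsequences universe)

-- The subset order on predicates.
leq :: Pred -> Pred -> Bool
leq p q = all (\e -> not (p e) || q e) universe

-- Option 2: John as a GQ, \P. P j  (here j = 1).
john :: GQ
john p = p 1

-- A GQ f is upward monotone iff p <= q and f p imply f q.
upwardMonotone :: GQ -> Bool
upwardMonotone f =
  and [ f q | p <- allPreds, q <- allPreds, leq p q, f p ]
```

Enumerating all predicates is exponential in the size of the universe, but is fine for toy models like this one.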

<2019-06-05 Wed>

summary
We discussed extensions of the basic equivalences regarding compositions of GQs with (post-)complements.
  1. With more GQs
    • ditransitives (3 GQs)
      • Not every witness told every detective two or more lies
      • At least one witness told some detective fewer than two lies
    • tritransitives (?!?) These don't really seem to exist, but using 'light verbs' we can approximate them in English

      • Both bailiffs let three or more defendants pay each judge no bribe
      • Neither bailiff let fewer than three defendants pay no judge a bribe

      With a language with rich valence increasing morphology, like Hungarian or Turkish, we might be able to do this with causatives (except that causatives don't seem to stack so easily…)

  2. With more complex GQs
    • We can build complex GQs from simpler ones via the boolean operations:

      1. \(\neg F \wedge G\)
      2. \(F\neg \vee \neg G \neg\)
      3. \(F \vee G \vee H\)

      These are full fledged GQs, and so we can ask of them how their complements, post-complements and duals can be expressed. As the boolean operations are defined pointwise, we can compute these as follows:

      • \(\neg (F \wedge G) = \neg F \vee \neg G\)
      • \((F \wedge G)\neg = F\neg \wedge G\neg\)
    • We can build complex GQs from simpler ones using the possessor construction:

      • Every boy's pet hamster
      • Most girls' favorite thrash metal band

      Again, we can ask how to express their complements, post-complements and duals. Interestingly, the following is true:

      • \(\neg (F's N) = (\neg F)'s N\)
      • \((F's N)\neg = (F\neg)'s N\)
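The pointwise computations above can be verified exhaustively over a small universe. A sketch, with illustrative names of my own choosing:

```haskell
import Data.List (subsequences)

type E = Int
type Pred = E -> Bool
type GQ = Pred -> Bool

universe :: [E]
universe = [1 .. 3]

-- Every predicate over the finite universe.
allPreds :: [Pred]
allPreds = map (flip elem) (subsequences universe)

-- Pointwise boolean operations on GQs.
gqNot :: GQ -> GQ
gqNot f p = not (f p)

gqAnd, gqOr :: GQ -> GQ -> GQ
gqAnd f g p = f p && g p
gqOr  f g p = f p || g p

-- Post-complement: complement the argument.
postC :: GQ -> GQ
postC f p = f (not . p)

-- Extensional equality of GQs over the finite universe.
gqEq :: GQ -> GQ -> Bool
gqEq f g = all (\p -> f p == g p) allPreds
```

With these definitions, gqEq (gqNot (gqAnd f g)) (gqOr (gqNot f) (gqNot g)) and gqEq (postC (gqAnd f g)) (gqAnd (postC f) (postC g)) hold for any GQs f and g, mirroring the two equations above.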

<2019-05-29 Wed>

summary

We investigated some puzzling yet systematic equivalences:

  • Every student but John read at least as many plays as poems
  • No student but John read fewer plays than poems

We noted that this equivalence stays true regardless of which nouns and verbs we choose - it seems to be based purely on the nature of the determiners involved! We asked what about their meanings made this the case. We defined the operation of 'post-complement' (\(F\neg := \lambda x.F (\neg x)\)), and noted that for each generalized quantifier F, there is a 'square' of related ones: \(F,\ \neg F,\ F\neg,\ \neg F\neg\). These have the property that: \[F\neg \circ \neg G = F \circ G\] In other words 'facing negations' cancel out.

We observed that our equivalences involved pairs of quantifiers which were each other's complement/post-complement/dual.
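The identity \(F\neg \circ \neg G = F \circ G\) can also be checked model-theoretically, reading \(\circ\) as composing a subject GQ with an object GQ across a transitive verb denotation. A sketch under that reading (names are illustrative):

```haskell
type E = Int
type Pred = E -> Bool
type GQ = Pred -> Bool
type Rel = E -> E -> Bool

universe :: [E]
universe = [1 .. 3]

-- Subject GQ composed with object GQ over a transitive verb denotation.
comp :: GQ -> GQ -> Rel -> Bool
comp f g r = f (\x -> g (\y -> r x y))

-- Complement and post-complement.
gqNeg :: GQ -> GQ
gqNeg f p = not (f p)

postC :: GQ -> GQ
postC f p = f (not . p)

-- 'everything' and 'something' over the toy universe.
every', some' :: GQ
every' p = all p universe
some'  p = any p universe
```

Unfolding comp (postC f) (gqNeg g) r yields f applied to \(\lambda x.\neg\neg G(\lambda y.Rxy)\), and the facing negations cancel, giving comp f g r.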

<2019-05-22 Wed>

summary

We introduced the notion of the complement of an element of a boolean lattice. This only makes sense if the lattice is bounded (i.e. it has a least (0) and a greatest (1) element), in which case we defined a complement of an element x to be an element y such that: \[x \wedge y = 0\hspace{3em}\text{ and }\hspace{3em}x \vee y = 1\] A lattice where every element has a complement is called complemented.

If the lattice is also distributive (i.e. for all u, v, w: \(u \wedge (v\vee w) = (u\wedge v)\vee (u \wedge w)\)), then complements are unique!
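The uniqueness argument is short. Suppose \(y\) and \(z\) are both complements of \(x\). Then \[y = y \wedge 1 = y \wedge (x \vee z) = (y \wedge x) \vee (y \wedge z) = 0 \vee (y \wedge z) = y \wedge z,\] so \(y \le z\); by the symmetric computation \(z \le y\), and hence \(y = z\).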

We proposed that negation in natural language be interpreted as the complement operation in the relevant lattice!

<2019-05-15 Wed>

summary

We showed that, for B any boolean lattice, and A any set (no matter if it is a lattice or not), the set \([A \rightarrow B]\) of all functions from A to B is a boolean lattice, with operations defined pointwise:

  • \(f \le g \text{ iff } \forall x. f(x) \le g(x)\)
  • \(f \wedge g := \lambda x.f(x)\wedge g(x)\)

This makes us predict that sentences of the form [[DP1 and DP2] VP] should be equivalent to sentences of the form [[DP1 VP] and [DP2 VP]]. However, sentences of the form [DP [VP1 and VP2]] should not necessarily be equivalent to [[DP VP1] and [DP VP2]].
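Both predictions can be tested on a small model. The sketch below (with illustrative names of my own) treats DP meanings as GQs and conjoins both DPs and VPs pointwise:

```haskell
type E = Int
type Pred = E -> Bool
type GQ = Pred -> Bool

universe :: [E]
universe = [1, 2]

-- Pointwise conjunction on DP (GQ) meanings.
gqAnd :: GQ -> GQ -> GQ
gqAnd f g p = f p && g p

-- Pointwise conjunction on VP (predicate) meanings.
predAnd :: Pred -> Pred -> Pred
predAnd p q x = p x && q x

-- Some toy DP meanings.
john, bill, someOne :: GQ
john p = p 1
bill p = p 2
someOne p = any p universe
```

By definition, gqAnd john bill vp equals john vp && bill vp for every vp, while someOne (predAnd vp1 vp2) can differ from someOne vp1 && someOne vp2, as predicted.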

<2019-05-08 Wed>

reading
Keenan and Moss textbook, chapter 8 at least up to 8.3 (we will discuss 8.3 next time)
summary
We recalled the definition of partial orders, and introduced the concepts of upper/lower bounds (obere/untere Schranke) of two elements, defining a lattice (Verband) to be a partial order where any two elements always have least upper/greatest lower bounds. We noted that the truth functions of sentence conjunction and disjunction were in fact the same as lub/glb in the truth value lattice, and that intersection/union were lub/glb in the powerset lattice. We asserted that and and or were uniformly (across categories) interpreted as glb and lub operators in the relevant meaning space.
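The identifications made above can be stated directly: in the two-element truth-value lattice, glb and lub are just conjunction and disjunction, and in the powerset lattice they are intersection and union. A sketch, using lists for sets (names illustrative):

```haskell
-- In the truth-value lattice (False <= True),
-- greatest lower bound is (&&) and least upper bound is (||).
glb, lub :: Bool -> Bool -> Bool
glb = (&&)
lub = (||)

-- In the powerset lattice (ordered by subset),
-- glb is intersection and lub is union.
glbSet, lubSet :: Eq a => [a] -> [a] -> [a]
glbSet xs ys = filter (`elem` ys) xs
lubSet xs ys = xs ++ filter (`notElem` xs) ys
```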

<2019-05-01 Wed> - No Class

<2019-04-24 Wed>

reading
(optional) Keenan's handbook article The Semantics of Determiners
summary
We discussed semantic properties of determiners. We noted that the truth value of a determiner applied to two sets A and B (corresponding to the noun and verb phrase meanings) was determined sometimes by just \(A\cap B\), sometimes by just \(A - B\), and sometimes by both, but never by \(B - A\). This empirical observation/claim is called conservativity, and amounts to the claim that, whenever \(A\cap B = A \cap C\), then \(D A B = D A C\), for any determiner \(D\), and any sets \(A\), \(B\), and \(C\). As a corollary of this, we have that \(D\ A\ B = D\ A\ (A\cap B)\), instances of which can be translated naturally into natural language:
  • most students are happy ≡ most students are students who are happy
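Conservativity can be checked exhaustively over a small universe. The sketch below (determiner definitions and names are my own illustrative choices) tests most, and contrasts it with a relation that fails conservativity:

```haskell
import Data.List (subsequences, intersect)

type E = Int
type Det = [E] -> [E] -> Bool

universe :: [E]
universe = [1 .. 4]

-- 'most': more than half of A is in B, i.e. |A ∩ B| > |A − B|.
most :: Det
most a b = length (a `intersect` b) > length (filter (`notElem` b) a)

-- A determiner D is conservative iff D A B == D A (A ∩ B) for all A, B.
conservative :: Det -> Bool
conservative d =
  and [ d a b == d a (a `intersect` b)
      | a <- subsequences universe, b <- subsequences universe ]

-- For contrast, a non-conservative relation: 'only A are B' (B ⊆ A).
only :: Det
only a b = all (`elem` a) b
```

The check works because both \(A\cap B\) and \(A-B\) are unchanged when B is replaced by \(A\cap B\).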

<2019-04-17 Wed> - No Class

<2019-04-10 Wed>

reading
Winter's textbook, chapter 2
summary
We observed that our fragment was unable to account for entailment patterns such as those involving VP disjunction (from John laughed conclude that John laughed or cried). This was because our notion of situation was too liberal; situations were admitted which assigned truth to the first atomic sentence but falsity to the second. We needed a way of restricting the allowable situations so as to make this impossible. We did this by assuming that the truth or falsity of sentences was not assigned by a situation willy-nilly, but was a consequence of more basic aspects of the situation, such as the meanings of the parts of sentences in the situation.

<2019-04-03 Wed>

reading
Winter's textbook, chapter 2, pp 12-21
summary
We discussed semantics as an empirical science, concerned with the explanation of entailment patterns in natural language. The methodology of fragments was presented as the only reasonable way of making headway in a complex system. We discussed the syntax and semantics of sentential logic, from the perspective of a fragment of natural language concerned with the behaviour of and, or, and not as sentential operators. (Keenan & Moss' textbook gives a similar presentation of SL on pages 179-192.) The notion of a situation relevant to deciding entailment was an assignment of truth values to atomic sentences.
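The SL fragment can be sketched as a small Haskell datatype with an evaluator (names like Form and eval are my own, illustrative choices); a situation is just an assignment of truth values to atomic sentences:

```haskell
-- Syntax of the sentential-logic fragment: and, or, not over atoms.
data Form = Atom String
          | Not Form
          | And Form Form
          | Or  Form Form

-- A situation assigns a truth value to each atomic sentence.
type Situation = String -> Bool

-- Truth of a formula in a situation.
eval :: Situation -> Form -> Bool
eval s (Atom a)  = s a
eval s (Not f)   = not (eval s f)
eval s (And f g) = eval s f && eval s g
eval s (Or  f g) = eval s f || eval s g
```

A situation making "john-laughed" true (and everything else false) makes Or (Atom "john-laughed") (Atom "john-cried") true as well, the entailment pattern discussed in the 2019-04-10 entry.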

Author: Greg Kobele

Created: 2019-10-14 Mon 11:01
