Extra Materials:

This episode introduced the basics of compositional semantics. Compositional semantics is the study of how the meanings of the simple expressions that make up a language add up to the meanings of more complex expressions, all the way up to full sentences. And to get to the bottom of how words like verbs and adjectives and nouns work, we introduced some new notation — the lambda calculus — which lets us think about them in terms of functions.

Functions are a useful way of thinking about the meanings of words, because they’re set up to expect a certain input and deliver a very specific kind of output. In other words, they let us represent, in a very precise way, the relationships between one sort of thing and another. In the case of intransitive verbs, the functional meaning of the verb relates individuals to truth values; as long as you feed it the right kind of thing, it’ll tell you whether or not the verb truthfully applies to it.

     (1)    “crashed” = λ x . x crashed
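To make this concrete, here's a sketch in Python of the idea behind (1), assuming a made-up toy world where we just list which individuals crashed (the names and the set are our own invention, not part of the episode):

```python
# Toy world: a hypothetical set of individuals that crashed.
crashed_individuals = {"the car", "the server"}

# "crashed" = λx. x crashed — a function from individuals to truth values.
crashed = lambda x: x in crashed_individuals

print(crashed("the car"))    # True
print(crashed("the pilot"))  # False
```

Feed the function the right kind of thing (an individual), and it hands back true or false, just like the lambda term says.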

And the lambda calculus doesn’t stop there. We can use more complex functions to represent the meanings of transitive verbs and quantifiers, which relate pairs of individuals to truth values and pairs of sets to truth values, respectively.

     (2)    “saved” = λ y . λ x . x saved y

     (3)    “every” = λ F . λ G . F ⊆ G
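Here's one way to sketch (2) and (3) in Python, again over a hypothetical toy world of our own (the names, pairs, and domain are illustrative assumptions, not anything from the episode). The transitive verb is a curried function that takes the object first, then the subject; the quantifier relates two predicates over a small domain:

```python
# Toy world: hypothetical (subject, object) pairs for "saved".
saved_pairs = {("Mario", "Peach")}

# "saved" = λy. λx. x saved y — object in first, subject in second.
saved = lambda y: (lambda x: (x, y) in saved_pairs)
print(saved("Peach")("Mario"))  # True: Mario saved Peach

# "every" relates two sets, modeled here as predicates over a small domain:
# every F is G iff G holds of every individual that F holds of.
domain = {"Mario", "Peach", "Luigi"}
every = lambda F: (lambda G: all(G(x) for x in domain if F(x)))

is_hero = lambda x: x == "Mario"
print(every(is_hero)(saved("Peach")))  # True: every hero saved Peach
```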

But it isn’t enough to specify the meanings of each kind of word; in compositional semantics, we need to spell out the rules that combine these meanings, too. In most cases, it’s obvious what kind of rule we need: Functional Application, which just says that if you have some function and some compatible input, you apply the function to the input (i.e., feed the input to the function). Then, you see what comes out!
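As a rough sketch of Functional Application in Python (the helper's name and the toy meanings are our own, purely for illustration): whichever of the two sister nodes is the function gets applied to the other.

```python
# Functional Application, sketched: given two sister meanings, apply
# the one that's a function to the other one.
def functional_application(left, right):
    if callable(left):
        return left(right)
    return right(left)

# Toy meaning for "crashed", as before.
crashed = lambda x: x in {"the car"}

print(functional_application(crashed, "the car"))  # True
print(functional_application("the bus", crashed))  # False
```

Order doesn't matter here: the rule just checks which daughter is the function, then feeds it the other one and sees what comes out.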

Functional Application works most of the time, but not always. Sometimes, our sentences contain non-branching nodes, which is just to say that sometimes the structures in our sentences are very simple. Like, a noun phrase can often consist of just one noun. To explain how these kinds of simple phrases get their meaning, we also need a Non-Branching Nodes rule. This one’s pretty straightforward: it just passes the meaning up from one part of the tree to another. So, if the noun “photograph” has the meaning in (4), so does the noun phrase that contains it, as in (5).

     (4)    [N photograph] = λ x . x is a photograph

     (5)    [NP [N’ [N photograph]]] = λ x . x is a photograph
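The Non-Branching Nodes rule is about as simple as a rule can get, and a Python sketch of it (with made-up names and a toy denotation) is correspondingly tiny:

```python
# Non-Branching Nodes, sketched: a node with a single daughter just
# inherits that daughter's meaning unchanged.
def non_branching_node(daughter_meaning):
    return daughter_meaning

# Toy denotation for the noun "photograph".
photograph_N = lambda x: x in {"photo1", "photo2"}

# The NP containing just that noun means the very same thing.
photograph_NP = non_branching_node(photograph_N)
print(photograph_NP("photo1"))  # True
```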

We also face an interesting problem when it comes to adjectives. When an adjective behaves predicatively, as in (6), it combines with an individual and spits out either true or false, according to the function in (7).

     (6)    The Giant is hungry

     (7)    λ x . x is hungry

But adjectives can be used attributively, too, which means that they can combine with nouns to form complex noun phrases, as in (8).

     (8)    The hungry giant

Since the functions associated with both “hungry” and “giant” can’t combine with each other in the normal way (each is a function looking for an individual, so neither one is compatible with the other), we need something new. To handle these cases, we need Predicate Modification. Predicate Modification takes two functions of the same type and combines them into a third that incorporates both of their meanings by conjoining them.

     (9a)    “hungry” = λ x . x is hungry

     (9b)    “giant” = λ x . x is a giant

     (9c)    “hungry giant” = λ x . x is hungry and x is a giant

Because we glued the meanings of the adjective and the noun together using “and,” the NP “hungry giant” will only spit out true when the individual it’s applied to is both hungry and a giant. Super!
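Predicate Modification can be sketched in Python like this, with hypothetical toy sets standing in for who's hungry and who's a giant (all names here are our own):

```python
# Predicate Modification, sketched: two one-place predicates of the same
# type combine into a third that conjoins them with "and".
def predicate_modification(F, G):
    return lambda x: F(x) and G(x)

hungry = lambda x: x in {"the giant", "the wolf"}  # toy data
giant  = lambda x: x in {"the giant"}

hungry_giant = predicate_modification(hungry, giant)
print(hungry_giant("the giant"))  # True
print(hungry_giant("the wolf"))   # False: hungry, but not a giant
```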

Finally, we need rules that tell us how to interpret sentences involving movement. In the episode, we suggested that words sometimes need to move around in a sentence to properly combine with each other. But how does that affect the meaning? In the sentence in (10), we claimed that the object “most townspeople” has to move to the front because it can’t easily combine with the verb “saw.”

     (10)    The Giant saw most townspeople

That kind of movement would result in something like (11), where the moved constituent has left behind a trace.

     (11)    most townspeople The Giant saw <trace>

If we assume that traces are just like variables, and don’t really mean much on their own, we can introduce one last rule to guide us in interpreting these kinds of structures: Predicate Abstraction. It says that when a word moves somewhere and leaves a trace behind, the phrase it moved out of becomes a one-place function. So, “The Giant saw <trace>” acquires the functional meaning in (12).

     (12)    λ x . The Giant saw x
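In Python terms, Predicate Abstraction turns the remnant of movement into a one-place function, just like an intransitive verb. A sketch, assuming a made-up toy set of individuals the Giant saw:

```python
# Toy world: a hypothetical record of who the Giant saw.
seen_by_giant = {"Anna", "Ben"}

# "The Giant saw <trace>" ⇒ λx. The Giant saw x
# The trace acts as a variable, so the whole remnant becomes a function.
giant_saw = lambda x: x in seen_by_giant

print(giant_saw("Anna"))  # True
print(giant_saw("Cleo"))  # False
```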

Predicate Abstraction gives the otherwise incomplete phrase “The Giant saw” the same kind of meaning assigned to intransitive verbs. And this meaning is exactly the type of function that the quantified noun phrase “most townspeople” is looking for. When the functions combine with each other, as in (13), they spit out true just as long as The Giant saw more than half the townspeople!

     (13a)    S = λ x . The Giant saw x

     (13b)    [λ G . |T ∩ G| > ½|T|] (S) = |T ∩ S| > ½|T|
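The computation in (13) can be sketched in Python with hypothetical sets of our own choosing, where T is the set of townspeople and S is the abstracted predicate from (12):

```python
# Toy sets: townspeople, and who the Giant saw (both made up).
T = {"Anna", "Ben", "Cleo"}
seen_by_giant = {"Anna", "Ben"}

# S = λx. The Giant saw x
S = lambda x: x in seen_by_giant

# "most townspeople" = λG. more than half the townspeople are G.
most_townspeople = lambda G: sum(1 for x in T if G(x)) > len(T) / 2

print(most_townspeople(S))  # True: the Giant saw 2 of the 3 townspeople
```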

While there are still some words we haven’t covered, like prepositions and copulas and adverbs, these four basic rules provide us with just about everything we need to account for any simple sentence of English. We’ll be talking more about them in the future, of course, and adding a few more when we need to. But with the lambda calculus in place, we’ve finally gotten our hands on all the tools we need to talk about meaning!

 

Discussion:

So how about it? What do you all think? Let us know below, and we’ll be happy to talk with you about how we can calculate meanings for the sentences we make. There’s a lot of interesting stuff to say, and we want to hear what interests you!

