Semantic attachment
Semantic attachment is the process of creating the semantics of a sentence by attaching pieces of semantics to the nodes of the syntax tree. The hard part is composing these pieces.
Introduction
For some applications a semantic representation of a sentence is needed. Such a representation consists of a set of predications: relations between objects and events.
For example: the simple sentence
John walks
can be represented by this set of predications:
∃e1, o1 isa(e1, Walk) ∧ subject(e1, o1) ∧ name(o1, "John")
From here on we will leave out the existential quantifiers.
The problem is to get from the syntax tree
S
+--NP
| +-- proper noun: John
|
+--VP
+-- verb: walks
to the semantic representation.
I know of three ways that this is done.
Semantic specialists
[Winograd-1972] applies a set of semantic specialists to the syntax tree. Their job is to create complex semantic structures that represent the sentence's semantics. The specialist for the NP node uses the node of the proper noun below it to create its semantics. It may also inspect other nodes of the tree and check whether their meanings are compatible. Specialists may use domain-specific procedures to provide the correct meaning. Their form is procedural rather than declarative, and this makes them hard to extend.
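To make the procedural character concrete, here is a small sketch in Python of what such a specialist could look like. This is my own illustration, not Winograd's code; the Node class, the specialist functions and the tiny lexicon are all made up for this post.

# A node of the syntax tree; sem is filled in by a specialist.
class Node:
    def __init__(self, category, word=None, children=None):
        self.category = category            # "S", "NP", "VP", "propernoun" or "verb"
        self.word = word
        self.children = children or []
        self.sem = None

LEXICON = {"walks": "Walk"}                 # hypothetical dictionary entry

def np_specialist(np):
    # Uses the proper noun below the NP node to create the NP's semantics.
    propernoun = next(c for c in np.children if c.category == "propernoun")
    np.sem = [("name", "o1", propernoun.word)]

def vp_specialist(vp):
    # Looks up the verb; a real specialist could also call domain-specific
    # procedures here to check that the meanings of other nodes are compatible.
    verb = next(c for c in vp.children if c.category == "verb")
    vp.sem = [("isa", "e1", LEXICON[verb.word])]

def s_specialist(s):
    # Runs the specialists of its children and combines their results.
    np, vp = s.children
    np_specialist(np)
    vp_specialist(vp)
    s.sem = vp.sem + np.sem + [("subject", "e1", "o1")]

sentence = Node("S", children=[
    Node("NP", children=[Node("propernoun", word="John")]),
    Node("VP", children=[Node("verb", word="walks")]),
])
s_specialist(sentence)
print(sentence.sem)
# [('isa', 'e1', 'Walk'), ('name', 'o1', 'John'), ('subject', 'e1', 'o1')]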
Lambda calculus
Most texts I have read use lambda calculus to compose meaning; see for example [Jurafsky and Martin-2000]. I will work out the sentence above:
S {Sem = VP.Sem(NP.Sem) = λe isa(e, Walk) ∧ subject(e, <name(y, "John")>)}
+--NP {Sem = propernoun.Sem = λy name(y, "John")}
| +-- propernoun: John {Sem = λy name(y, "John")}
|
+--VP {Sem = verb.Sem = λx,e isa(e, Walk) ∧ subject(e, x)}
+-- verb: walks {Sem = λx,e isa(e, Walk) ∧ subject(e, x)}
The verb node gets its semantics from the dictionary entry of walk.
The proper noun gets its semantics from a procedure that creates a predication, name(y, "John"), for the word John.
Both the VP and the NP node inherit their semantics from their child node.
Only the S node performs actual composition, by applying the semantics of the VP, as a function, to the semantics of the NP. The result is a new function with only one unbound argument: the event variable e.
In this technique each node exports a lambda function that has zero or more (unbound) arguments. This lambda function is then used as an argument for the lambda function of the node above. The order of the exposed arguments is important.
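The composition for this example can be sketched in a few lines of Python, with Python lambdas standing in for the lambda expressions and plain strings for the predications. This is my own illustration, not code from the book.

# Dictionary entry for "walks": λx,e isa(e, Walk) ∧ subject(e, x)
verb_sem = lambda x: lambda e: f'isa({e}, Walk) ∧ subject({e}, {x})'

# Semantics of the proper noun "John": λy name(y, "John")
propernoun_sem = lambda y: f'name({y}, "John")'

# NP and VP simply inherit the semantics of their single child.
np_sem = propernoun_sem
vp_sem = verb_sem

# The S node applies VP.Sem, as a function, to NP.Sem.
# The event variable e stays unbound: the result is again a function.
s_sem = vp_sem(np_sem('y'))

print(s_sem('e'))
# isa(e, Walk) ∧ subject(e, name(y, "John"))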
The technique is simple for the given example, but as soon as more complex sentences are tried, it quickly requires special constructs and becomes less intuitive ([Jurafsky and Martin-2000], section 15.2). This is mainly due to the rigid handling of variables.
Feature unification
The Core Language Engine [Alshawi-1992], one of the most advanced NLP systems I know, largely gave up on lambda calculus because of its unnecessary complexity. Instead, they used feature unification, the same technique used to ensure that tense and number match in syntactic interpretation. Evidently this technique is capable of supporting the semantification of complex sentences.
However, their system still requires a rather complex feature structure in each dictionary entry, and it seems to me hard to use for the uninitiated user.
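To give an impression of the mechanism, here is a very small unification routine in Python, with feature structures represented as nested dictionaries. It is a simplification of my own (no variables, no shared substructures), not the CLE's actual machinery.

def unify(a, b):
    # Returns the unification of two feature structures, or None on a clash.
    if not isinstance(a, dict) or not isinstance(b, dict):
        return a if a == b else None        # atomic values must be identical
    result = dict(a)
    for feature, value in b.items():
        if feature in result:
            merged = unify(result[feature], value)
            if merged is None:
                return None                 # feature clash
            result[feature] = merged
        else:
            result[feature] = value
    return result

# Matching agreement features unify ...
print(unify({"number": "singular"}, {"number": "singular", "person": "3rd"}))
# {'number': 'singular', 'person': '3rd'}
# ... while a number clash makes unification fail.
print(unify({"number": "singular"}, {"number": "plural"}))
# None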
My alternative
To describe the technique I came up with, I will use the example above:
S {Sem = VP.Sem ∧
| NP.Sem ∧
| subject(S.event, S.subject) ∧
| S.event = VP.event ∧
| S.subject = NP.object
| = isa(S.event, Walk) ∧ subject(S.event, S.subject) ∧ name(S.subject, "John")}
|
+--NP {Sem = propernoun.Sem ∧
| | NP.object = propernoun.object
| | = name(NP.object, "John")}
| |
| +-- propernoun: John {Sem = name(propernoun.object, "John")}
|
+--VP {Sem = verb.Sem ∧
| VP.event = verb.event
| = isa(VP.event, Walk)}
|
+-- verb: walks {Sem = isa(verb.event, Walk)}
I will explain the new ideas in this technique:
- Each syntax tree node has a set of properties. The verb and the VP node have the property event. The noun and the NP node have the property object. The S node has the properties subject and event. Other nodes have other properties, and the nodes named here have additional properties that are not used in this example.
This is what sets the technique apart from the others. Instead of using variables that are in themselves meaningless, I use properties: objects with a predefined meaning. The property VP.event, for example, designates the event object at the VP node. Because its meaning is built-in, the property may be passed to the semantic attachment above without complicated structures. The property may be seen as a role: VP.event is the object that plays the role of event within the context of the VP.
- When a property is propagated upwards, its role may change. We see this here when NP.Sem is passed to the S node. The property NP.object is changed into S.subject, via the assignment S.subject = NP.object. This says: the object that played the role of object in the NP node now plays the role of subject in the S node.
- A semantic attachment may contain three types of constructs:
- Copy child semantics (e.g. S.Sem = VP.Sem)
- Introduce new predications (e.g. subject(S.event, S.subject))
- Map child properties to the node's own properties (e.g. S.subject = NP.object)
- Composition is mostly done by conjunction. Conjunction is simpler than functional application: just add predications together with ∧, as in S.Sem = VP.Sem ∧ NP.Sem.
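To show that this needs very little machinery, here is a sketch in Python of the attachments for the example above. Each node's semantics is a pair of its properties (a mapping from role name to variable) and a list of predications. It is only an illustration of the idea; all names in it were made up for this post.

import itertools

_variables = itertools.count(1)

def new_variable():
    # Introduces a fresh variable; the properties give these variables their meaning.
    return f"x{next(_variables)}"

def propernoun_sem(word):
    obj = new_variable()
    return {"object": obj}, [("name", obj, word)]

def verb_sem():
    # From the dictionary entry of "walks".
    event = new_variable()
    return {"event": event}, [("isa", event, "Walk")]

def np_sem(propernoun):
    props, preds = propernoun
    # NP.object = propernoun.object; the child semantics is copied as-is.
    return {"object": props["object"]}, preds

def vp_sem(verb):
    props, preds = verb
    # VP.event = verb.event; the child semantics is copied as-is.
    return {"event": props["event"]}, preds

def s_sem(np, vp):
    np_props, np_preds = np
    vp_props, vp_preds = vp
    # Map child properties to own properties: S.event = VP.event and
    # S.subject = NP.object (the object of the NP now plays the role of subject).
    props = {"event": vp_props["event"], "subject": np_props["object"]}
    # Copy child semantics by conjunction and introduce one new predication.
    preds = vp_preds + np_preds + [("subject", props["event"], props["subject"])]
    return props, preds

props, preds = s_sem(np_sem(propernoun_sem("John")), vp_sem(verb_sem()))
print(preds)
# [('isa', 'x2', 'Walk'), ('name', 'x1', 'John'), ('subject', 'x2', 'x1')]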
I am just starting to use this technique. When I have worked with it for a while I will report on the progress. Only then will I be able to give more examples.
References
[Winograd-1972] Understanding Natural Language - Terry Winograd
[Jurafsky and Martin-2000] Speech and Language Processing - Daniel Jurafsky and James H. Martin
[Alshawi-1992] The Core Language Engine - Hiyan Alshawi, ed.