Semantic attachment

Thursday, 27 December 2012, 10:52

Semantic attachment is the process of creating the semantics of a sentence by attaching pieces of semantics to the nodes of the syntax tree. The hard part lies in composing these pieces.

Introduction

For some applications a semantic representation of a sentence is needed. Such a representation consists of a set of predications: relations between objects and events.

For example: the simple sentence

John walks

can be represented by this set of predications:

∃e1, o1: isa(e1, Walk) ∧ subject(e1, o1) ∧ name(o1, "John")

From here on we will leave out the existential quantifiers.
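To make this concrete: such a set could be stored as plain tuples. The sketch below (Python) is my own illustration, not part of any system discussed here; e1 and o1 are ordinary identifiers for the implicitly quantified event and object:

# A predication as a tuple of a relation name and its arguments.
semantics = {
    ("isa", "e1", "Walk"),
    ("subject", "e1", "o1"),
    ("name", "o1", "John"),
}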

The problem is to get from the syntax tree

S
+--NP
|    +-- proper noun: John
|
+--VP
     +-- verb: walks

to the semantic representation.
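For later reference, the tree itself can be written down as a small recursive structure. The Node class and its field names below are my own choices:

# Hypothetical node type for the tree above.
class Node:
    def __init__(self, category, word=None, children=()):
        self.category = category   # "S", "NP", "VP", "propernoun", "verb"
        self.word = word           # the word, for leaf nodes
        self.children = list(children)

tree = Node("S", children=[
    Node("NP", children=[Node("propernoun", word="John")]),
    Node("VP", children=[Node("verb", word="walks")]),
])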

I know of three ways in which this is done.

Semantic specialists

[Winograd-1972] applies a set of semantic specialists to the syntax tree. Their job is to build the complex semantic structures that represent the sentence's meaning. The specialist for the NP node uses the proper noun node below it to create its semantics. It may also inspect other nodes of the tree and check whether their meanings are compatible. Specialists may use domain-specific procedures to provide the correct meaning. Their form is procedural rather than declarative, which makes them hard to extend.
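To give an impression: a specialist boils down to an arbitrary procedure keyed to a node category. The sketch below is my own reading of the idea, not Winograd's actual code, and reuses the Node class from the earlier sketch:

# Hypothetical NP specialist: a procedure that may inspect the whole
# tree and build the node's semantics however it likes.
def np_specialist(node, tree):
    proper_noun = node.children[0]
    # Domain-specific checks could go here (is this name known? does it
    # fit the surrounding nodes?).
    return {("name", "o1", proper_noun.word)}

specialists = {"NP": np_specialist}

np_node = tree.children[0]   # the NP from the earlier sketch
print(specialists[np_node.category](np_node, tree))
# {('name', 'o1', 'John')}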

Lambda calculus

Most texts I have read use lambda calculus to compose meaning; see for example [Jurafsky and Martin-2000]. I will work out the sentence above:

S {Sem = VP.Sem(NP.Sem) = λe isa(e, Walk) ∧ subject(e, <name(y, "John")>)}
+--NP {Sem = propernoun.Sem = λy name(y, "John")}
|    +-- propernoun: John {Sem = λy name(y, "John")}
|
+--VP {Sem = verb.Sem = λx,e isa(e, Walk) ∧ subject(e, x)}
     +-- verb: walks {Sem = λx,e isa(e, Walk) ∧ subject(e, x)}

The verb node gets its semantics from the dictionary entry of walk.
The proper noun gets its semantics from a procedure that creates a predication, name(y, "John"), for the word John.
Both the VP node and the NP node inherit their semantics from their child node.
Only the S node performs actual composition, by applying the semantics of the VP, like a function, to the semantics of the NP.
S.Sem (the semantics of S) applies the function VP.Sem to the argument NP.Sem. The result is a new function with only one unbound argument.

In this technique each node exports a lambda function that has zero or more (unbound) arguments. This lambda function is then used as an argument for the lambda function of the node above. The order of the exposed arguments is important.
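In a language with closures, this composition can be played out directly. The sketch below is my own (Python), not from any of the texts cited; curried functions stand in for the lambda terms, and, following the tree above, the whole NP term is passed as the argument of VP.Sem:

# My sketch of the lambda-calculus attachment for "John walks".
# Curried Python lambdas stand in for the lambda terms above.
np_sem = lambda y: ("name", y, "John")                # λy name(y, "John")
vp_sem = lambda x: lambda e: (("isa", e, "Walk"),     # λx,e isa(e, Walk)
                              ("subject", e, x))      #      ∧ subject(e, x)

# S.Sem = VP.Sem(NP.Sem): applying VP.Sem to NP.Sem leaves a function
# of one unbound argument, the event e.
s_sem = vp_sem(np_sem)
print(s_sem("e1"))
# (('isa', 'e1', 'Walk'), ('subject', 'e1', <the embedded NP term>))

Note that the NP term ends up embedded inside the subject predication, just like the angle brackets <name(y, "John")> in the tree above.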

It is a simple technique for the given example, but as soon as more complex sentences are tried, it requires special constructs and quickly becomes less intuitive ([Jurafsky and Martin-2000], section 15.2). This is mainly due to the rigid handling of variables.

Feature unification

The Core Language Engine [Alshawi-1992], one of the most advanced NLP systems I know, largely gave up on lambda calculus because of its unnecessary complexity. Instead, it uses feature unification, the same technique used to ensure that tense and number match in syntactic interpretation. Evidently this technique is capable of supporting the semantic interpretation of complex sentences.

However, their system still requires a rather complex feature structure in each dictionary entry, and seems to me hard to use for the uninitiated user.
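To give an impression of the mechanism: feature structures can be modelled as nested dictionaries that are merged recursively. The sketch below is my own and far simpler than the CLE's machinery:

# Minimal unification: merge two feature structures, failing (None) on
# conflicting atomic values.
def unify(a, b):
    if isinstance(a, dict) and isinstance(b, dict):
        result = dict(a)
        for key, value in b.items():
            if key in result:
                result[key] = unify(result[key], value)
                if result[key] is None:
                    return None          # conflict below this feature
            else:
                result[key] = value
        return result
    return a if a == b else None         # atoms must match exactly

# Semantics rides along in the same structure as syntactic agreement.
s = unify({"number": "sg"}, {"number": "sg", "sem": {"isa": "Walk"}})
print(s)   # {'number': 'sg', 'sem': {'isa': 'Walk'}}
print(unify({"number": "sg"}, {"number": "pl"}))   # None: no agreement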

My alternative

To describe the technique I came up with, I will use the example above:

S {Sem = VP.Sem ∧
|        NP.Sem ∧
|        subject(S.event, S.subject) ∧
|        S.event = VP.event ∧
|        S.subject = NP.object
|      = isa(S.event, Walk) ∧ subject(S.event, S.subject) ∧ name(S.subject, "John")}
|
+--NP {Sem = propernoun.Sem ∧
|     |      NP.object = propernoun.object
|     |    = name(NP.object, "John")}
|     |
|    +-- propernoun: John {Sem = name(propernoun.object, "John")}
|
+--VP {Sem = verb.Sem ∧
      |      VP.event = verb.event
      |    = isa(VP.event, Walk)}
      |
     +-- verb: walks {Sem = isa(verb.event, Walk)}

I will explain the new ideas in this technique:

  1. Each syntax tree node has a set of properties. The verb and the VP node have the property event. The noun and the NP node have the property object. The S node has the properties subject and event. Other nodes have other properties, and the nodes shown here have further properties that are not used in this example.
    This is what sets the technique apart from the others. Instead of using variables that are in themselves meaningless, I use properties: objects with a predefined meaning. The property VP.event, for example, designates the event object at the VP node. Because its meaning is built-in, the property may be passed to the semantic attachment above without complicated structures. The property may be seen as a role: VP.event is the object that plays the role of event within the context of the VP.
  2. When a property is propagated upwards, its role may change. We see this here when NP.Sem is passed to the S node. The property NP.object is changed into S.subject, via the assignment S.subject = NP.object. This says: the object that played the role of object in the NP node now plays the role of subject in the S node.
  3. A semantic attachment may contain three types of constructs:
    • Copy child semantics (e.g. S.Sem = VP.Sem)
    • Introduce new predications (e.g. subject(S.event, S.subject))
    • Map child properties to self properties (e.g. S.subject = NP.object)
  4. Composition is mostly done by applying conjunction. Conjunction is simpler than functional application: just add predications together with ∧, as in S.Sem = VP.Sem ∧ NP.Sem. A sketch of the whole technique follows this list.
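To show how little machinery the technique needs, here is my own minimal rendering of it in Python for the example sentence; the dictionary-based node representation is illustrative only:

# My sketch of the property-based technique for "John walks". Each node
# carries a set of predications (Sem) and named properties with a
# predefined meaning, such as event and object.
propernoun = {"Sem": {("name", "o1", "John")}, "object": "o1"}
verb       = {"Sem": {("isa", "e1", "Walk")},  "event": "e1"}

# NP: copy child semantics; map propernoun.object to NP.object.
np = {"Sem": set(propernoun["Sem"]), "object": propernoun["object"]}

# VP: copy child semantics; map verb.event to VP.event.
vp = {"Sem": set(verb["Sem"]), "event": verb["event"]}

# S: map VP.event to S.event and NP.object to S.subject, conjoin (union)
# the child semantics, and introduce a new predication.
s = {"event": vp["event"], "subject": np["object"]}
s["Sem"] = vp["Sem"] | np["Sem"] | {("subject", s["event"], s["subject"])}

print(s["Sem"])
# {('isa', 'e1', 'Walk'), ('name', 'o1', 'John'), ('subject', 'e1', 'o1')}

All three constructs from point 3 appear here, and the composition at the S node is nothing more than a set union plus one new predication.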

I am just starting to use this technique. When I have worked with it for a while I will report on my progress; only then will I be able to give more examples.

References

[Winograd-1972] Understanding Natural Language - Terry Winograd
[Jurafsky and Martin-2000] Speech and Language Processing - Daniel Jurafsky and James H. Martin
[Alshawi-1992] The Core Language Engine - Hiyan Alshawi, ed.

Labels
nlp
