A Geometric Algebra Based Distributional Model to Encode Sentences Semantics
Agnese Augello; Manuel Gentile; Giovanni Pilato
2014
Abstract
Word space models encode the semantics of natural language elements by means of high-dimensional vectors [23]. The Latent Semantic Analysis (LSA) methodology [15] is well known and widely used for its generalization properties. Despite its good performance in several applications, the model induced by LSA ignores changes in sentence meaning that depend on the order of the words, because it is based on a bag-of-words analysis. In this chapter we present a technique that exploits LSA-based semantic spaces and geometric algebra in order to obtain a sub-symbolic encoding of sentences that takes the word sequence of the sentence into account. © 2014 Springer-Verlag Berlin Heidelberg.
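The abstract only summarizes the approach, but the property it relies on — that the geometric product of vectors is non-commutative, and therefore order-sensitive — can be illustrated with a toy sketch. The snippet below is a hypothetical illustration, not the chapter's actual encoding scheme: it implements the geometric product in the 2D Clifford algebra Cl(2) and shows that composing two "word" vectors in different orders yields different multivectors, whereas a bag-of-words sum would be identical for both orders.

```python
# Hypothetical sketch: geometric (Clifford) product in Cl(2).
# Multivectors are 4-tuples of coefficients over the basis (1, e1, e2, e12).
# The word vectors below are made-up examples, not LSA vectors from the chapter.

def geometric_product(a, b):
    """Geometric product of two Cl(2) multivectors (a0, a1, a2, a3)."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (
        a0*b0 + a1*b1 + a2*b2 - a3*b3,  # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,  # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,  # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,  # e12 (bivector) part
    )

def encode(words):
    """Compose word vectors left-to-right with the geometric product."""
    result = (1.0, 0.0, 0.0, 0.0)  # multiplicative identity
    for w in words:
        result = geometric_product(result, w)
    return result

# Two toy "word" vectors (grade-1 elements: scalar and bivector parts are 0).
cat = (0.0, 1.0, 0.0, 0.0)
dog = (0.0, 0.6, 0.8, 0.0)

# Order matters under the geometric product: the bivector part flips sign.
print(encode([cat, dog]))  # (0.6, 0.0, 0.0, 0.8)
print(encode([dog, cat]))  # (0.6, 0.0, 0.0, -0.8)
```

The scalar part of the product is the familiar (order-independent) dot-product similarity; the bivector part records the oriented plane spanned by the two vectors, and its sign depends on the order of composition — which is exactly the kind of information a bag-of-words model discards.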
| File | Description | Type | Size | Format | Access |
|---|---|---|---|---|---|
| prod_282497-doc_81405.pdf | A geometric algebra based distributional model to encode sentences semantics | Publisher's version (PDF) | 582.02 kB | Adobe PDF | Not available (request a copy) |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


