Semantic and temporal relationship between gestures and speech in a picture-naming task
Rinaldi, Pasquale; Volterra, Virginia
2010
Abstract
This study explores the semantic and temporal relationship between spontaneous gestures and speech produced by 45 hearing children (age range 24-37 months) in a picture-naming task. The five pictures depicting objects and the five pictures depicting actions that elicited the most representational gestures were chosen for this analysis. The gestures produced were analyzed by distinguishing between "action" gestures and "size and shape" gestures. The relationship between gesture and speech was analyzed in terms of temporal synchrony/asynchrony, including mutual placement, and semantic match/mismatch, using ELAN (EUDICO Linguistic Annotator). Results show that gestures and speech mainly convey similar meanings and that the gestural stroke was produced synchronously with speech in the majority of cases. Onset time appears to be influenced by the item: gestures tend to be produced before the spoken word onset for pictures depicting actions, but after it for pictures depicting objects/animals. The results seem to support the Information Packaging Hypothesis (Alibali et al., 2000), in which gestures and speech appear as an integrated system.


