Special Coverage
The Contribution of Articulatory Gestures and Orthography to Speech Processing: Evidence from Novel Word Learning (31)
Chotiga Pattamadilok and Pauline Welby (Aix-Marseille University), Michael Tyler (Western Sydney University)
Summary by Brett Myers, Digital Content Associate Editor
This recap is part of a special series of session summaries from the Psychonomic Society's 61st Annual Meeting.
When Words Speak Louder Than Actions
Chotiga Pattamadilok discusses
her recent research with Pauline Welby and Michael
Tyler regarding visual cues that contribute to speech
processing.
From an early age, we learn to associate speech sounds with
articulatory gestures and orthographic symbols. Pattamadilok points out that
articulatory gestures are associated with early stages of speech perception
(according to the motor theory), and written words are associated with later
stages of speech perception (according to connectionist models).

The team set out to compare these two types of
visual cues directly and determine whether they affect word learning in the same way.
They designed a study using minimal pairs of English
pseudo-words, each associated with an unfamiliar object. The participants,
native French speakers, learned to associate each pseudo-word with an
object under one of three teaching conditions: auditory only, auditory with visual
articulatory gesture, or auditory with written word.

The training included passive exposure to word-object
associations and active training, in which participants heard a word and had to
identify the correct object from the minimal pair. After training, participants
completed discrimination and picture-word matching tasks, both
immediately after training and again one day later.
Results showed that the three training methods had a similar effect on
immediate posttest performance: there was no benefit from audiovisual
presentation. One day later, however, auditory-only performance had
declined, auditory-orthographic performance had improved, and
auditory-articulatory performance remained stable.

Interestingly, pairing the written word with the auditory
presentation had the strongest residual effect on learning. You may have heard
that actions speak louder than words, but in the case of speech
perception, words speak louder than actions.