Tuesday, July 17, 2012

Touching tactile tactics for tapping new pronunciation?


Clip art: Clker
Previous posts have alluded to the fact that students working with haptic-integrated pronunciation change often report beginning to "listen" with their bodies, as if they have recorded a word or phrase by "moving" with it or mirroring what was said. (Recent research on mirror neurons, of course, strongly supports that observation.) Two fascinating studies summarized by Science Daily address the underlying mechanisms that may be involved. In the first, researchers at Yale trained subjects to pronounce new sounds using a robotic device attached to their jaws. As the subjects learned to produce the sounds, they also became substantially better at hearing them; as the summary puts it, " . . . Learning to talk also changes the way speech sounds are heard . . . " Wow. In the second, a team at the University of British Columbia basically "confused" subjects into thinking they were hearing aspirated consonants (when they were actually hearing voiced, unaspirated consonants) by gently hitting them in the back of the neck with a small burst of air on targeted sounds. (That's right. Got to try that sometime!) The first study was a bit more kinaesthetic than tactile; the second, decidedly more tactile. In both cases, the haptic or tactile "anchoring" dramatically affected perception of the sounds.

That is also the intent of the haptic-integrated protocols of the EHIEP system. The idea is to train learners to haptically anchor new sounds or patterns, what we call "MAMs" (more appropriate models, produced with movement and touch along with articulation of the sound), at places in the visual field that are as proprioceptively, visually and perceptually distinct as possible from the learner's "inaccurate" or less appropriate current version of the sound. The summary of the UBC study begins with this great line: "Humans use their whole bodies, not just their ears, to understand speech . . . " Really.
