Tuesday, April 19, 2016

Gesture the cause of pronunciation problems?

That's right! You should try it! Here's why . . .

Referring to the ways in which learners' L1s differ from their L2s is generally not a priority in pronunciation teaching--or in general language instruction. In some contexts, however, especially EFL-like courses where phonetics or translation serves as the point of departure, the structure of the L1 may be among the early topics addressed. Nonetheless, for a number of reasons, many contemporary methodologists avoid it. A quick, informal poll among colleagues recently turned up a nice range of opinions:

"Why confuse things?"
"Best avoided."
"Not that confident, myself."
"May cause even more interference."     

That last comment is interesting. Clearly, if not done carefully or well, that could be the case. So, how might you "do that well"? (If you have some suggestions in that regard, in addition to the one I am about to recommend, please post a comment with them!)

In haptic pronunciation teaching, we often and very effectively lead learners across "gestural bridges" between L1 and L2 phonological elements, such as individual sounds (vowels and consonants), rhythm patterns and tone movement (intonation). We do that by having learners mirror us or a video as they perform "pedagogical movement patterns" (PMPs), gestures synchronized with speech, that represent both the L1 and L2 sounds or sound patterns--and often the relative distance between them--in the visual space in front of the learner.

Recently published research by Carlson, Jacobs, Perry and Church in Gesture, "The effect of gestured instruction on the learning of physical causality problems," suggests why the "contrastive haptic PMP approach" may work. (Now granted, the analogy between video instruction on how gears work and the relationship between how an L1 sound is physically articulated and how its L2 near-equivalent is--a relationship that may cause serious interference or negative transfer--may be something of a stretch! But stick with me here!)

In the study, subjects viewed a video in which the instructor either (a) explained the process without gesturing or (b) followed a "speech plus gesture" protocol. The researchers' conclusion:

"Results showed that . . .  instruction was . . .  significantly more effective when gesture was added. These findings shed light on the role of gesture input in adult learning and carry implications for how gesture may be utilized in asynchronous instruction with adults."

What the conclusion misses, but what is unpacked in the article, is the potential importance of the nature of the concept being taught in the first place--as signaled in the title: physical causality, that is, how the contact and motion of one gear affect the state and movement of the other. In other words, the impact of the gestural protocol was so pronounced, in part, because it was portraying and embodying a physical process.

Studies of the connection between gesture and more abstract, far less embodied concepts, such as the interpretation of emotion or intent, are understandably much less consistent. Pronunciation of a language, on the other hand, is an essentially physical, somatic process. Hence, using gesture (and touch) to anchor it makes perfect sense.

Just thought I'd point that out . . .




1 comment:

  1. I agree that gesture helps. Big muscles helping small ones.
