Wednesday, January 4, 2012

Haptic anchoring and anchor redux: better felt and not seen for optimal conflux

Research by Lécuyer on what is known as "pseudo-haptic feedback" dramatically demonstrates the potential dominance of the visual modality over the haptic. When provided with contradictory feedback, such as seeing a distorted image of what we are touching, the brain will favor the visual image, especially in determining the size or shape of the object in view. (On questions of texture or other material properties, the balance may swing in the other direction.)

HICP/EHIEP "haptic integration" attempts to consistently shift perception toward the "material properties" of a sound and away from its orthographic image, which, in turn, may be associated with inaccurate or underdeveloped pronunciation. So attention to the conflux of the visual shape of the word and its auditory properties must be secondary--as noted in several other posts based on other disciplines, e.g., Lessac. That should result not only in more efficient encoding and anchoring of new sounds but also in more effective "haptic monitoring" during spontaneous speech.

One of the most common reports from learners is the "return" of the clear, momentary felt sense of a sound being worked on, either as it is pronounced more accurately or when it is still being used inaccurately--what we call "anchor redux." Those events, which are established and anticipated in the mind of the learner through several aspects of the system (what is termed "future pacing" in hypnotic work), are one of the basic benchmarks of HICP. Should you not see my point at this point . . . I'm sure you'll get a feel for it later . . .
