Document Type: Research Paper

Authors

1. PhD in Motor Behavior, Shahid Beheshti University of Tehran

2. Associate Professor, Faculty of Physical Education and Sport Sciences, Shahid Beheshti University of Tehran

3. Assistant Professor, Rehabilitation Research Center, School of Rehabilitation, Department of Rehabilitation Basic Sciences, Iran University of Medical Sciences, Tehran, Iran

Abstract

This study investigated the effect of audiovisual integration on perception, action, and perception-to-action transfer. Thirty subjects were selected and randomly divided into three groups: visual, visual-auditory (single-channel), and visual-auditory (two-channel). Subjects in the visual group watched the movement pattern of a skilled basketball player, while the other groups received, in addition to watching the pattern, a sonification of elbow angular velocity (single-channel group) or of elbow and wrist angular velocity (two-channel group). The pattern was presented five times, and participants answered ten questions about different aspects of it. They then completed parameter recognition and pattern recognition tests. For each of the elbow and wrist joints, four variables were measured: maximum flexion angular velocity error (MfavE), maximum extension angular velocity error (MeavE), maximum range error of flexion (MrEf), and total parameter time (TPt). Results showed significant differences between the experimental groups in "percent confidence of reply" and "reply to questions". In the movement reproduction stage, there were significant differences between groups in MeavE and TPt for both the elbow and wrist joints, in favor of the audiovisual groups. This study supports the positive effect of visual-auditory integration on perception and reproduction. Moreover, positive perception-to-action transfer was confirmed for the two audiovisual groups. The results are consistent with the modality appropriateness hypothesis and indicate that, given the precise temporal adjustments required by the basketball jump shot, audition is the modality that dominates perception in this task.
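The study's own sonification code is not reproduced here. As a rough illustration of the general technique the abstract describes (parameter-mapping sonification, in which a kinematic variable such as elbow angular velocity drives an audio parameter such as pitch), the following minimal Python sketch shows one way such a mapping could be implemented. All names and settings (sonify_angular_velocity, the 220-880 Hz pitch range, the sampling rates) are hypothetical choices for illustration, not the study's actual parameters.

    import numpy as np
    import wave

    def sonify_angular_velocity(omega, fs_motion=100, fs_audio=44100,
                                f_min=220.0, f_max=880.0):
        """Map a joint angular-velocity trace to an audio pitch contour.

        omega: 1-D array of angular velocities (deg/s) sampled at fs_motion Hz.
        Returns a mono audio signal at fs_audio Hz whose instantaneous
        frequency rises and falls with |omega|.
        """
        t_motion = np.arange(len(omega)) / fs_motion
        t_audio = np.arange(int(t_motion[-1] * fs_audio)) / fs_audio
        # Resample the velocity trace to the audio rate, normalize to [0, 1].
        v = np.interp(t_audio, t_motion, np.abs(omega))
        v = (v - v.min()) / (v.max() - v.min() + 1e-12)
        # Linear pitch mapping; integrate frequency for a continuous phase.
        freq = f_min + (f_max - f_min) * v
        phase = 2 * np.pi * np.cumsum(freq) / fs_audio
        return 0.5 * np.sin(phase)

    if __name__ == "__main__":
        # Synthetic flexion-extension burst standing in for a recorded trace.
        t = np.linspace(0.0, 1.0, 100)
        omega = 400.0 * np.sin(2.0 * np.pi * t)  # deg/s, illustrative only
        audio = sonify_angular_velocity(omega)
        with wave.open("elbow_sonification.wav", "w") as wf:
            wf.setnchannels(1)
            wf.setsampwidth(2)          # 16-bit samples
            wf.setframerate(44100)
            wf.writeframes((audio * 32767).astype(np.int16).tobytes())

A two-channel variant along the lines of the study's second audiovisual condition could, for instance, render the elbow and wrist traces to the left and right stereo channels respectively.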
