Zopf, R., Friedman, J., & Williams, M. A. (2015). The plausibility of visual information for hand ownership modulates multisensory synchrony perception. Experimental Brain Research, 233(8), 2311–2321.
Abstract: We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one’s own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one’s own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing—visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one’s own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation in the presence of visual information plausible for one’s own body is a consequence of body-specific sensory predictions.
Zopf, R., Truong, S., Finkbeiner, M., Friedman, J., & Williams, M. A. (2011). Viewing and feeling touch modulates hand position for reaching. Neuropsychologia, 49(5), 1287–1293.
Abstract: Action requires knowledge of our body location in space. Here we asked if interactions with the external world prior to a reaching action influence how visual location information is used. We investigated whether the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm prior to participants performing a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of these data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and experience of self-location.