Records
Author: Zopf, R.; Friedman, J.; Williams, M.A.
Title: The plausibility of visual information for hand ownership modulates multisensory synchrony perception
Type: Journal Article
Year: 2015
Publication: Experimental Brain Research
Volume: 233
Issue: 8
Pages: 2311–2321
Keywords: Multisensory perception; Temporal synchrony perception; Virtual hand; Body representations; Body ownership; Sensory predictions
Abstract: We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one’s own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one’s own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing: visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one’s own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation, which occurs when visual information plausible for one’s own body is present, is a consequence of body-specific sensory predictions.
Publisher: Springer Berlin Heidelberg
Language: English
ISSN: 0014-4819
Approved: no
Serial: 78
 

 
Author: Zopf, Regine; Truong, Sandra; Finkbeiner, Matthew; Friedman, Jason; Williams, Mark A.
Title: Viewing and feeling touch modulates hand position for reaching
Type: Journal Article
Year: 2011
Publication: Neuropsychologia
Volume: 49
Issue: 5
Pages: 1287–1293
Abstract: Action requires knowledge of our body location in space. Here we asked whether interactions with the external world prior to a reaching action influence how visual location information is used. We investigated whether the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm before participants performed a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of these data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and the experience of self-location.
Approved: no
Call Number: Penn State @ write.to.jason @
Serial: 23