Zopf, R., Friedman, J., & Williams, M. A. (2015). The plausibility of visual information for hand ownership modulates multisensory synchrony perception. Experimental Brain Research, 233(8), 2311–2321.
Abstract: We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one’s own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one’s own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing—visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one’s own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation when visual information plausible for one’s own body is present is a consequence of body-specific sensory predictions.
Awasthi, B., Sowman, P. F., Friedman, J., & Williams, M. A. (2013). Distinct spatial scale sensitivities for early categorisation of faces and places: Neuromagnetic and behavioural findings. Frontiers in Human Neuroscience, 7(91).
Abstract: Research exploring the role of spatial frequencies in rapid stimulus detection and categorisation reports flexible reliance on specific spatial frequency bands. Here, through a set of behavioural and magnetoencephalography (MEG) experiments, we investigated the role of low spatial frequency (LSF) and high spatial frequency (HSF) (>25 cpf) information during the categorisation of faces and places. Reaction time measures revealed significantly faster categorisation of faces driven by LSF information, while rapid categorisation of places was facilitated by HSF information. The MEG study showed significantly earlier latency of the M170 component for LSF faces compared to HSF faces. Moreover, the M170 amplitude was larger for LSF faces than for LSF places, whereas the reverse pattern was evident for HSF faces and places. These results suggest that spatial frequency modulates the processing of category specific information for faces and places.