Dempsey-Jones, H., Wesselink, D. B., Friedman, J., & Makin, T. R. (2019). Organized Toe Maps in Extreme Foot Users. Cell Reports, 28(11), 2748–2756.e4.
Abstract: Although the fine-grained features of topographic maps in the somatosensory cortex can be shaped by everyday experience, it is unknown whether behavior can support the expression of somatotopic maps where they do not typically occur. Unlike the fingers, represented in all primates, individuated toe maps have only been found in non-human primates. Using 1-mm resolution fMRI, we identify organized toe maps in two individuals born without either upper limb who use their feet to substitute missing hand function and even support their profession as foot artists. We demonstrate that the ordering and structure of the artists’ toe representation mimics typical hand representation. We further reveal “hand-like” features of activity patterns, not only in the foot area but also similarly in the missing hand area. We suggest humans may have an innate capacity for forming additional topographic maps that can be expressed with appropriate experience.
Zopf, R., Friedman, J., & Williams, M. A. (2015). The plausibility of visual information for hand ownership modulates multisensory synchrony perception. Experimental Brain Research, 233(8), 2311–2321.
Abstract: We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one’s own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one’s own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing—visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one’s own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation when visual information plausible for one’s own body is present is a consequence of body-specific sensory predictions.