Zacks, O., & Friedman, J. (2020). Analogies can speed up the motor learning process. Scientific Reports, 10(1), 6932.
Abstract: Analogies have been shown to improve motor learning in various tasks and settings. In this study we tested whether applying analogies can shorten the motor learning process and induce insight and skill improvement in tasks that usually demand many hours of practice. Kinematic measures were used to quantify participants' skill and learning dynamics. For this purpose, we used a drawing task, in which subjects drew lines to connect dots, and a mirror game, in which subjects tracked a moving stimulus. After establishing a baseline, subjects were given an analogy, explicit instructions, or no further instruction. We compared their improvement in skill (quantified by coarticulation or smoothness), accuracy, and movement duration. Subjects in the analogy and explicit groups improved their coarticulation in the target task, while significant differences were found in the mirror game only at a slow movement frequency between analogy and controls. We conclude that a verbal analogy can be a useful tool for rapidly changing motor kinematics and movement strategy in some circumstances, although in the tasks selected it did not produce better performance on most measurements than explicit guidance. Furthermore, we observed that different movement facets may improve independently of others, and may be selectively affected by verbal instructions. These results suggest an important role for the type of instruction in motor learning.
Zopf, R., Friedman, J., & Williams, M. A. (2015). The plausibility of visual information for hand ownership modulates multisensory synchrony perception. Experimental Brain Research, 233(8), 2311–2321.
Abstract: We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one's own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one's own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing—visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one's own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation when visual information plausible for one's own body is present is a consequence of body-specific sensory predictions.