Latash, M. L., Friedman, J., Kim, S. W., Feldman, A. G., & Zatsiorsky, V. M. (2010). Prehension synergies and control with referent hand configurations. Experimental Brain Research, 202(1), 213–229.
Abstract: We used the framework of the equilibrium-point hypothesis (in its updated form based on the notion of referent configuration) to investigate the multi-digit synergies at two levels of a hypothetical hierarchy involved in prehensile actions. Synergies were analyzed at the thumb-virtual finger level (virtual finger is an imaginary digit with the mechanical action equivalent to that of the four actual fingers) and at the individual finger level. The subjects performed very quick vertical movements of a handle into a target. A load could be attached off-center to provide a pronation or supination torque. In a few trials, the handle was unexpectedly fixed to the table and the digits slipped off the sensors. In such trials, the hand stopped at a higher vertical position and rotated into pronation or supination depending on the expected torque. The aperture showed non-monotonic changes with a large, fast decrease and further increase, ending up with a smaller distance between the thumb and the fingers as compared to unperturbed trials. Multi-digit synergies were quantified using indices of co-variation between digit forces and moments of force across unperturbed trials. Prior to the lifting action, high synergy indices were observed at the individual finger level while modest indices were observed at the thumb-virtual finger level. During the lifting action, the synergies at the individual finger level disappeared while the synergy indices became higher at the thumb-virtual finger level. The results support the basic premise that, within a given task, setting a referent configuration may be described with a few referent values of variables that influence the equilibrium state, to which the system is attracted. Moreover, the referent configuration hypothesis can help interpret the data related to the trade-off between synergies at different hierarchical levels.
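The synergy indices mentioned in this abstract are co-variation measures computed across repeated trials. As a rough illustration only (not the authors' code; the variable names, the simple two-element Jacobian, and the data are invented), the sketch below computes a UCM-style ΔV index for two elemental variables, e.g. thumb and virtual-finger vertical forces, with their sum as the stabilized performance variable:

```python
import numpy as np

def synergy_index(elemental_vars, jacobian):
    """Delta-V index: across-trial co-variation of elemental variables that
    stabilizes the performance variable(s) defined by the linear map `jacobian`
    (rows: performance variables, columns: elemental variables)."""
    data = np.asarray(elemental_vars, dtype=float)   # trials x elements
    demeaned = data - data.mean(axis=0)
    J = np.atleast_2d(jacobian).astype(float)

    # Right singular vectors: the first k span the row space of J (orthogonal
    # to the UCM), the remaining ones span its null space (the UCM itself).
    _, _, Vt = np.linalg.svd(J)
    k = J.shape[0]
    ort_basis, ucm_basis = Vt[:k].T, Vt[k:].T

    v_ucm = (demeaned @ ucm_basis).var(axis=0, ddof=1).sum() / ucm_basis.shape[1]
    v_ort = (demeaned @ ort_basis).var(axis=0, ddof=1).sum() / ort_basis.shape[1]
    v_tot = demeaned.var(axis=0, ddof=1).sum() / demeaned.shape[1]
    return (v_ucm - v_ort) / v_tot   # > 0: a performance-stabilizing synergy

# Fabricated example: thumb and virtual-finger vertical forces over 20 trials
# sharing a nearly constant total load of about 10 N.
rng = np.random.default_rng(0)
share = rng.normal(0.0, 0.5, 20)
forces = np.column_stack([5 + share, 5 - share]) + rng.normal(0.0, 0.05, (20, 2))
print(synergy_index(forces, [[1.0, 1.0]]))   # near the ceiling of 2 for this case
```

A positive index means most across-trial variance lies in directions that leave the combined output unchanged; in this toy example it approaches its ceiling of 2 because nearly all of the variance is mutually compensated.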
|
Kapur, S., Friedman, J., Zatsiorsky, V. M., & Latash, M. L. (2010). Finger interaction in a three-dimensional pressing task. Experimental Brain Research, 203(1), 101–118.
Abstract: Accurate control of forces produced by the fingers is essential for performing object manipulation. This study examines the indices of finger interaction when accurate time profiles of force are produced in different directions, while using one of the fingers or all four fingers of the hand. We hypothesized that patterns of unintended force production among shear force components may involve features not observed in the earlier studies of vertical force production. In particular, we expected to see unintended forces generated by non-task fingers not only in the direction of the instructed force but also in the opposite direction, as well as substantial force production in directions orthogonal to the instructed direction. We also tested the hypothesis that multi-finger synergies, quantified using the framework of the uncontrolled manifold hypothesis, help reduce across-trials variance of both total force magnitude and direction. Young, healthy subjects were required to produce accurate ramps of force in five different directions by pressing on force sensors with the fingers of the right (dominant) hand. The index finger induced the smallest unintended forces in non-task fingers. The little finger showed the smallest unintended forces when it was a non-task finger. Task fingers showed substantial force production in directions orthogonal to the intended force direction. During four-finger tasks, individual force vectors typically pointed off the task direction, with these deviations nearly perfectly matched to produce a resultant force in the task direction. Multi-finger synergy indices reflected strong co-variation in the space of finger modes (commands to fingers) that reduced variability of the total force magnitude and direction across trials. The synergy indices increased in magnitude over the first 30% of the trial time and then stayed at a nearly constant level. The synergy index for stabilization of total force magnitude was higher for shear force components as compared to the downward pressing force component. The results suggest complex interactions between enslaving and synergic force adjustments, possibly reflecting experience with everyday prehensile tasks. For the first time, the data document multi-finger synergies stabilizing both shear force magnitude and force vector direction. These synergies may play a major role in stabilizing the hand action during object manipulation.
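The "finger modes" referred to above are hypothetical commands recovered from measured forces through an enslaving matrix. A minimal sketch, assuming a made-up 4x4 enslaving matrix and invented force values (the study itself analyzed three force components per finger), might look like this:

```python
import numpy as np

# Invented enslaving matrix (rows/columns: index, middle, ring, little).
# Off-diagonal entries stand for the unintended force produced by non-task
# fingers when a single finger is instructed to press.
E = np.array([
    [1.00, 0.15, 0.08, 0.05],
    [0.20, 1.00, 0.25, 0.10],
    [0.10, 0.30, 1.00, 0.30],
    [0.05, 0.10, 0.25, 1.00],
])

def forces_to_modes(finger_forces, enslaving=E):
    """Convert measured finger forces (trials x 4) to finger modes,
    assuming forces = modes @ enslaving.T, i.e. f = E m for each trial."""
    return np.asarray(finger_forces, dtype=float) @ np.linalg.inv(enslaving).T

# One made-up trial of normal forces (N) -> estimated commands to the fingers.
print(forces_to_modes([[4.0, 6.0, 3.0, 2.0]]))
```

Variance analyses of the kind described in the abstract would then be run on these mode values rather than on the raw forces, so that enslaving does not masquerade as co-variation.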
|
Nahab, F., Kundu, P., Gallea, C., Kakareka, J., Pursley, R., Pohida, T., et al. (2011). The neural processes underlying self-agency. Cerebral Cortex, 21(1), 48–55.
Abstract: Self-agency (SA) is the individual’s perception that an action is the consequence of his/her own intention. The neural networks underlying SA are not well understood. We carried out a novel, ecologically valid, virtual-reality experiment using BOLD-fMRI where SA could be modulated in real time while subjects performed voluntary finger movements. Behavioral testing was also performed to assess the explicit judgment of SA. Twenty healthy volunteers completed the experiment. Results of the behavioral testing demonstrated paradigm validity along with the identification of a bias that led subjects to over- or underestimate the amount of control they had. The fMRI experiment identified two discrete networks. These leading and lagging networks likely represent a spatial and temporal flow of information, with the leading network serving the role of mismatch detection and the lagging network receiving this information and mediating its elevation to conscious awareness, giving rise to SA.
|
Finkbeiner, M., & Friedman, J. (2011). The flexibility of nonconsciously deployed cognitive processes: Evidence from masked congruence priming. PLoS ONE, 6(2), e17095.
Abstract: Background
It is well accepted in the subliminal priming literature that task-level properties modulate nonconscious processes. For example, in tasks with a limited number of targets, subliminal priming effects are limited to primes that are physically similar to the targets. In contrast, when a large number of targets are used, subliminal priming effects are observed for primes that share a semantic (but not necessarily physical) relationship with the target. Findings such as these have led researchers to conclude that task-level properties can direct nonconscious processes to be deployed exclusively over central (semantic) or peripheral (physically specified) representations.
Principal Findings
We find distinct patterns of masked priming for “novel” and “repeated” primes within a single task context. Novel primes never appear as targets and thus are not seen consciously in the experiment. Repeated primes do appear as targets, thereby lending themselves to the establishment of peripheral stimulus-response mappings. If the source of the masked priming effect were exclusively central or peripheral, then both novel and repeated primes should yield similar patterns of priming. In contrast, we find that both novel and repeated primes produce robust, yet distinct, patterns of priming.
Conclusions
Our findings indicate that nonconsciously elicited cognitive processes can be flexibly deployed over both central and peripheral representations within a single task context. While we agree that task level properties can influence nonconscious processes, our findings sharply constrain the extent of this influence. Specifically, our findings are inconsistent with extant accounts which hold that the influence of task-level properties is strong enough to restrict the deployment of nonconsciously elicited cognitive processes to a single type of representation (i.e. central or peripheral).
|
Zopf, R., Truong, S., Finkbeiner, M., Friedman, J., & Williams, M. A. (2011). Viewing and feeling touch modulates hand position for reaching. Neuropsychologia, 49(5), 1287–1293.
Abstract: Action requires knowledge of our body location in space. Here we asked if interactions with the external world prior to a reaching action influence how visual location information is used. We investigated if the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm prior to participants performing a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of these data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and experience of self-location.
|