Finkbeiner, M., & Friedman, J. (2011). The flexibility of nonconsciously deployed cognitive processes: Evidence from masked congruence priming. PLoS ONE, 6(2), e17095.
Abstract: It is well accepted in the subliminal priming literature that task-level properties modulate nonconscious processes. For example, in tasks with a limited number of targets, subliminal priming effects are limited to primes that are physically similar to the targets. In contrast, when a large number of targets are used, subliminal priming effects are observed for primes that share a semantic (but not necessarily physical) relationship with the target. Findings such as these have led researchers to conclude that task-level properties can direct nonconscious processes to be deployed exclusively over central (semantic) or peripheral (physically specified) representations.
We find distinct patterns of masked priming for “novel” and “repeated” primes within a single task context. Novel primes never appear as targets and thus are not seen consciously in the experiment. Repeated primes do appear as targets, thereby lending themselves to the establishment of peripheral stimulus-response mappings. If the source of the masked priming effect were exclusively central or peripheral, then both novel and repeated primes should yield similar patterns of priming. In contrast, we find that both novel and repeated primes produce robust, yet distinct, patterns of priming.
Our findings indicate that nonconsciously elicited cognitive processes can be flexibly deployed over both central and peripheral representations within a single task context. While we agree that task-level properties can influence nonconscious processes, our findings sharply constrain the extent of this influence. Specifically, our findings are inconsistent with extant accounts, which hold that the influence of task-level properties is strong enough to restrict the deployment of nonconsciously elicited cognitive processes to a single type of representation (i.e., central or peripheral).
Friedman, J., Brown, S., & Finkbeiner, M. (2013). Linking cognitive and reaching trajectories via intermittent movement control. Journal of Mathematical Psychology, 57(3-4), 140–151.
Abstract: Theories of decision-making have traditionally been constrained by reaction time data. A limitation of reaction time data, particularly for studying the temporal dynamics of cognitive processing, is that they index only the endpoint of the decision-making process. Recently, physical reaching trajectories have been used as proxies for underlying mental trajectories through decision space. We suggest that this approach has been oversimplified: while it is possible for the motor control system to access the current state of the evidence accumulation process, this access is intermittent. We demonstrate that a model of arm movements that assumes intermittent, not continuous, access to the decision process is sufficient to describe the effects of stimulus quality and viewing time in curved reaching movements.
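The core idea of the abstract above, that the motor system samples a continuously evolving evidence accumulator only at discrete intervals, can be illustrated with a minimal simulation. This is a hedged sketch, not the authors' actual model: the function name, parameter values, and the simple drift-plus-noise accumulator are all illustrative assumptions.

```python
import random

def simulate_reach(drift, n_steps=200, dt=0.01, sample_every=25,
                   noise=0.1, seed=0):
    """Illustrative sketch: evidence accumulates continuously, but the
    motor system reads the accumulator only every `sample_every` steps,
    updating the reach heading intermittently rather than continuously."""
    rng = random.Random(seed)
    evidence = 0.0   # noisy evidence accumulator (decision process)
    heading = 0.0    # lateral heading currently used by the arm
    x = 0.0          # lateral position of the hand
    trajectory = []
    for step in range(n_steps):
        # Decision process: continuous noisy accumulation of evidence.
        evidence += drift * dt + rng.gauss(0, noise) * dt ** 0.5
        # Motor system: intermittent access to the decision process.
        if step % sample_every == 0:
            heading = max(-1.0, min(1.0, evidence))  # clipped readout
        x += heading * dt
        trajectory.append(x)
    return trajectory
```

Because the heading is updated only at the sampling points, the simulated trajectory is piecewise linear between samples, which is the kind of signature that distinguishes intermittent from continuous control in curved reaching movements.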
Zopf, R., Truong, S., Finkbeiner, M., Friedman, J., & Williams, M. A. (2011). Viewing and feeling touch modulates hand position for reaching. Neuropsychologia, 49(5), 1287–1293.
Abstract: Action requires knowledge of our body location in space. Here we asked if interactions with the external world prior to a reaching action influence how visual location information is used. We investigated whether the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm prior to participants performing a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of these data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and the experience of self-location.