Author: Zopf, Regine; Truong, Sandra; Finkbeiner, Matthew; Friedman, Jason; Williams, Mark A.
Title: Viewing and feeling touch modulates hand position for reaching
Type: Journal Article
Year: 2011
Publication: Neuropsychologia
Volume: 49
Issue: 5
Pages: 1287-1293
Abstract: Action requires knowledge of our body location in space. Here we asked if interactions with the external world prior to a reaching action influence how visual location information is used. We investigated if the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm prior to participants performing a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of this data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and experience of self-location.
Approved: no
Call Number: Penn State @ write.to.jason @ Serial 23
 

 
Author: Zopf, R.; Friedman, J.; Williams, M.A.
Title: The plausibility of visual information for hand ownership modulates multisensory synchrony perception
Type: Journal Article
Year: 2015
Publication: Experimental Brain Research
Volume: 233
Issue: 8
Pages: 2311-2321
Keywords: Multisensory perception; Temporal synchrony perception; Virtual hand; Body representations; Body ownership; Sensory predictions
Abstract: We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one's own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one's own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing—visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one's own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation when visual information plausible for one's own body is present is a consequence of body-specific sensory predictions.
Publisher: Springer Berlin Heidelberg
Language: English
ISSN: 0014-4819
Approved: no
Call Number: Serial 78
 

 
Author: Zacks, O.; Friedman, J.
Title: Analogies can speed up the motor learning process
Type: Journal Article
Year: 2020
Publication: Scientific Reports
Abbreviated Journal: Sci Rep
Volume: 10
Issue: 1
Pages: 6932
Abstract: Analogies have been shown to improve motor learning in various tasks and settings. In this study we tested whether applying analogies can shorten the motor learning process and induce insight and skill improvement in tasks that usually demand many hours of practice. Kinematic measures were used to quantify participants' skill and learning dynamics. For this purpose, we used a drawing task, in which subjects drew lines to connect dots, and a mirror game, in which subjects tracked a moving stimulus. After establishing a baseline, subjects were given an analogy, explicit instructions or no further instruction. We compared their improvement in skill (quantified by coarticulation or smoothness), accuracy and movement duration. Subjects in the analogy and explicit groups improved their coarticulation in the target task, while significant differences were found in the mirror game only at a slow movement frequency between analogy and controls. We conclude that a verbal analogy can be a useful tool for rapidly changing motor kinematics and movement strategy in some circumstances, although in the tasks selected it did not produce better performance than explicit guidance on most measures. Furthermore, we observed that different movement facets may improve independently from others, and may be selectively affected by verbal instructions. These results suggest an important role for the type of instruction in motor learning.
Address: Dept. of Physical Therapy, Stanley Steyer School of Health Professions, Sackler Faculty of Medicine, Tel Aviv University, Tel Aviv, Israel
Language: English
ISSN: 2045-2322
Notes: PMID:32332826; PMCID:PMC7181737
Approved: no
Call Number: Penn State @ write.to.jason @ Serial 105
 

 
Author: Wilf, M.; Korakin, A.; Bahat, Y.; Koren, O.; Galor, N.; Dagan, O.; Wright, W.G.; Friedman, J.; Plotnik, M.
Title: Using virtual reality-based neurocognitive testing and eye tracking to study naturalistic cognitive-motor performance
Type: Journal Article
Year: 2024
Publication: Neuropsychologia
Abbreviated Journal: Neuropsychologia
Volume: 194
Pages: 108744
Keywords: Humans; Aged; Eye-Tracking Technology; Cognition; Executive Function; Virtual Reality; Aging; Color trails test; Fall risk; Hand kinematics; Pupil; Virtual reality
Abstract: Natural human behavior arises from continuous interactions between the cognitive and motor domains. However, assessments of cognitive abilities are typically conducted using pen and paper tests, i.e., in isolation from “real life” cognitive-motor behavior and in artificial contexts. In the current study, we aimed to assess cognitive-motor task performance in a more naturalistic setting while recording multiple motor and eye tracking signals. Specifically, we aimed to (i) delineate the contribution of cognitive and motor components to overall task performance and (ii) probe for a link between cognitive-motor performance and pupil size. To that end, we used a virtual reality (VR) adaptation of a well-established neurocognitive test for executive functions, the 'Color Trails Test' (CTT). The VR-CTT involves performing 3D reaching movements to follow a trail of numbered targets. To tease apart the cognitive and motor components of task performance, we included two additional conditions: a condition where participants only used their eyes to perform the CTT task (using an eye tracking device), incurring reduced motor demands, and a condition where participants manually tracked visually-cued targets without numbers on them, incurring reduced cognitive demands. Our results from a group of 30 older adults (>65) showed that reducing cognitive demands shortened completion times more extensively than reducing motor demands. Conditions with higher cognitive demands had longer target search time, as well as decreased movement execution velocity and head-hand coordination. We found larger pupil sizes in the more cognitively demanding conditions, and an inverse correlation between pupil size and completion times across individuals in all task conditions. Lastly, we found a possible link between VR-CTT performance measures and clinical signatures of participants (fallers versus non-fallers). In summary, performance and pupil parameters were mainly dependent on task cognitive load, while maintaining systematic interindividual differences. We suggest that this paradigm opens the possibility for more detailed profiling of individual cognitive-motor performance capabilities in older adults and other at-risk populations.
Address: Center of Advanced Technologies in Rehabilitation, Sheba Medical Center, Israel; Department of Physiology and Pharmacology, Faculty of Medicine, Tel Aviv University, Tel Aviv, Israel; Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel. Electronic address: Meir.Plotnik@sheba.health.gov.il
Language: English
ISSN: 0028-3932
Notes: PMID:38072162
Approved: no
Call Number: Serial 123
 

 
Author: Thorpe, A.; Friedman, J.; Evans, S.; Nesbitt, K.; Eidels, A.
Title: Mouse Movement Trajectories as an Indicator of Cognitive Workload
Type: Journal Article
Year: 2022
Publication: International Journal of Human-Computer Interaction
Volume: 38
Issue: 15
Pages: 1464-1479
Abstract: Assessing the cognitive impact of user interfaces is a shared focus of human-computer interaction researchers and cognitive scientists. Methods of cognitive assessment based on data derived from the system itself, rather than external apparatus, have the potential to be applied in a range of scenarios. The current study applied methods of analyzing kinematics to mouse movements in a computer-based task, alongside the detection response task, a standard workload measure. Sixty-five participants completed a task in which stationary stimuli were targeted using a mouse, with a within-subjects factor of task workload based on the number of targets to be hovered over with the mouse (one/two), and a between-subjects factor based on whether both targets (exhaustive) or just one target (minimum-time) needed to be hovered over to complete a trial when two targets were presented. Mouse movement onset times were slower and mouse movement trajectories exhibited more submovements when two targets were presented than when one target was presented. Responses to the detection response task were also slower in this condition, indicating higher cognitive workload. However, these differences were only found for participants in the exhaustive condition, suggesting those in the minimum-time condition were not affected by the presence of the second target. Mouse movement trajectory results agreed with other measures of workload and task performance. Our findings suggest this analysis can be applied to workload assessments in real-world scenarios.
ISSN: 1044-7318
Approved: no
Call Number: Serial 117