Abstract
It is a well-accepted idea that tactile stimulus location is automatically recoded from its place on the skin into a 3D location in space. Such "remapping" would facilitate integration of touch with visual signals and allow reaching movements to be planned toward tactile and visual targets in a common manner. Much of the research on tactile-spatial coding has used choice paradigms with crossed hands or feet. In this posture, humans often confuse which of their limbs has been touched, and this finding has been taken to indicate a conflict between the stimulated limb's body side (e.g., right hand) and its current location in space (e.g., left side due to the crossed posture).
Recent findings from our lab, however, are incompatible with this view. They suggest that crossing effects arise because tactile-spatial processing considers a touched limb's default position (e.g., the right side of space for the right hand) and that these effects are, therefore, due to 3D-spatial limb coding, not 3D-spatial stimulus coding. I will lay out this new concept of tactile coding, discuss how it accounts for the experimental evidence previously presumed to support the 3D-stimulus-remapping view, and advance the idea that movement planning toward touch may proceed markedly differently from movement planning toward visually perceived targets.
Please note that access to the campus in the Olympiapark from 4:30 p.m. is only possible with a work ID card/ZHS ID card. If you would like to take part in the event, please register by emailing kerstin.laimgruber(at)tum.de
Zoom link: https://tum-conf.zoom.us/j/94253017197 Meeting ID: 942 5301 7197, Passcode: 033542