Our Projects


Haptics is a high-dimensional, multisensory, and multimodal goal-directed behavior that engages a large-scale distributed neural network. To generate stable representations of objects, our brain recruits high-level cognitive functions that dynamically interact with low-level sensorimotor representations. Our lab studies the neural dynamics along this sensorimotor-cognitive axis using multiple technologies and animal models.


Haptic Perception & Control


Our ability to perceive and manipulate objects with our hands begins in the periphery, where cutaneous and proprioceptive receptors encode and transmit object-related information to the central nervous system. These cutaneous inputs (e.g., tactile signals from the skin) and proprioceptive inputs (e.g., the spatial configuration of the fingers enclosing the object) are combined by selective functional ensembles, which then give rise to holistic representations of the object.


Our lab studies these integration mechanisms under both passive and active sensing conditions. Studies show that these integration effects are mediated by distinct sets of neural ensembles in the somatosensory system.


Some neural populations integrate tactile and proprioceptive signals through linear mechanisms and occur early in the processing stream. The remaining populations use nonlinear integration mechanisms and emerge from feedback from higher-order areas. A major focus of our lab is to understand the role of these integration mechanisms in motion- and form-related tactile perception.
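To make the distinction concrete, here is a minimal toy sketch (in Python, with hypothetical inputs and weights, not our actual analysis pipeline) contrasting a linear readout that simply sums tactile and proprioceptive drive with a nonlinear readout that adds a multiplicative interaction term, and showing how the nonlinear case leaves variance that a purely additive model cannot explain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-trial inputs: tactile drive and proprioceptive drive
# (arbitrary units), e.g. indentation depth and finger-joint configuration.
tactile = rng.uniform(0.0, 1.0, size=1000)
proprio = rng.uniform(0.0, 1.0, size=1000)

def linear_integration(t, p, w_t=0.6, w_p=0.4):
    """Early-stage-like response: a weighted sum of the two inputs."""
    return w_t * t + w_p * p

def nonlinear_integration(t, p, w_t=0.6, w_p=0.4, w_tp=1.5):
    """Later-stage-like response: adds a multiplicative interaction term,
    so the response to the combined inputs is not predictable from the
    responses to each input alone."""
    return w_t * t + w_p * p + w_tp * t * p

r_lin = linear_integration(tactile, proprio)
r_nl = nonlinear_integration(tactile, proprio)

# A simple signature of nonlinearity: the variance a purely additive model
# fails to explain. For the linear population this residual is ~0.
X = np.column_stack([tactile, proprio, np.ones_like(tactile)])
for name, r in [("linear", r_lin), ("nonlinear", r_nl)]:
    coef, *_ = np.linalg.lstsq(X, r, rcond=None)
    resid = r - X @ coef
    print(f"{name}: unexplained variance fraction = {resid.var() / r.var():.3f}")
```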


Goal-Directed Neural Mechanisms in Touch


Haptics is a multistage process that begins with the decision to reach towards an object (e.g., a glass of water) with the intention of manipulating it and realizing a goal (e.g., drinking the water). The neural implementation of this goal-directed process is initiated in executive-control areas, which in turn activate neural ensembles in sensory cortices that encode relevant features of the to-be-grasped object. Our studies show that top-down attention mediates the selection of relevant features by enhancing the spike-spike synchrony between neural ensembles encoding the attended features.

[Figure: spike-spike synchrony between neural ensembles encoding attended features]
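For readers unfamiliar with the measure, the sketch below illustrates one common way to quantify spike-spike synchrony between two simultaneously recorded neurons: bin the spike trains and correlate the binned counts at zero lag. The spike times, bin size, and correlation measure here are illustrative assumptions, not a description of our recordings or analyses.

```python
import numpy as np

def binned_spike_train(spike_times, duration, bin_size=0.005):
    """Convert spike times (seconds) into counts per 5 ms bin."""
    edges = np.arange(0.0, duration + bin_size, bin_size)
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts

def zero_lag_synchrony(train_a, train_b):
    """Pearson correlation of the two binned trains at zero lag.
    Values near 0 mean independent firing; larger values mean the
    two neurons tend to spike in the same bins."""
    return np.corrcoef(train_a, train_b)[0, 1]

# Hypothetical example: two neurons sharing a fraction of their spikes
# (e.g. when they encode the same attended feature) versus an independent one.
rng = np.random.default_rng(1)
duration = 10.0                                      # seconds
shared = np.sort(rng.uniform(0, duration, 60))       # jointly driven spikes
a = np.sort(np.concatenate([shared, rng.uniform(0, duration, 40)]))
b = np.sort(np.concatenate([shared + rng.normal(0, 0.001, shared.size),
                            rng.uniform(0, duration, 40)]))
c = np.sort(rng.uniform(0, duration, 100))           # independent neuron

a_binned = binned_spike_train(a, duration)
b_binned = binned_spike_train(b, duration)
c_binned = binned_spike_train(c, duration)

print("synchrony(a, b):", zero_lag_synchrony(a_binned, b_binned))
print("synchrony(a, c):", zero_lag_synchrony(a_binned, c_binned))
```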

We have also developed theoretical models of how higher-order executive-control neurons interact with low-level sensory cortices to mediate the selection of behaviorally relevant tactile features.
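As an illustration of the kind of interaction such models capture, the sketch below implements one standard modeling motif, multiplicative attentional gain: a top-down signal boosts the responses of sensory units tuned to the attended feature while leaving other units largely unchanged. The tuning curves, gain value, and feature axis are hypothetical and are not taken from our models.

```python
import numpy as np

def sensory_response(stimulus, preferred, width=0.2):
    """Gaussian tuning curve of a low-level sensory unit."""
    return np.exp(-0.5 * ((stimulus - preferred) / width) ** 2)

def attended_response(stimulus, preferred, attended_feature, gain=1.5, width=0.2):
    """Top-down signal multiplies the gain of units tuned to the attended
    feature, leaving other units nearly unchanged -- one standard motif for
    executive control acting on sensory cortex."""
    top_down = 1.0 + (gain - 1.0) * np.exp(
        -0.5 * ((preferred - attended_feature) / width) ** 2)
    return top_down * sensory_response(stimulus, preferred, width)

preferred_features = np.linspace(0.0, 1.0, 11)   # hypothetical feature axis
stimulus = 0.5
attended = 0.5

baseline = sensory_response(stimulus, preferred_features)
with_attention = attended_response(stimulus, preferred_features, attended)
print("baseline peak:", baseline.max(), "attended peak:", with_attention.max())
```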


Cross-Modal Perception


Our ability to grasp and manipulate objects with our hands relies on integrating signals from the visual, tactile, and proprioceptive modalities. Our working hypothesis is that the decision to grasp an object triggers a set of internal predictive representations, mediated by visual signals, that are integral to forming action plans (e.g., estimating the object's distance, shape, size, and temperature). Contact with the object gives rise to a holistic representation, derived by combining signals from visual, tactile, and proprioceptive neural ensembles that encode relevant features of the object (e.g., size, shape, and texture). This holistic object representation is dynamically integrated with kinesthesia and motor efference to facilitate real-time adjustments of the grasp. Studies show many commonalities between touch and vision in coding distinct aspects of objects (e.g., motion and shape). Our lab aims to study how these common cross-modal representations are integrated to form multisensory global representations of objects.
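One widely used way to formalize this kind of multisensory combination is reliability-weighted (maximum-likelihood) cue fusion, in which each modality's estimate is weighted by its inverse variance; the sketch below applies it to hypothetical visual and haptic size estimates. This is a textbook model offered for illustration, not a statement of our findings.

```python
import numpy as np

def combine_estimates(mu_vision, var_vision, mu_haptic, var_haptic):
    """Maximum-likelihood fusion of two independent Gaussian cues:
    each cue is weighted by its reliability (inverse variance)."""
    w_v = (1.0 / var_vision) / (1.0 / var_vision + 1.0 / var_haptic)
    w_h = 1.0 - w_v
    mu = w_v * mu_vision + w_h * mu_haptic
    var = 1.0 / (1.0 / var_vision + 1.0 / var_haptic)
    return mu, var

# Hypothetical size estimates (cm) for the same grasped object: vision is the
# more reliable cue here, so it dominates the combined estimate, and the
# combined variance is lower than either cue alone.
mu, var = combine_estimates(mu_vision=5.2, var_vision=0.04,
                            mu_haptic=4.6, var_haptic=0.16)
print(f"combined estimate: {mu:.2f} cm, variance: {var:.3f}")
```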