COMPUTATIONAL MODELING OF ACTION RECOGNITION IN THE PRIMATE BRAIN

Overview

This project, titled “Neurophysiology and computational modeling of action-observation”, is bilaterally supported by the Turkish (TUBITAK) and Greek governments. It is carried out by two PIs: Erhan Oztop (Turkish side, computational modeling) and Vassilis Raos (Greek side, neurophysiology).

Social cognition involves the processes that allow individuals to understand the actions, intentions and emotions of others. The last two decades witnessed the discovery and investigation of a population of neurons, called mirror neurons (MNs), that may hold the key to understanding the neural bases of social cognition. These neurons were found in area F5 of the ventral premotor cortex and fire both when a monkey grasps 3-D objects and when it observes humans executing the same movements (Di Pellegrino et al. 1992, Gallese et al. 1996, Rizzolatti et al. 1996). They were named “mirror neurons” because their activity in the brain of the motionless observing monkey seemed to mirror that of the motor neurons active in the individual actually executing the movement.

This project stresses that MNs can offer insight into the social brain only when careful experimentation and computational scrutiny are employed to reach a solid scientific understanding. Further neurophysiology experiments are needed to evaluate the selectivity of mirror neurons in non-human primates quantitatively, by assessing the relationship between neuronal discharge and the precise kinematics of the movements performed and observed by the monkey.

We use a decoding framework to investigate the responses of mirror neurons recorded from area F5. The object being grasped, whether by the demonstrator or by the monkey herself, can be readily decoded from only a few neurons. What is more challenging is to answer whether these neurons represent, as is often claimed, a motor code for the action. If the answer is yes, then the kinematic parameters of an ensuing action (performed either by the monkey herself or by the demonstrator) must be predictable from the mirror neuron activity in a temporally locked manner. To discover the answer, we are currently using computer vision techniques to extract hand position information from video recordings of the monkey performing the grasping actions. Our initial analyses indicate that some neurons may be engaged in object-centered encoding of actions.
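As a minimal sketch of the kind of decoding analysis described above, the snippet below trains a nearest-centroid decoder to recover the identity of the grasped object from ensemble firing rates. All data here are synthetic and every name (neuron counts, tuning values, trial split) is a hypothetical illustration, not the project's actual pipeline or recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: trial-averaged firing rates of a small F5 ensemble
# while one of three objects is grasped. All numbers are synthetic.
n_neurons, n_trials, n_objects = 5, 60, 3
object_labels = rng.integers(0, n_objects, n_trials)

# Assume each object evokes a distinct mean rate pattern across neurons.
tuning = rng.normal(10.0, 3.0, size=(n_objects, n_neurons))
rates = tuning[object_labels] + rng.normal(0.0, 1.0, size=(n_trials, n_neurons))

def nearest_centroid_decode(train_x, train_y, test_x):
    """Assign each test trial to the object whose mean training-set
    rate pattern (centroid) is closest in Euclidean distance."""
    centroids = np.stack([train_x[train_y == c].mean(axis=0)
                          for c in np.unique(train_y)])
    dists = np.linalg.norm(test_x[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Simple held-out split: fit on the first 40 trials, test on the last 20.
pred = nearest_centroid_decode(rates[:40], object_labels[:40], rates[40:])
accuracy = (pred == object_labels[40:]).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

With well-separated (synthetic) tuning, even this simple decoder recovers the object well above the 1/3 chance level, illustrating the claim that object identity is readily decodable from a few neurons; testing a motor code would additionally require regressing continuous, time-locked kinematics on the same activity.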

Related publications

Oztop E, Kawato M, Arbib M (2013) Mirror Neurons: Functions, Mechanisms and Models. Neuroscience Letters 540: 43-55

Oztop E, Kawato M, Arbib M (2006) Mirror neurons and imitation: A computationally guided review. Neural Networks 19: 254-271

Oztop E, Wolpert D, Kawato M (2005) Mental state inference using visual control parameters. Cognitive Brain Research 22: 129-151

Oztop E, Bradley NS, Arbib MA (2004) Infant grasp learning: a computational model. Exp Brain Res 158: 480-503

Oztop E, Arbib MA (2002) Schema design and implementation of the grasp-related mirror neuron system. Biological Cybernetics 87 (2): 116-140