A WYSIWYD integration meeting took place in Barcelona during the BCBT summer school, 3-6 September 2014.
During this period the consortium discussed and worked together on the integration of the different work packages.
A practical outcome of this work was a demonstration of the iCub robot autonomously learning the auditory-motor relations it maintains with a musical keyboard.
iCub - Extraction of Sensory-Motor Contingencies
The iCub is a humanoid robot that possesses a large set of sensors and effectors, designed as a research platform for cognitive development. In this context, the interaction between these systems generates rich sensory-motor data from which contingencies can be extracted.
We developed a library for multimodal integration, inspired by the convergence-divergence framework of Damasio and implemented as a multimodal self-organizing neural network (Multi-Modal Convergence Map, MMCM). We will demonstrate and test the MMCM algorithm by extracting regularities from the sensory-motor stream in different setups (visuo-motor, visuo-auditory, motor-haptic, etc.).
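The convergence idea behind such a map can be illustrated with a toy sketch: a one-dimensional self-organizing map is trained on joint motor-auditory vectors, and after training a motor cue alone can retrieve the associated auditory feature (the divergence step). This is a minimal illustration only, not the MMCM implementation itself; the data (normalized key indices paired with a hypothetical pitch feature) and all parameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the keyboard setup: 8 keys, each pairing a motor
# command (normalized key index) with an auditory feature (pitch that
# rises linearly with the key index). Both are hypothetical values.
keys = np.linspace(0.0, 1.0, 8)
motor = keys[:, None]                  # shape (8, 1)
audio = (0.2 + 0.8 * keys)[:, None]    # shape (8, 1)
data = np.hstack([motor, audio])       # joint multimodal vectors

# A 1-D self-organizing map acting as a minimal convergence map.
n_units = 20
w = rng.random((n_units, 2))           # each unit stores (motor, audio)
pos = np.arange(n_units)

def train(w, data, epochs=200, lr0=0.5, sigma0=4.0):
    """Standard SOM training with decaying learning rate and neighborhood."""
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)
        for x in rng.permutation(data):
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))       # convergence
            h = np.exp(-((pos - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

def recall_audio(w, motor_cmd):
    """Divergence step: cue with the motor modality only and read out
    the auditory component stored by the best-matching unit."""
    bmu = np.argmin((w[:, 0] - motor_cmd) ** 2)
    return w[bmu, 1]

w = train(w, data)
pred = recall_audio(w, keys[3])
print(pred, audio[3, 0])  # predicted vs. actual pitch for key 3
```

After training, cueing the map with only the motor component of a key press returns an auditory estimate close to the pitch observed during learning, mirroring how the robot could anticipate the sound of a key before pressing it.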
Our initial plan is to reproduce the rubber hand illusion experiment on the robot, and then move on to other modalities and scenarios such as self-tickling cancellation or vocabulary grounding.
Meyer, K., & Damasio, A. (2009). Convergence and divergence in a neural architecture for recognition and memory. Trends in Neurosciences, 32(7), 376–382. doi:10.1016/j.tins.2009.04.002
Lallee, S., & Dominey, P. F. (2013). Multi-modal convergence maps: from body schema and self-representation to mental imagery. Adaptive Behavior, 21(4), 274–285. doi:10.1177/1059712313488423