Two humanoid robots learn to predict each other's behavior by inferring each other's intentions. This project has only recently started.
A humanoid robot learns to acquire a set of primitive behaviors and their combinations. The robot experiment uses the MTRNN (multiple-timescale recurrent neural network) model, which is characterized by its combination of fast- and slow-dynamics parts.
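The core idea behind the MTRNN's fast/slow composition can be illustrated with a minimal leaky-integrator sketch. This is an assumption-laden illustration, not the lab's actual implementation: the function name, sizes, and random weights are all hypothetical; only the use of distinct time constants for the two unit groups reflects the model described above.

```python
import numpy as np

def mtrnn_step(x_fast, x_slow, W_ff, W_fs, W_sf, W_ss,
               tau_fast=2.0, tau_slow=50.0):
    """One leaky-integrator update (hypothetical sketch).
    Fast units (small tau) track rapid sensorimotor detail;
    slow units (large tau) carry slowly changing context."""
    u_fast = W_ff @ np.tanh(x_fast) + W_fs @ np.tanh(x_slow)
    u_slow = W_sf @ np.tanh(x_fast) + W_ss @ np.tanh(x_slow)
    x_fast = (1 - 1 / tau_fast) * x_fast + (1 / tau_fast) * u_fast
    x_slow = (1 - 1 / tau_slow) * x_slow + (1 / tau_slow) * u_slow
    return x_fast, x_slow

# Demo with small random weights (illustrative only).
rng = np.random.default_rng(0)
nf, ns = 8, 4
W_ff = 0.1 * rng.standard_normal((nf, nf))
W_fs = 0.1 * rng.standard_normal((nf, ns))
W_sf = 0.1 * rng.standard_normal((ns, nf))
W_ss = 0.1 * rng.standard_normal((ns, ns))
x_fast = rng.standard_normal(nf)
x_slow = rng.standard_normal(ns)
for _ in range(10):
    x_fast, x_slow = mtrnn_step(x_fast, x_slow, W_ff, W_fs, W_sf, W_ss)
# Because tau_slow >> tau_fast, the slow state drifts far less per step.
```

The large time-constant ratio is what lets the slow part represent abstract behavior sequencing while the fast part handles moment-to-moment motor detail.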
An iCub robot is controlled by an MTRNN. This work was done by Martin Peniak at the University of Plymouth.
A humanoid robot spontaneously generates sequences of learned primitives. Chaos self-organized in the higher level of the artificial brain (the MTRNN model) generates pseudo-stochastic sequences of moving an object among the left, middle, and right positions on a table.
Mental rehearsal and planning based on partially learned behavioral experiences. Visual stream images are generated both for actually experienced behaviors and for hallucinated ones.
The pathology of schizophrenia (delusion of control) is reconstructed in a humanoid robot. The delusion of control manifests when top-down and bottom-up interactions in the MTRNN malfunction.
A mobile robot with a hand learns to associate primitive sentences with corresponding behaviors, with a certain level of generalization. In the video, the robot recognizes the sentence "hit red" and generates the corresponding behavior. The robot was implemented with the RNNPB (recurrent neural network with parametric bias) model.
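The parametric-bias mechanism behind sentence–behavior binding can be sketched as follows. This is a hypothetical toy, not the actual RNNPB implementation: the function, weight matrices, and PB values are invented for illustration. The key point it shows is that a single fixed network generates different trajectories depending on which small PB vector is clamped to its input, so a sentence can select a behavior by selecting a PB value.

```python
import numpy as np

def rnnpb_generate(pb, W_in, W_rec, W_out, steps=20):
    """Generate a motor trajectory from a fixed parametric-bias (PB)
    vector (hypothetical sketch).  The PB vector is held constant and
    concatenated to the network input at every time step."""
    h = np.zeros(W_rec.shape[0])
    x = np.zeros(W_in.shape[1] - len(pb))   # motor state starts at rest
    traj = []
    for _ in range(steps):
        h = np.tanh(W_rec @ h + W_in @ np.concatenate([x, pb]))
        x = np.tanh(W_out @ h)              # next motor output, fed back in
        traj.append(x.copy())
    return np.array(traj)

# Illustrative random weights; in RNNPB these would be learned.
rng = np.random.default_rng(1)
n_h, n_x, n_pb = 16, 3, 2
W_in = 0.5 * rng.standard_normal((n_h, n_x + n_pb))
W_rec = 0.5 * rng.standard_normal((n_h, n_h))
W_out = 0.5 * rng.standard_normal((n_x, n_h))
traj_a = rnnpb_generate(np.array([1.0, -1.0]), W_in, W_rec, W_out)
traj_b = rnnpb_generate(np.array([-1.0, 1.0]), W_in, W_rec, W_out)
# Different PB vectors yield different trajectories from the same weights.
```

Recognition runs in the opposite direction: the PB vector is optimized so the network's prediction matches an observed behavior, which is what lets the model map between sentences and actions.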