Evidence from neuroscience and psychology indicates that both perception and action can be internally simulated by activating sensory and motor areas of the brain, without external sensory input and without any resulting overt behavior. This hypothesis can be highly useful in real robot applications: a robot, for instance, can compensate for corrupted sensory inputs by replacing them with its internal simulation. The accuracy of such simulation depends strongly on the agent's experience; the more the agent knows about its environment, the stronger the internal representation it can build of it. Although many works addressing this hypothesis have been presented, with various levels of success, none of them has so far used the robot's vision as a sensory input at the sensorimotor abstraction level, where data are extracted from the environment. In this study, vision-based sensorimotor abstraction is realized through memory-based learning on a real mobile robot, "Hemisson", to investigate the possibility of explaining its inner world through internal simulation of perception and action at the abstract level. Analysis of the experiments shows that, by interacting with the environment, our robot with vision sensory input developed a simple association or anticipation mechanism that enables it, based on its history and the present situation, to guide its behavior in the absence of any external interaction.