Behavior-based robotics treats perception as a holistic process, strongly connected to the behavioral needs of the robot. We present a bio-inspired sensing-perception-action framework, applied to a roving robot in a random foraging task. Perception is here considered a complex, emergent phenomenon in which the large amount of information coming from the sensors is condensed into an abstract and concise representation of the environment, suitable for selecting an appropriate action or sequence of actions. In this work, a model of perceptual representation is formalized by means of reaction-diffusion cellular nonlinear networks (RD-CNNs) exhibiting Turing patterns. These patterns serve as attractor states for particular sets of environmental conditions, so that a suitable action can be associated with each pattern via reinforcement learning. Learning is also introduced at the afferent stage, to shape the incoming environmental information according to the particular emerging pattern. The basins of attraction of the Turing patterns are thus dynamically tuned by unsupervised learning, so as to form an internal, abstract and plastic representation of the environment as recorded by the sensors.
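As a minimal illustration of the pattern-formation mechanism the abstract relies on, the following sketch simulates a generic two-species reaction-diffusion system (Gray-Scott kinetics, not the paper's specific RD-CNN templates) on a discrete grid with local Laplacian coupling, the same ingredient that lets an RD-CNN destabilize a homogeneous state into a spatial Turing-like pattern. All parameter values here are standard textbook choices, assumed for illustration only.

```python
import numpy as np

def laplacian(Z):
    # 5-point discrete Laplacian with periodic boundaries:
    # each cell is coupled only to its nearest neighbours, as in a CNN grid.
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4.0 * Z)

def run_reaction_diffusion(n=64, steps=2000, Du=0.16, Dv=0.08,
                           F=0.055, k=0.062, seed=0):
    """Explicit-Euler Gray-Scott simulation; returns final (U, V) fields."""
    rng = np.random.default_rng(seed)
    U = np.ones((n, n))
    V = np.zeros((n, n))
    # Seed a perturbed square so the uniform steady state can destabilize
    # into a spatial pattern (the Turing mechanism).
    c = n // 2
    U[c - 5:c + 5, c - 5:c + 5] = 0.50
    V[c - 5:c + 5, c - 5:c + 5] = 0.25
    U += 0.02 * rng.random((n, n))
    V += 0.02 * rng.random((n, n))
    for _ in range(steps):
        uvv = U * V * V
        U += Du * laplacian(U) - uvv + F * (1.0 - U)
        V += Dv * laplacian(V) + uvv - (F + k) * V
    return U, V

U, V = run_reaction_diffusion()
# After enough steps, V develops spatial structure (spots) rather than
# relaxing back to the homogeneous state.
print(round(float(V.std()), 4))
```

In the paper's setting, each such emergent pattern would act as an attractor labeling a class of sensory contexts; here the sketch only shows that local diffusion plus nonlinear reaction suffices for a stable spatial pattern to emerge from a perturbed uniform state.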
|Title:||Turing Patterns In RD-CNNs For The Emergence Of Perceptual States In Roving Robots|
|Publication date:||2007|
|Appears in collections:||1.1 Journal article|