In this example video, Ernest 8.2 found a strategy that we named the tangential strategy.
The tangential strategy consists of approaching the blue square along a straight line rather than a diagonal (the diagonal strategy in the previous example). The trick with the tangential strategy is that Ernest cannot know when to turn toward the blue square until he has passed it. The tangential strategy thus consists of moving in a straight line until the blue square disappears from the visual field, then stepping back once, and then turning toward the blue square.
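To make this description concrete, here is a minimal Python sketch of what the tangential strategy looks like from an outside observer's point of view. The grid coordinates, the heading vector, and the notion of "visual field" used here are assumptions made for this illustration only; Ernest is not programmed with this procedure, he converges on the equivalent behavior through his own learning.

```python
# Illustrative sketch only: the grid, the "visual field" test, and these helper
# names are assumptions made for this example. Ernest is not given this
# procedure; he discovers equivalent behavior on his own.

def square_in_visual_field(agent_pos, heading, square_pos):
    """The square is 'visible' while it is not behind the agent
    (i.e., still ahead of or exactly beside the agent along its heading)."""
    dx, dy = square_pos[0] - agent_pos[0], square_pos[1] - agent_pos[1]
    # Dot product with the heading: negative means the square is behind.
    return dx * heading[0] + dy * heading[1] >= 0

def tangential_strategy(agent_pos, heading, square_pos):
    """Walk straight until the square drops out of the visual field,
    step back once, then turn toward the square."""
    path = [agent_pos]
    # 1. Move straight ahead while the square is still visible.
    while square_in_visual_field(agent_pos, heading, square_pos):
        agent_pos = (agent_pos[0] + heading[0], agent_pos[1] + heading[1])
        path.append(agent_pos)
    # 2. The square has just disappeared: take one step backward.
    agent_pos = (agent_pos[0] - heading[0], agent_pos[1] - heading[1])
    path.append(agent_pos)
    # 3. Turn toward the square (the new heading points at it).
    heading = (square_pos[0] - agent_pos[0], square_pos[1] - agent_pos[1])
    return path, heading

if __name__ == "__main__":
    # Agent walks east along y = 0; the blue square sits off the line at (3, 2).
    path, new_heading = tangential_strategy((0, 0), (1, 0), (3, 2))
    print(path)         # [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (3, 0)]
    print(new_heading)  # (0, 2): now facing the blue square
```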
The emergence of a specific strategy occurs during Ernest's youth, while he is babbling relatively randomly, in parallel with the emergence of goals (see the details in the next post). When Ernest has organized behavioral patterns that prove both satisfying and robust, he adopts them and sticks to them as long as they work.
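As a rough illustration of this adoption principle (and only an illustration: the pattern representation, scores, and threshold below are assumptions, not Ernest's actual mechanism, which the next post details), one could sketch it as follows: babble randomly, score patterns by how satisfying they prove, adopt one once it has proved robust, and drop it if it stops working.

```python
import random

# Toy sketch of the adoption principle described above; data structures and
# thresholds are assumptions made for this example, not Ernest's mechanism.

class PatternAdopter:
    def __init__(self, adoption_threshold=3):
        self.scores = {}                      # pattern -> cumulative satisfaction
        self.adopted = None                   # pattern currently stuck to
        self.adoption_threshold = adoption_threshold

    def choose(self, known_patterns):
        """Stick to an adopted pattern while it works; otherwise babble."""
        if self.adopted is not None:
            return self.adopted
        return random.choice(known_patterns)  # random babbling

    def feedback(self, pattern, satisfaction):
        """Reinforce patterns that prove satisfying; drop ones that stop working."""
        self.scores[pattern] = self.scores.get(pattern, 0) + satisfaction
        if self.scores[pattern] >= self.adoption_threshold:
            self.adopted = pattern            # proved satisfying and robust: adopt it
        elif pattern == self.adopted and satisfaction < 0:
            self.adopted = None               # it no longer works: back to babbling
```

Nothing in this sketch names a goal; the agent simply keeps whichever pattern of interactions keeps paying off.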
These results demonstrate that:
a) Unlike traditional cognitive models, Ernest does not encode strategies or task procedures defined by the programmer.
b) Ernest instances are capable of "individuating" themselves through their experience, i.e., acquiring their own cognitive individuality that was not encoded in their "genes". This accounts for the role of individual experience in cognitive development.
c) Ernest's goals emerge from his low-level drives. To the observer, eating blue squares appears to become the goal of Ernest's life, although no representation of such a goal was encoded into Ernest. Indeed, Ernest was given a high incentive to step on blue squares, but this incentive was no different in nature from his other primitive drives (as sketched after this list). Ernest's goals were not pre-encoded as they are, for example, in the goal buffer of the ACT-R architecture.
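To make point (c) concrete, here is a hypothetical sketch of what such a drive table could look like. The interaction names and values are invented for this illustration and are not the actual settings of Ernest 8.2; the point is only that stepping on a blue square is valued in exactly the same way as any other primitive interaction.

```python
# Illustrative only: these interaction names and values are assumptions, not
# the actual settings of Ernest 8.2. "step_on_blue_square" is just one more
# entry in the same table as every other primitive drive; no goal structure
# refers to it.
PRIMITIVE_SATISFACTION = {
    "step_forward":        1,   # mildly satisfying
    "bump_into_wall":     -5,   # unpleasant
    "turn_left":           0,
    "turn_right":          0,
    "step_on_blue_square": 10,  # high incentive, but same nature as the others
}

def satisfaction(interaction):
    """Ernest only ever receives a scalar value; goals are never represented."""
    return PRIMITIVE_SATISFACTION[interaction]
```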
Friday, January 7, 2011