Olivier Georgeon's research blog—also known as the story of little Ernest, the developmental agent.

Keywords: situated cognition, constructivist learning, intrinsic motivation, bottom-up self-programming, individuation, theory of enaction, developmental learning, artificial sense-making, biologically inspired cognitive architectures, agnostic agents (without ontological assumptions about the environment).

Monday, December 20, 2010

Ernest 8.1 can eat

Ernest can now perform different actions with different body parts simultaneously. Namely, he can eat this delicious blue substance with his virtual mouth while enacting his usual wandering behavior with his antennae and virtual legs.

To obtain this result, Ernest was provided with a modular internal structure. Ernest now has an iconic module that implements his first skills for exploiting his distal sensory system, and a "homeostatic" module that controls his homeostatic regulation behavior. As yet, Ernest 8.1 has only a single homeostatic behavior: eating the blue substance. Each module has its own enaction channel, which makes simultaneity across modules possible.
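As a minimal sketch of what these modules and their separate enaction channels could look like in Java (all class and method names here are illustrative assumptions, not the actual e-ernest code):

// Illustrative sketch only; not the actual e-ernest implementation.
interface Module {
    // Each module selects its own act through its own enaction channel.
    String proposeEnaction(String situation);
}

class IconicModule implements Module {
    public String proposeEnaction(String situation) {
        return "Move forward"; // exploits the distal sensory system while wandering
    }
}

class HomeostaticModule implements Module {
    public String proposeEnaction(String situation) {
        // The single homeostatic behavior so far: eat on a blue square.
        return "Blue_icon".equals(situation) ? "Eat" : "Wait";
    }
}

class ModularErnest {
    private final Module iconic = new IconicModule();
    private final Module homeostatic = new HomeostaticModule();

    // One cycle: both channels fire, so Ernest can eat while wandering.
    void cycle(String situation) {
        System.out.println(iconic.proposeEnaction(situation)
                + " + " + homeostatic.proposeEnaction(situation));
    }
}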

The homeostatic primitive schema was added to the other primitive schemas listed previously:
- [Eat, succeed, 100] Ernest is crazy about eating.
and one iconic pattern was predefined:
- [0,0] Blue_icon (each pixel represents the distance to the blue square in the corresponding sensory field).

Ernest now also supports inborn composite schemas. In this experiment, these are:
- [Touch ahead empty, Move forward, 3]
- [Turn left toward empty, Move forward, 3]
- [Turn right toward empty, Move forward, 3]
- [Blue_icon, Eat, 3]

The first three inborn composite schemas are provided merely to accelerate the initial learning. The fourth is provided to make Ernest eat when he senses that he is on a blue square. All four composite schemas were preset with an initial weight of 3.
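A sketch of how these presets might be encoded (hypothetical class and field names; the real e-ernest data structures may differ): each inborn composite schema pairs a context act with an intended act and starts with weight 3.

import java.util.Arrays;
import java.util.List;

// Hypothetical encoding of the inborn composite schemas; not the e-ernest API.
class CompositeSchema {
    final String context;   // the act that sets the context
    final String intention; // the act then intended
    int weight;             // initial weight, reinforced through enaction

    CompositeSchema(String context, String intention, int weight) {
        this.context = context;
        this.intention = intention;
        this.weight = weight;
    }
}

class InbornSchemas {
    static List<CompositeSchema> preset() {
        return Arrays.asList(
            new CompositeSchema("Touch ahead empty",       "Move forward", 3),
            new CompositeSchema("Turn left toward empty",  "Move forward", 3),
            new CompositeSchema("Turn right toward empty", "Move forward", 3),
            new CompositeSchema("Blue_icon",               "Eat",          3));
    }
}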

Because the [Blue_icon, Eat] composite schema links the sensory system to the homeostatic system, Ernest 8.1's distal sensory system somewhat resembles smell (in addition to vision): sensing a null distance to food triggers eating.

These developments were inspired by Joanna Bryson's paper Structuring Intelligence: The Role of Hierarchy, Modularity and Learning in Generating Intelligent Behaviour. Joanna's discussion of the agent's modular architecture inspired our implementation of Ernest's modules. We also follow her argument that it is sometimes acceptable to pre-encode the desired behavior into the agent.

The question now is how the satisfaction gained through eating should affect Ernest so that he exploits his distal sensors in his search for food.

Friday, December 10, 2010

Ernest 8.0 has rudimentary distal perception

Ernest 8.0's distal sensory system consists of two pixels that react to blue squares. Each pixel reflects 90° of Ernest's surrounding environment. Ernest's left-side pixel reflects the front-left 90° quadrant, and Ernest's right-side pixel reflects the front-right 90° quadrant.

We can think of such a sensory system as an initial visual system. The pixel's value reflects the amount of blue in the corresponding visual field. If there is only one blue square, the pixel's value reflects the blue square's distance from Ernest. Green squares are opaque, meaning that blue squares behind green squares are not detected.
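A minimal sketch of how such a two-pixel sensor could be computed, assuming a simple distance-to-intensity mapping (the geometry helpers and the MAX_RANGE constant are assumptions for illustration, not the actual implementation; occlusion by green squares is omitted):

// Illustrative sketch of the two-pixel distal sensor; not the actual e-ernest code.
class DistalSensor {
    static final double MAX_RANGE = 10.0; // assumed sensing range, in squares

    /**
     * Returns {leftPixel, rightPixel} in [0,1]: 0 = no blue detected,
     * 1 = blue square adjacent. Each pixel covers a 90-degree front quadrant.
     */
    static double[] sense(double[][] blueSquares, double ex, double ey, double heading) {
        double left = 0, right = 0;
        for (double[] b : blueSquares) {
            double dx = b[0] - ex, dy = b[1] - ey;
            double dist = Math.hypot(dx, dy);
            // Bearing relative to Ernest's heading, normalized to [-180, 180).
            double bearing = Math.toDegrees(Math.atan2(dy, dx)) - heading;
            bearing = ((bearing + 540) % 360) - 180;
            double intensity = Math.max(0, 1 - dist / MAX_RANGE);
            if (bearing > 0 && bearing <= 90)       left  = Math.max(left, intensity);
            else if (bearing <= 0 && bearing > -90) right = Math.max(right, intensity);
            // Opacity of green squares (occlusion) is not modeled in this sketch.
        }
        return new double[] { left, right };
    }
}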

At the beginning of this example video, Ernest is trained the same way as in previous experiments. Then the environment is opened and a blue square is added. We can see the two pixels on Ernest's head that reflect the detected blue: the closer the blue square, the more vivid the corresponding blue pixel.

Ernest 8.0 does not yet use this sensory system to inform his behavior. Making bottom-up intrinsic motivation and distal sense work together will be one of our next challenges.

Monday, December 6, 2010

Java Ernest 7.2 in Vacuum

We put Ernest 7.2 back into the Vacuum environment to provide an example implementation of the Ernest Java class.

As we can see in this example video, the Java version is much faster than the Soar version. In fact, we had to slow the Java version down with timers to be able to see something.

This experiment uses the following settings (a wiring sketch follows the list):
- [move forward, succeed, 5] Ernest enjoys moving forward.
- [move forward, fail, -8] Ernest hates bumping walls.
- [turn left or right, succeed, 0] Ernest is indifferent to turning toward an empty square.
- [turn left or right, fail, -5] Ernest dislikes turning toward a wall.
- [touch, succeed, -1] Ernest slightly dislikes touching walls.
- [touch, fail, 0] Ernest is indifferent to touching empty squares.
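As a sketch of how these satisfaction values and the slow-down timer might be wired around the Ernest Java class (the map keys and the shape of the main loop are assumptions, not the published e-ernest API):

import java.util.HashMap;
import java.util.Map;

// Hypothetical wiring sketch; the actual e-ernest interface may differ.
public class VacuumExperiment {
    public static void main(String[] args) throws InterruptedException {
        // Satisfaction values for each (schema, enaction status) pair listed above.
        Map<String, Integer> satisfaction = new HashMap<>();
        satisfaction.put("move forward/succeed",  5);
        satisfaction.put("move forward/fail",    -8);
        satisfaction.put("turn/succeed",          0);
        satisfaction.put("turn/fail",            -5);
        satisfaction.put("touch/succeed",        -1);
        satisfaction.put("touch/fail",            0);

        for (int step = 0; step < 1000; step++) {
            // ... Ernest selects and enacts a schema, then scores the outcome ...
            Thread.sleep(200); // slow the Java version down enough to watch
        }
    }
}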

Saturday, October 30, 2010

Java Ernest available

A Java version of Ernest is now available at http://code.google.com/p/e-ernest/. Many thanks to Mark Cohen for his contribution.

The full source code of this version is available read-only. People willing to contribute to the project are very welcome; please just contact me for full access.

Tuesday, September 21, 2010

The legend of Ernest 7.2

We are now working on moving Ernest to a 3D-rendering world, using Blender.

This preliminary video is intended as a legend to help interpret Ernest 7.2's video shown in this older post.

The video shows Ernest's six primitive sensorimotor schemas and illustrates the feedback that Ernest receives when enacting them in different situations (sketched in code after the list):

- Touch ahead, to the right, and to the left. Touched squares flash yellow.
- Turn to the right or to the left.
- Move forward. When Ernest bumps into a wall, the wall flashes red.
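In code form, the legend amounts to an enumeration like this (a hypothetical illustration, not the actual class):

// Hypothetical enumeration of Ernest 7.2's six primitive schemas.
enum PrimitiveSchema {
    TOUCH_AHEAD, TOUCH_RIGHT, TOUCH_LEFT, // touched squares flash yellow
    TURN_RIGHT, TURN_LEFT,
    MOVE_FORWARD                          // a bumped wall flashes red
}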

Again, Ernest 7.2 has no visual perception of his environment. His only internal representation of his situation is the one he constructs by enacting sensorimotor schemas.

Friday, April 30, 2010

An Algorithm for Self-Motivated Hierarchical Sequence Learning

This six-page paper offers an overview of the Ernest 7.2 algorithm and its implementation:

Olivier L. Georgeon, Jonathan H. Morgan, & Frank E. Ritter (2010). An Algorithm for Self-Motivated Hierarchical Sequence Learning. Proceedings of the International Conference on Cognitive Modeling (ICCM 2010), Philadelphia, PA, paper 164, pp. 73–78.