When it was first programmed, in BASIC and then in PL/1, Adaptron used a linear array to store the long-term memory of its experiences. However, as it has been modified to accommodate more complex requirements, its long-term memory has evolved into a binary-node artificial neural network (ANN). This is a very restricted form of semantic network, described in more detail in the paper about Perceptra, the pattern recognition part of Adaptron.
In the network, each node represents a class or category, and all the connections are “has-a” links. Its artificial neural nodes are binons; the word “binon” is a contraction of “binary neuron”. Each binon represents a specialized class made up of two more general classes. These more general classes are shared and can be reused by any number of higher-level, more specific classes. In this respect a binon is very similar to a perceptual schema in schema theory or to a category prototype in prototype theory. Binons have two inputs and one output; the two inputs may occur in sequence or in parallel.
- Binons are combined into multi-level binary trees to form a compositional hierarchy.
- The pattern recognition algorithm is syntactic (or structural), not statistical (or decision-theoretic).
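The binon composition described above can be sketched as a minimal Python structure. The class and method names here (`Binon`, `level`) are illustrative placeholders, not taken from the Adaptron source:

```python
class Binon:
    """A binary neuron: a specialized class built from two more
    general classes. The two sub-binons may be related in sequence
    (one after the other) or in parallel (at the same time).
    Leaf binons wrap a single symbolic stimulus."""

    def __init__(self, left=None, right=None, mode=None, symbol=None):
        self.left = left      # first sub-binon
        self.right = right    # second sub-binon
        self.mode = mode      # "sequence" or "parallel" for non-leaves
        self.symbol = symbol  # payload for a leaf binon

    def is_leaf(self):
        return self.symbol is not None

    def level(self):
        """Depth of the compositional hierarchy below this binon."""
        if self.is_leaf():
            return 0
        return 1 + max(self.left.level(), self.right.level())


# Lower-level binons are shared: the same sub-binon can be reused
# by any number of higher-level, more specific binons.
a = Binon(symbol="A")
b = Binon(symbol="B")
ab = Binon(a, b, mode="sequence")      # "A then B"
abab = Binon(ab, ab, mode="sequence")  # reuses the same sub-binon twice
```

The multi-level binary tree rooted at `abab` is a tiny example of the compositional hierarchy: specific classes at the top, shared general classes at the bottom.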
Adaptron uses binons to recognize objects and sequences of events. Each binon represents an S-habit or a P-habit. Binons that recognize magnitude stimuli convert them into symbolic ones. Adaptron currently recognizes complex objects from a magnitude or symbolic sense using a linear array of dependent sensors. It recognizes these objects as having a shape and contrast independent of position, size, and brightness / intensity. The rules used for this process include:
- All the parts of an object change in unison
- Cells that fire together, wire together.
There are additional rules for recognizing objects independently of level of complexity.
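One way to read the "parts change in unison" rule is that an object is recognized by the relations between its parts rather than by their absolute values: if brightness shifts uniformly, every sensor reading changes together, so the pattern of differences between adjacent sensors is preserved. The sketch below is a hypothetical illustration of that idea, not Adaptron's actual algorithm:

```python
def contrast_signature(readings):
    """Reduce a linear array of magnitude readings to a
    brightness-invariant signature: the direction of change
    between adjacent sensors (+1 brighter, -1 darker, 0 same).

    Because all parts of an object change in unison under a
    uniform brightness shift, adding a constant to every reading
    leaves this signature unchanged."""
    sig = []
    for prev, cur in zip(readings, readings[1:]):
        if cur > prev:
            sig.append(+1)
        elif cur < prev:
            sig.append(-1)
        else:
            sig.append(0)
    return sig


dim = [2, 5, 5, 3]
bright = [12, 15, 15, 13]  # the same shape, uniformly brighter
assert contrast_signature(dim) == contrast_signature(bright)
```

The same object presented at two brightness levels yields the identical signature, so a recognizer working on signatures is brightness-invariant by construction.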
For recognizing S-habits Adaptron uses a short term memory (STM). The STM builds up and holds a tree of S-habit binons before attention is paid to it.
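The STM's tree of S-habit binons can be sketched as pairwise combination of successive stimuli. The pairing strategy below (combine neighbours level by level) is an assumption for illustration; plain tuples stand in for binons:

```python
def build_s_habit_tree(stimuli):
    """Combine a sequence of stimuli pairwise, level by level,
    into a binary tree of (first, second) tuples, a stand-in for
    the STM's tree of S-habit binons."""
    level = list(stimuli)
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level) - 1, 2):
            nxt.append((level[i], level[i + 1]))
        if len(level) % 2:          # carry an unpaired item up unchanged
            nxt.append(level[-1])
        level = nxt
    return level[0]


tree = build_s_habit_tree(["A", "B", "C", "D"])
# The four stimuli become two first-level binons, then one root.
```

Attention would then be paid to the completed tree as a whole rather than to the individual stimuli that built it.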
Adaptron also dynamically builds a hierarchical action habit network integrated with the ANN. Action habits are represented by actons. The word “acton” is a contraction of action neuron. Actons are binary in that each one contains two sub-actons which can be performed in sequence, in parallel or repeated. The lowest level actons contain the responses that are sent to the action devices.
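The acton structure mirrors the binon one on the action side. The sketch below is a hypothetical rendering of the description above; the `perform` flattening, and treating "repeat" as performing the first sub-acton twice, are illustrative choices, not Adaptron's actual behaviour:

```python
class Acton:
    """An action neuron: two sub-actons performed in sequence,
    in parallel, or repeated. The lowest-level actons hold a
    primitive response sent to an action device."""

    def __init__(self, first=None, second=None, mode=None, response=None):
        self.first = first
        self.second = second
        self.mode = mode        # "sequence", "parallel", or "repeat"
        self.response = response

    def perform(self, device):
        """Expand this acton into primitive responses appended to
        `device` (a plain list standing in for an output channel)."""
        if self.response is not None:
            device.append(self.response)
        elif self.mode == "repeat":
            self.first.perform(device)   # illustrative: do it twice
            self.first.perform(device)
        else:   # "sequence"; true parallel would dispatch both at once
            self.first.perform(device)
            self.second.perform(device)


step = Acton(response="step")
turn = Acton(response="turn")
walk_and_turn = Acton(step, turn, mode="sequence")
```

Performing `walk_and_turn` emits the two primitive responses in order, just as a higher-level action habit decomposes into its lowest-level responses.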
Adaptron’s long term memory is an ever changing and growing network of binons and actons.
The core of the Adaptron algorithm loops continuously through the following steps:
- Obtain stimuli from the sensors and senses, building up S- and P-habits.
- Execute all subconscious action habits based on perceived stimuli.
- Perform the action habit being practiced.
- Pay attention, either to the expected stimulus of the action habit being practiced or to any distracting / unexpected stimulus.
- Think about the next expected stimulus based on the attended-to one.
- Start doing any desirable action habit either in practice mode or subconsciously.
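The six steps above can be sketched as a skeleton loop body. Every helper name here is a hypothetical placeholder for the corresponding step, not a function from the Adaptron code; a stub agent shows the control flow:

```python
def adaptron_cycle(agent):
    """One pass of the core loop: sense, act subconsciously,
    practice, attend, think, then start a desirable habit."""
    stimuli = agent.sense()                  # build S- and P-habits
    agent.run_subconscious_habits(stimuli)   # subconscious action habits
    agent.perform_practiced_habit()          # the habit being practiced
    focus = agent.pay_attention(stimuli)     # expected or unexpected stimulus
    expectation = agent.think_ahead(focus)   # next expected stimulus
    agent.start_desirable_habit(expectation) # practice mode or subconscious


class StubAgent:
    """Minimal stand-in that just records which step ran."""
    def __init__(self):
        self.log = []
    def sense(self):
        self.log.append("sense"); return ["stimulus"]
    def run_subconscious_habits(self, stimuli):
        self.log.append("subconscious")
    def perform_practiced_habit(self):
        self.log.append("practice")
    def pay_attention(self, stimuli):
        self.log.append("attend"); return stimuli[0]
    def think_ahead(self, focus):
        self.log.append("think"); return "expected"
    def start_desirable_habit(self, expectation):
        self.log.append("start")
```

Calling `adaptron_cycle` on a real agent repeatedly would give the continuous loop described above; the stub only demonstrates the fixed ordering of the six steps.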
Numerous fundamental principles and algorithms have been discovered as part of this research and are used in Adaptron, including those for:
- converting graduated (sub-symbolic) stimuli into symbolic ones
- recognizing complex objects from their parts
- attracting attention
- practicing action habits
- performing action habits subconsciously
The current software uses:
- Event-based time flow. There is no sense of time based on a timer or clock.
- An exploration value system based on novelty and boredom (unfamiliar and familiar stimuli). No research has yet been done on the use of an emotional value system based on pleasant and unpleasant stimuli.