Backpropagation
An algorithm for efficiently calculating the error
gradient of a neural network, which can then be used
as the basis of learning. Backpropagation is equivalent to the
delta rule for perceptrons, but can also calculate
appropriate weight changes for the hidden layer weights of
a multilayer perceptron by generalizing the notion of an error
correction term. In the simplest case, backpropagation is a type of
steepest descent in the search space of the network
weights, and it will usually converge to a local minimum.
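As a rough illustration (not code from the text), a single backpropagation step for a two-layer network with sigmoid units and squared error could be sketched as below; the names W1, W2, the learning rate lr, and the array shapes are all illustrative assumptions.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop_step(x, t, W1, W2, lr=0.1):
        # Forward pass: input -> hidden -> output.
        h = sigmoid(W1 @ x)            # hidden activations
        y = sigmoid(W2 @ h)            # network output
        # Error correction term at the output layer (generalized delta rule).
        delta_out = (y - t) * y * (1.0 - y)
        # Error term propagated back to the hidden layer.
        delta_hid = (W2.T @ delta_out) * h * (1.0 - h)
        # Steepest-descent updates on the weights.
        W2 -= lr * np.outer(delta_out, h)
        W1 -= lr * np.outer(delta_hid, x)
        return 0.5 * np.sum((y - t) ** 2)   # squared error, for monitoring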
Basin of Attraction
A region of state space in which all included states of a
dynamical system ultimately lead into a particular attractor.
Bias
See threshold.
Bifurcation
The splitting of a single mode of a system's behavior into two
new modes. This usually occurs as a function of a
continuously varying control parameter. A cascade of
bifurcations will usually precede the onset of chaos.
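A standard concrete example is the logistic map x(n+1) = r x(n) (1 - x(n)): as the control parameter r increases, a single stable mode splits into a 2-cycle, then a 4-cycle, and so on toward chaos. The following sketch (an illustration of mine, not the text's code) prints the long-run orbit for a few values of r:

    def logistic_attractor(r, x=0.5, transient=1000, keep=8):
        # Iterate past the transient, then collect the values the orbit settles on.
        for _ in range(transient):
            x = r * x * (1.0 - x)
        orbit = set()
        for _ in range(keep):
            x = r * x * (1.0 - x)
            orbit.add(round(x, 3))
        return sorted(orbit)

    for r in (2.8, 3.2, 3.5):   # fixed point, 2-cycle, 4-cycle
        print(r, logistic_attractor(r))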
Binary
Written in a form that uses only 0s and 1s. A string of
bits.
Bit
The smallest unit of information; the answer to a yes/no question; the
outcome of a coin toss; a 0 or a 1.
Boid
An autonomous agent that behaves like a simplified bird but will
display flocking patterns in the presence of other boids.
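The flocking pattern emerges from each boid steering by a few local rules, commonly separation from nearby boids, alignment with their average heading, and cohesion toward their center. A schematic sketch of one boid's velocity update follows; the rule weights and the dictionary layout are made-up assumptions for illustration only.

    def steer(boid, neighbors, sep_w=1.5, align_w=1.0, coh_w=1.0):
        # boid and neighbors are dicts with 'pos' and 'vel' as (x, y) tuples.
        if not neighbors:
            return boid['vel']
        n = len(neighbors)
        cx = sum(b['pos'][0] for b in neighbors) / n   # neighbors' center (cohesion)
        cy = sum(b['pos'][1] for b in neighbors) / n
        ax = sum(b['vel'][0] for b in neighbors) / n   # average heading (alignment)
        ay = sum(b['vel'][1] for b in neighbors) / n
        sx = sum(boid['pos'][0] - b['pos'][0] for b in neighbors)  # push away (separation)
        sy = sum(boid['pos'][1] - b['pos'][1] for b in neighbors)
        vx = boid['vel'][0] + sep_w * sx + align_w * ax + coh_w * (cx - boid['pos'][0])
        vy = boid['vel'][1] + sep_w * sy + align_w * ay + coh_w * (cy - boid['pos'][1])
        return (vx, vy)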
Boolean
Taking only 0/1, true/false, yes/no values.
Bottom-Up
A description that uses the lower-level details to explain
higher-level patterns; related to reductionism.
Brown Noise/Brownian Motion
A form of randomness that results from cumulatively summing
white noise, yielding a random-walk pattern.
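In code this is simply a running sum of independent random increments; a minimal sketch:

    import random

    def brownian(steps=1000):
        # Cumulatively sum white noise to produce a random walk.
        x, path = 0.0, []
        for _ in range(steps):
            x += random.gauss(0.0, 1.0)   # one white-noise increment
            path.append(x)
        return path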
Bucket Brigade Algorithm
A learning method for adjusting the
strengths of the classifiers in a classifier system.
"Winning" classifiers pay a portion of their earnings to other
classifiers that assisted them in being activated, similar to an
economic system.
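A schematic of the payment step, assuming each classifier has a strength and bids a fraction of it when it fires: the winner pays its bid to the classifiers that set up its activation, and receives any external reward. The names and the bid fraction below are illustrative assumptions, not a definitive implementation.

    def bucket_brigade_step(winner, suppliers, strengths, reward=0.0, bid_frac=0.1):
        # strengths maps a classifier name to its current strength.
        bid = bid_frac * strengths[winner]
        strengths[winner] -= bid                   # winner pays its bid...
        if suppliers:
            for s in suppliers:                    # ...to the classifiers that activated it
                strengths[s] += bid / len(suppliers)
        strengths[winner] += reward                # external payoff, if any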
Byte
Eight bits. In programming, often used to store a single text
character.