mlp - train a multilayer perceptron with backprop
mlp [-dfile string] [-steps integer] [-seed integer]
[-freq integer] [-numin integer] [-numhid integer]
[-numout integer] [-lrate double] [-mrate double]
[-winit double] [-linout] [-pdump] [-gdump]
Train a multilayer perceptron with a single hidden layer
of neurons on a set of data contained in a file using the
backpropagation learning algorithm with momentum. Output
units can be linear or sigmoidal, allowing you to model
both discrete and continuous output target values.
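The weight update behind "backpropagation with momentum" can be sketched as follows. This is a toy illustration on a one-dimensional quadratic loss; the names lrate and mrate mirror the command-line options, but none of this is taken from mlp's actual source.

```python
def momentum_step(w, dw_prev, grad, lrate=0.1, mrate=0.9):
    """One weight update: a gradient step plus a momentum term
    that reuses a fraction of the previous update."""
    dw = -lrate * grad + mrate * dw_prev
    return w + dw, dw

# Minimize the toy loss w**2 (gradient 2*w) from w = 0.5.
w, dw = 0.5, 0.0
for _ in range(50):
    w, dw = momentum_step(w, dw, grad=2.0 * w)
print(abs(w) < 0.1)  # the weight spirals in toward the minimum at 0
```

The momentum term smooths the trajectory and can speed convergence, which is why both -lrate and -mrate usually need tuning together.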
-dfile Training data file.
-steps Number of simulated steps.
-seed Random seed for initial state.
-freq Status print frequency.
-numin Number of inputs.
-numhid Number of hidden nodes.
-numout Number of outputs.
-lrate Learning rate.
-mrate Momentum rate.
-winit Weight initialization factor.
-linout Use linear outputs?
-pdump Dump patterns at end of run?
-gdump Dump gnuplot commands at end?
The number of inputs and outputs must agree with the format
of your training data file. The program expects to find
training patterns listed one after another, with each
training pattern consisting of the inputs followed by the
target outputs.
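For instance, with -numin 2 and -numout 1, the XOR problem could be encoded in a file laid out like this (a hypothetical example consistent with the format described above; mlp's exact whitespace rules may differ):

```
0 0 0
0 1 1
1 0 1
1 1 0
```

Such a file (say, xor.dat) would then be used with an invocation along the lines of mlp -dfile xor.dat -numin 2 -numhid 2 -numout 1.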
If the -pdump switch is used, then the patterns will be
printed to stdout. Hence, redirect this to a file if you
want to save them.
You should always use linear outputs if your target values
are continuous.
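The reason is that a sigmoidal output unit squashes its net input into the interval (0, 1), so it can never match a continuous target such as 3.5, while a linear unit simply passes its net input through. A toy illustration, assuming the standard logistic sigmoid (not taken from mlp's source):

```python
import math

def sigmoid(x):
    """Logistic sigmoid: bounded in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def linear(x):
    """Linear output unit: unbounded."""
    return x

target = 3.5
print(sigmoid(100.0) >= target)  # False: sigmoid can never reach 3.5
print(linear(3.5) == target)     # True: a linear unit can hit the target
```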
The error value displayed via stderr is the root mean
squared error taken over the entire data set. Calculating
this error measure is typically far more expensive than a
single training step, so you may wish to use the -freq
option to make it happen less frequently.
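A minimal sketch of that error measure: the squared output errors are accumulated over every pattern in the data set, then averaged and square-rooted. The network function net below is a stand-in, not mlp's actual forward pass.

```python
import math

def rmse(net, patterns):
    """Root mean squared error over a whole data set.

    patterns: list of (inputs, targets) pairs, where targets is a
    tuple with one entry per output unit."""
    total, count = 0.0, 0
    for inputs, targets in patterns:
        outputs = net(inputs)
        for o, t in zip(outputs, targets):
            total += (o - t) ** 2
            count += 1
    return math.sqrt(total / count)

# Toy check: a "network" that always outputs 0 against targets of 1
# must give an RMSE of exactly 1.
data = [((0.0,), (1.0,)), ((1.0,), (1.0,))]
print(rmse(lambda x: (0.0,), data))  # → 1.0
```

Because this loops over every pattern, it costs at least as much as a full pass of training, which is why the man page suggests raising -freq.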
If your network doesn't converge to anything useful, try
increasing the number of hidden nodes. Moreover, you may
need to tweak the learning rate and momentum term. This
is just one of the curses of backprop.
The -gdump switch isn't very useful, as it only works on
the first output neuron.
No sanity checks are performed to make sure that any of
the options make sense.
Copyright (c) 1997, Gary William Flake.
Permission granted for any use according to the standard
GNU ``copyleft'' agreement provided that the author's
comments are neither modified nor removed. No warranty is
given or implied.