a set of C++ library classes
for neural network development



A simple programming example

The main steps necessary to build an application, i.e. to train and test an MLP for a classification task using NEURObjects, are:

  1. Calling the general initialization functions
  2. Preparing the data sets
  3. Building the desired MLP
  4. Building the desired learning algorithm
  5. Training the net
  6. Testing the net
Each of these steps requires only a few lines of code.

Example of a minimal application using NEURObjects

It trains and tests a Multi-Layer Perceptron (MLP) on a 6-class classification problem using the data sets trainfile and testfile, which are synthetic data files generated by the NEURObjects application dodata.
The MLP has one hidden layer and is trained with a simple backpropagation algorithm. This minimal application outputs only the overall testing results.

Here is the source code:

#include "net.h"

int main(int argc, char* argv[])
{
  int      num_train;
  unsigned nclass   = 6;     // number of classes
  unsigned nhidden  = 5;     // number of hidden neurons
  unsigned num_attr = 3;     // dimension of attributes
  double   eta      = 0.02;  // learning rate
  unsigned iter     = 0;     // iteration number
  double   err      = 0.0;   // training error
 

  // 1. Initialization of look-up table for sigmoid activation function
  sigmoid_init();

  // 2. Building and preprocessing of training and test set
  num_train = wc ("trainfilename");   // number of training patterns
  TrainingSet trainset(num_attr, num_train, "trainfilename");
  trainset.normalize();

  int num_test = wc ("testfilename"); // number of test patterns
  TrainingSet testset(num_attr, num_test, "testfilename");
  testset.normalize();

  // 3. Building a Two-Layer MLP with one hidden layer
  TwoLayer mynet(nclass, nhidden, num_attr);

  // 4. Backpropagation learning algorithm with fixed learning rate eta
  GradientDescent gd(eta);

  // 5. Weight initialization and by pattern training of the MLP
  mynet.init_weights_norm();
  mynet.Learn_by_pattern(trainset, gd, iter, err);

  // 6. Testing of the neural net and printing of the errors
  mynet.test(testset);
  mynet.print_errors(); cout << endl;

  return 0;
}

You can get the source code of the example and the data sets

Compiling and linking the example:
g++ -I LEDA_include_directory -I NEURObjects_include_directory baseappl.cc -o mybaseappl -L path_of_NEURObjects_library -L path_of_LEDA_library -lNO -lL -lm

Note that the shell environment variable LD_LIBRARY_PATH must include the paths to the shared libraries of NEURObjects and LEDA. Alternatively, you can link statically to the libraries by adding the -static option to the command line.
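
For example, with a dynamically linked executable (the library directories below are placeholders for your own installation paths):

```shell
# Append the NEURObjects and LEDA shared-library directories (placeholder
# paths) to the dynamic loader's search path before running the program.
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/NEURObjects/lib:/usr/local/LEDA/lib"

# Then run the compiled example:
# ./mybaseappl
```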



Last Updated February 2001
For comments and suggestions mail to Giorgio Valentini