
SoftDoubleMaxMinOver:

Perceptron-Like Training of Support Vector Machines


1. What is SoftDoubleMaxMinOver?

The SoftDoubleMaxMinOver (1) learning algorithm is a modification of the perceptron learning algorithm (2) that converges to the maximum-margin solution known from the support vector machine approach (6). It is closely related to the MinOver algorithm (3),(4). Its basic building block, the DoubleMinOver algorithm, treats the two data classes separately, in contrast to MinOver, where just one class of augmented patterns is considered. Due to this modification one can prove that, unlike MinOver, DoubleMinOver learns exactly the same class boundary as a standard SVM training algorithm such as SMO.
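To make the iteration concrete, here is a minimal linear sketch of the DoubleMinOver idea. This is a sketch under our own naming and with a fixed iteration count, not the package implementation, which is kernelized and optimized:

    % Minimal linear DoubleMinOver sketch (illustrative only).
    % X: d x n data matrix, y: 1 x n labels in {-1,+1}, tmax: number of iterations.
    function [w, b] = double_minover_sketch(X, y, tmax)
        Xp = X(:, y == +1);                 % positive-class patterns
        Xn = X(:, y == -1);                 % negative-class patterns
        w  = zeros(size(X, 1), 1);
        for t = 1:tmax
            [~, ip] = min(w' * Xp);         % positive pattern with smallest margin
            [~, in] = max(w' * Xn);         % negative pattern with smallest margin
            w = w + Xp(:, ip) - Xn(:, in);  % perceptron-like update, one pattern per class
        end
        b = (min(w' * Xp) + max(w' * Xn)) / 2;  % place the boundary between the classes
    end
    % A new pattern x is then classified via sign(w' * x - b).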

DoubleMaxMinOver is an extension of DoubleMinOver that allows one to learn the so-called support vectors by introducing a forgetting process which removes non-support vectors from the solution (5).

SoftDoubleMaxMinOver is a soft-margin generalization of DoubleMaxMinOver that currently supports two error models. By considering a set of averaged training vectors instead of the original training data, one obtains a soft-margin classification that is identical to the nu-SVM approach (7). Another possibility is to introduce a modified kernel function that maps the soft-margin problem in the original kernel space to a hard-margin problem in a modified kernel space. This approach, the 2-norm error model, is also well known from SVM theory.
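For reference, the standard 2-norm construction adds a scaled identity to the kernel matrix: the modified kernel is K'(x_i, x_j) = K(x_i, x_j) + δ_ij / C, where δ_ij denotes the Kronecker delta and C controls the softness of the margin. A hard-margin solution under K' then corresponds to a 2-norm soft-margin solution under the original kernel K.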

Here, a pure OCTAVE/MATLAB implementation of this algorithm is presented, as well as an OCTAVE/MATLAB package based on the work in (8). The package contains an optimized implementation of the SoftDoubleMaxMinOver learning algorithm and provides both error models. For computational reasons, the latter (the 2-norm error model) is used as the default.


2. Installation

(1) Download the SoftDoubleMaxMinOver OCTAVE/MATLAB package

(2) Extract the tar archive

The archive contains precompiled MATLAB binaries for the Linux x86-32 and x86-64 architectures as well as Linux x86-64 binaries for OCTAVE. For other platforms/operating systems, you have to compile the sources:

(3) Add the -msse2 compiler option to the CFLAGS variable of the MATLAB MEX compiler (mexopts.sh; only required for MATLAB compilation)

(4) Enter the DMMO directory

(5) Compile the C sources by typing make all. Note that the MATLAB MEX compiler has to be in the path.

Note 1: If you don't want to use SSE2, you can disable it by removing the __USE_SSE__ compiler option in the makefile.

Note 2: On the x86-64 architecture, use the __USE_SSE64__ compiler option.

Note 3: The package can also be compiled for OCTAVE. To obtain OCTAVE-compatible MEX files, modify the CC variable in the makefile (replace mex by mkoctfile; see the makefile for more information).
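Once installed (precompiled or freshly built), the package directory only has to be on the OCTAVE/MATLAB search path. A minimal sketch, with a placeholder path:

    % Make the package functions (TrainMaxMinOver etc.) callable; replace the
    % placeholder with the directory the archive was extracted to.
    addpath('/path/to/DMMO');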


3. What does the package contain?

The package contains an optimized C implementation of the SoftDoubleMaxMinOver learning algorithm (functions TrainMaxMinOver and MaxMinOverClassify) that can be used via the MEX interface under OCTAVE or MATLAB. Furthermore, a function that implements a cross-validation scheme for model selection is available (CVMaxMinOver).


4. Usage

The function TrainMaxMinOver trains a classifier on a given data set. Given a trained classifier, the function MaxMinOverClassify can be used to classify unknown data.
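A hypothetical call pattern (the argument lists below are assumptions for illustration; the authoritative signatures are given by help TrainMaxMinOver and help MaxMinOverClassify):

    % Assumed argument order -- verify with "help TrainMaxMinOver".
    model = TrainMaxMinOver(Xtrain, Ytrain);     % train on labeled data
    Ypred = MaxMinOverClassify(model, Xtest);    % classify unseen data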

If no good hyper-parameters are known for the problem, the CVMaxMinOver function can be used to find them by cross-validation, such that the expected error on unknown data is minimized.
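In the same hedged spirit (the parameter handling below is an assumption; see help CVMaxMinOver):

    % Hypothetical: let cross-validation select hyper-parameters, then retrain.
    bestParams = CVMaxMinOver(Xtrain, Ytrain);
    model = TrainMaxMinOver(Xtrain, Ytrain, bestParams);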

Furthermore, the RankFeaturesMargin function can be used to rank the features according to their importance for the decision function.
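Again an assumed one-liner (see help RankFeaturesMargin for the actual interface):

    % Hypothetical: rank features by their importance for the decision function.
    ranking = RankFeaturesMargin(Xtrain, Ytrain);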

Try help functionname under OCTAVE/MATLAB to get detailed information about the parameters of the functions mentioned above.


References:

  1. Thomas Martinetz, Kai Labusch, and Daniel Schneegaß. SoftDoubleMaxMinOver: Perceptron-like Training of Support Vector Machines. IEEE Transactions on Neural Networks, 20(7):1061-1072, 2009.
  2. F. Rosenblatt. The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65:386-408, 1958.
  3. W. Krauth and M. Mézard. Learning Algorithms with Optimal Stability in Neural Networks. Journal of Physics A, 20:745-752, 1987.
  4. T. Martinetz. MinOver Revisited for Incremental Support-Vector-Classification. Lecture Notes in Computer Science, Springer, Heidelberg, 2004.
  5. T. Martinetz. MaxMinOver: A Simple Incremental Learning Procedure for Support Vector Classification. Proceedings of the International Joint Conference on Neural Networks, IEEE Press, 2004.
  6. C. Cortes and V. Vapnik. Support-Vector Networks. Machine Learning, 20:273-297, 1995.
  7. B. Schölkopf, A. J. Smola, R. Williamson, and P. Bartlett. New Support Vector Algorithms. Neural Computation, 12:1083-1121, 2000.
  8. Kai Labusch. MaxMinOver: Ein neues iteratives Verfahren zur Supportvektor-Klassifikation mit Anwendungen in der Gesichtserkennung [MaxMinOver: A New Iterative Method for Support Vector Classification with Applications in Face Recognition]. Diploma thesis, 2004.