The one-class nu-Support Vector Machine (nu-SVM) learning technique maps the input data into a much higher-dimensional feature space and then uses a small subset of the training data (the support vectors) to parametrize a decision surface that linearly separates a nu fraction of the training points (labeled as anomalies) from the rest.
The exact solution of the standard one-class nu-SVM assigns at least a nu fraction of the training points as support vectors, yet some of these support vectors may be unnecessary or redundant. The computational cost therefore becomes alarming when SVM-based novelty detectors with nonlinear kernels are trained on very large data sets.
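As a concrete illustration of this property, the following sketch uses scikit-learn's `OneClassSVM` (a standard one-class nu-SVM implementation, not the paper's nu-Anomica) on synthetic data; it shows that roughly a nu fraction of the training points end up as support vectors and are flagged as anomalies. The data set and parameter values are illustrative assumptions.

```python
# Sketch with scikit-learn's OneClassSVM (standard one-class nu-SVM,
# not the proposed nu-Anomica). nu is an upper bound on the fraction of
# training points flagged as anomalies and a lower bound on the
# fraction of support vectors.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 2))  # synthetic 2-D training data (assumption)

nu = 0.1
clf = OneClassSVM(kernel="rbf", nu=nu, gamma="scale").fit(X)

# Fraction of training points retained as support vectors (>= nu).
sv_fraction = len(clf.support_) / len(X)
# Fraction of training points predicted as anomalies (approximately nu).
anomaly_fraction = float(np.mean(clf.predict(X) == -1))

print(f"support-vector fraction: {sv_fraction:.3f}")
print(f"flagged-anomaly fraction: {anomaly_fraction:.3f}")
```

At test time every kernel evaluation involves the support vectors, so a larger support set directly translates into slower predictions, which is the cost the paper targets.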
The proposed nu-Anomica algorithm addresses this problem. The idea is to train the machine so that it closely approximates the exact decision plane using far fewer training points, without losing much of the generalization performance of the classical approach. The procedure closely preserves the accuracy of the standard one-class nu-SVM while reducing both training and test time by several factors.