This page provides some miscellaneous tools based on LIBSVM (and LIBLINEAR). Roughly, they include the following.
Disclaimer: We do not take any responsibility for damage or other problems caused by using this software and these data sets.
Authors: Mu-Chu Lee and Wei-Lin Chiang
Authors: Cheng-Hao Tsai, Chieh-Yen Lin, and Wei-Lin Chiang
Please download the zip file. The usage is the same as LIBLINEAR except for a new option "-M". Specify "-M 1" to use one-versus-one multi-class classification. For example:
> train -M 1 dataset
Authors: Hsiang-Fu Yu, Chia-Hua Ho, and Yu-Chin Juan
For linear rankSVM, we extend LIBLINEAR to include the methods proposed in the following paper:
Ching-Pei Lee and Chih-Jen Lin. Large-scale linear rankSVM. Neural Computation, 26(2014), 781-817.
Please download the zip file. Details of using this code are in the README.ranksvm file. Except the new solver for rankSVM and the new data format supported in this extension, the usage is the same as LIBLINEAR.
For kernel rankSVM, we extend LIBSVM to have the method in
Please download the zip file. Details of using this code are in the README.ranksvm file.
Authors: Ching-Pei Lee and Tzu-Ming Kuo
Because of the conversion from 32-bit to 64-bit integers, some warning messages may appear during compilation. Please ignore these warnings.
Author: Yu-Chin Juan
To use, put the code under the compiled liblinear/matlab directory, and open octave or matlab:
> [y,x] = libsvmread('mydata');
> size_acc(y,x);
Currently, only linear classification is supported.
Examples:
Author: Po-Wei Wang
This code implements methods proposed in the following papers
Please download the zip file. Details of using this code are in the README.cdblock file. Except new parameters for this extension, the usage is the same as LIBLINEAR.
Authors: Hsiang-Fu Yu and Kai-Wei Chang
Authors: Ming-Wei Chang, Hsuan-Tien Lin, Ming-Hen Tsai, Chia-Hua Ho, and Hsiang-Fu Yu
The implementation is based on one method proposed in the paper
Yin-Wen Chang, Cho-Jui Hsieh, Kai-Wei Chang, Michael Ringgaard, and Chih-Jen Lin. Low-degree Polynomial Mappings of Data for SVM, 2009.
Please download the zip file here. Details of using this code are in the README.poly2 file. Except new parameters for the degree-2 mapping, the usage is the same as LIBLINEAR.
Authors: Yin-Wen Chang, Cho-Jui Hsieh, Kai-Wei Chang, and Yu-Chin Juan
For some unbalanced data sets, accuracy may not be a good criterion for evaluating a model. This tool enables LIBSVM and LIBLINEAR to conduct cross-validation and prediction with respect to different criteria (F-score, AUC, etc.).
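For illustration only (not part of the evaluation extension itself), the F-score criterion can be sketched in plain Python; the +1/-1 label convention matches the tools on this page, and the function name is an assumption of this sketch:

```python
def f_score(y_true, y_pred):
    """F-score (F1) for the +1 class: the harmonic mean of
    precision and recall, a common criterion for unbalanced data.
    Labels are assumed to be +1 / -1."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == -1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == -1)
    if tp == 0:
        return 0.0  # no true positives: precision or recall is zero
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

For example, f_score([1, 1, -1, -1], [1, -1, 1, -1]) gives 0.5 (precision and recall are both 1/2).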
Details
Authors: Hsiang-Fu Yu, Chia-Hua Ho, Cheng-Hao Tsai, and Jui-Nan Yen
Assume you have 20,000 images of 200 users:
Author: Ming-Fang Weng
Author: Guo-Xun Yuan
Usage: ./fselect.py training_file [testing_file]
Output files: .fscore shows the importance of features, .select gives the running log, and .pred gives testing results.
More information about this implementation can be found in Y.-W. Chen and C.-J. Lin. Combining SVMs with various feature selection strategies. To appear in the book "Feature Extraction, Foundations and Applications," 2005. This implementation is still preliminary; more comments are very welcome.
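As a sketch of the F-score criterion used for ranking features, following the formula in the Chen and Lin chapter cited above (the function name and plain-list interface are assumptions of this sketch, not the actual fselect.py code):

```python
def feature_f_score(pos, neg):
    """F-score of a single feature (Chen and Lin, 2005): the
    between-class scatter of the feature's class means divided by
    the within-class sample variances.  pos/neg are the feature's
    values on positive and negative instances."""
    m = sum(pos + neg) / (len(pos) + len(neg))  # overall mean
    mp = sum(pos) / len(pos)                    # mean on +1 class
    mn = sum(neg) / len(neg)                    # mean on -1 class
    vp = sum((x - mp) ** 2 for x in pos) / (len(pos) - 1)
    vn = sum((x - mn) ** 2 for x in neg) / (len(neg) - 1)
    return ((mp - m) ** 2 + (mn - m) ** 2) / (vp + vn)
```

A larger value means the feature separates the two classes better; for example, feature_f_score([1, 2, 3], [4, 5, 6]) is 2.25.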
Author: Yi-Wei Chen
The JavaScript-based toy is available on the LIBSVM home page. Three JavaScript files are included by the HTML page: libsvm-js-interfaces.js, libsvm-js-interfaces-wrappers.js, and svm-toy.js. This HTML example shows how to use these scripts. Note that libsvm-js-interfaces.js is generated by emscripten:
A. Zakai. Emscripten: an llvm-to-javascript compiler. In Proceedings of the ACM international conference companion on object oriented programming systems languages and applications companion. 2011.
If you want to generate libsvm-js-interfaces.js yourself, in addition to the LIBSVM package, you also need js-interfaces.c and a revised Makefile. You must install emscripten, LLVM, and nodejs on your system, because emscripten takes LLVM bytecode generated from C/C++ and requires the latest version of nodejs. We provide a simple shell script get_emscripten.sh, so you can put it in the desired directory and run it for automatic installation. You then must properly set
EMSCRIPTEN_ROOT = '$dir/emscripten/'
LLVM_ROOT = '$dir/emscripten-fastcomp/build/Release/bin'
NODE_JS = '$dir/node/node'
in your ~/.emscripten, where $dir is where you installed emscripten, LLVM, and nodejs.
Author: Chia-Hua Ho
A simple applet demonstrating SVM classification and regression in 3D. It extends the java svm-toy in the LIBSVM package.
Note: libsvm does support multi-class classification. The code here implements some extensions for experimental purposes.
This code implements multi-class classification and probability estimates using 4 types of error correcting codes. Details of the 4 types of ECCs and the algorithms can be found in the following paper:
T.-K. Huang, R. C. Weng, and C.-J. Lin. Generalized Bradley-Terry Models and Multi-class Probability Estimates. Journal of Machine Learning Research, 7(2006), 85-115. A (very) short version of this paper appears in NIPS 2004.
The code can be downloaded here. The installation is the same as the standard LIBSVM package, and different types of ECCs are specified with the "-i" option. Type "svm-train" without any arguments to see the usage. Note that both "one-against-one" and "one-against-the-rest" multi-class strategies are part of the implementation.
If you specify "-b" in training and testing, you get probability estimates, and the predicted label is the one with the largest estimate. If you do not specify "-b", classification is based on decision values. In that case we use the "exponential-loss" method in the paper:
Allwein et al. Reducing multiclass to binary: a unifying approach for margin classifiers. Journal of Machine Learning Research, 1:113-141, 2001,
to predict the class label. For one-against-the-rest (also called 1vsall), this is the same as the commonly used rule
argmax_i (decision value of class i vs. the rest).
For one-against-one, it is different from the max-win strategy used in LIBSVM.
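As a minimal sketch of the one-against-the-rest rule above (the function name is an assumption of this sketch, not from the package):

```python
def predict_1vsall(dec_values):
    """argmax_i (decision value of class i vs. the rest): the
    commonly used one-against-the-rest prediction rule.
    dec_values[i] is the decision value of class i vs. the rest."""
    return max(range(len(dec_values)), key=lambda i: dec_values[i])
```

For example, predict_1vsall([0.1, -0.3, 0.7]) returns class index 2.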
MATLAB code for experiments in our paper is available here
Author: Tzu-Kuo Huang
T.-F. Wu, C.-J. Lin, and R. C. Weng. Probability Estimates for Multi-class Classification by Pairwise Coupling. Journal of Machine Learning Research, 2004. A short version appears in NIPS 2003.
Since version 2.6, LIBSVM already includes one of the methods here, so you may directly use the standard LIBSVM unless you are interested in doing comparisons. Please download the tgz file here. The data used in the paper is available here. Please then check the README for installation.
Matlab programs for the synthetic data experiment in the paper can be found in this directory. The main program is fig1a.m
Author: Tingfan Wu (svm [at] future.csie.org)
Author: Chih-Chung Chang
This tool gives the ROC (Receiver Operating Characteristic) curve and AUC (Area Under Curve) by ranking the decision values. Note that we assume labels are +1 and -1. Multi-class is not supported yet.
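To illustrate what "AUC by ranking the decision values" means, here is a minimal O(n^2) sketch (the function name is an assumption of this sketch; the +1/-1 label convention matches the note above):

```python
def auc(labels, dec_values):
    """AUC by ranking decision values: the fraction of (+1, -1)
    pairs in which the +1 instance receives the larger decision
    value, counting ties as one half."""
    pos = [d for y, d in zip(labels, dec_values) if y == 1]
    neg = [d for y, d in zip(labels, dec_values) if y == -1]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

For example, auc([1, 1, -1, -1], [0.9, 0.2, 0.4, 0.1]) gives 0.75: three of the four (+1, -1) pairs are ranked correctly.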
You can use either MATLAB or Python.
If using MATLAB, you need to
> help plotroc
If using Python, you need to
plotroc.py [-v cv_fold | -T testing_file] [libsvm_options] training_file
> plotroc.py -v 5 -c 10 ../heart_scale
To use LIBLINEAR, you need the following modifications
Authors: Tingfan Wu (svm [at] future.csie.org), Chien-Chih Wang (d98922007 [at] ntu.edu.tw), and Hsiang-Fu Yu
Usage: grid.py [-log2c begin,end,step] [-log2g begin,end,step] [-log2p begin,end,step] [-v fold] [-svmtrain pathname] [-gnuplot pathname] [-out pathname] [-png pathname] [additional parameters for svm-train] dataset
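The -log2c/-log2g/-log2p arguments describe exponentially spaced parameter grids. A sketch of how one begin,end,step triple expands (the function name is an assumption of this sketch, not from grid.py):

```python
def log2_grid(begin, end, step):
    """Expand a begin,end,step triple into the values
    2^begin, 2^(begin+step), ..., stopping at 2^end.
    A negative step walks the exponents downward."""
    vals = []
    v = begin
    while (step > 0 and v <= end) or (step < 0 and v >= end):
        vals.append(2.0 ** v)
        v += step
    return vals
```

For example, log2_grid(-1, 1, 1) gives [0.5, 1.0, 2.0]; grid.py cross-validates every (C, gamma) pair taken from the two expanded ranges.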
Authors: Hsuan-Tien Lin (initial modification), Tzu-Kuo Huang (the epsilon parameter), and Wei-Cheng Chang.
Author: Wei-Chun Kao, with help from Leland Wang, Kai-Min Chung, and Tony Sun
Please download the .tgz file here. After making the binary files, type svm-train to see the usage. It includes different methods to implement RSVM.
To speed up the code, you may want to link the code to optimized BLAS/LAPACK or ATLAS.
Author: Kuan-Min Lin
For details of our SVDD formulation and implementation, please see W.-C. Chang, C.-P. Lee, and C.-J. Lin. A Revisit to Support Vector Data Description (SVDD). Technical report, 2013. Note that you should choose a value of C in [1/l, 1], where l is the number of data points. Models with C > 1 are all the same, and so are models with C < 1/l.
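The constraint on C above can be made concrete with a tiny sketch (the function name is an assumption of this sketch): since values outside [1/l, 1] give the same model as the nearest endpoint, one may as well clamp:

```python
def clamp_svdd_c(c, l):
    """Clamp the SVDD parameter C into [1/l, 1]; per the note
    above, C > 1 behaves like C = 1 and C < 1/l like C = 1/l,
    where l is the number of data points."""
    return min(max(c, 1.0 / l), 1.0)
```

For example, with l = 100 any C above 1 is clamped to 1, and any C below 0.01 is clamped to 0.01.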
For details of the square of the radius, please see K.-M. Chung, W.-C. Kao, C.-L. Sun, L.-L. Wang, and C.-J. Lin. Radius Margin Bounds for Support Vector Machines with the RBF Kernel. Neural Computation, 15(2003), 2643-2681.
Authors: Leland Wang, Holger Froehlich (University of Tuebingen), Konrad Rieck (Fraunhofer institute), Chen-Tse Tsai, Tse-Ju Lin, Wei-Cheng Chang, Ching-Pei Lee
int nr_class = model->nr_class;
in the subroutine svm_predict() of svm.cpp with this segment of code. Note that this change makes svm_predict() work only for classification, not for regression or one-class SVM.
This follows from the code used in the paper: C.-W. Hsu and C.-J. Lin. A comparison of methods for multi-class support vector machines. IEEE Transactions on Neural Networks.
Author: Chih-Wei Hsu