The proposed core-set-based LARM is evaluated on twenty datasets, including both LIBSVM datasets and UCI datasets.
In Table 3 we list the initial values for all the LIBSVM parameters we chose.
To evaluate the performance of the MALDTP method, we conduct experiments on two well-known databases (CK+ and JAFFE) using LIBSVM with linear and RBF kernels to classify the facial expressions, where the parameter C for the RBF kernel is set to 100.
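A minimal sketch of such a two-kernel setup, using scikit-learn's `SVC` (which wraps LIBSVM) with synthetic feature vectors standing in for the CK+/JAFFE expression features (the data and the 64-dimensional feature size are assumptions; only C = 100 for the RBF kernel comes from the text):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-in for facial-expression feature vectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))
y = (X[:, 0] > 0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# One classifier per kernel; C = 100 for the RBF kernel as in the text.
linear_clf = SVC(kernel="linear").fit(Xtr, ytr)
rbf_clf = SVC(kernel="rbf", C=100).fit(Xtr, ytr)
print(linear_clf.score(Xte, yte), rbf_clf.score(Xte, yte))
```

With real expression features, the two scores would be compared to choose between the kernels.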
It is implemented in LIBSVM, and it is the approach used in this work.
Process the data set;
Initialize the current positions and the pbest positions of all particles, which are binary strings with each bit representing whether the corresponding gene is selected or not;
do
    Determine the mean best position among the particles by mbest = Get_mbest(pbest); select a suitable value for β;
    for i = 1 to population size M
        Call the LIBSVM toolbox to construct the SVM classifier and get the classification accuracy for the data;
        With the classification accuracy and the number of selected genes (i.
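The per-particle fitness evaluation described above can be sketched as follows, with scikit-learn's `SVC` (a LIBSVM wrapper) supplying the classification accuracy. The weighting of accuracy against the number of selected genes, the weight `alpha`, and the synthetic data are all assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))            # 60 samples, 20 candidate genes
y = (X[:, 0] + X[:, 3] > 0).astype(int)  # labels depend on genes 0 and 3

def fitness(mask, alpha=0.9):
    """Fitness of one particle's binary position.

    Trades cross-validated SVM accuracy against the number of selected
    genes; the linear combination with weight alpha is an assumed form.
    """
    if not mask.any():
        return 0.0  # no gene selected: worst possible fitness
    acc = cross_val_score(SVC(kernel="linear"), X[:, mask], y, cv=3).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

mask = rng.random(20) < 0.5              # one particle's binary position
print(round(fitness(mask), 3))
```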
For the calculated LBP histogram, LIBSVM is used for similarity measurement of the text region and finally gives the text extraction results.
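A sketch of this pipeline, assuming a basic 8-neighbour LBP variant and scikit-learn's LIBSVM-backed `SVC`; the random "images" merely stand in for candidate text and non-text regions:

```python
import numpy as np
from sklearn.svm import SVC

def lbp_histogram(img):
    """Normalised 256-bin histogram of basic 8-neighbour LBP codes."""
    c = img[1:-1, 1:-1]                       # centre pixels
    codes = np.zeros_like(c, dtype=np.uint8)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(np.uint8) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

# Toy classification: random regions stand in for text / non-text.
rng = np.random.default_rng(0)
imgs = rng.integers(0, 256, size=(40, 16, 16), dtype=np.uint8)
feats = np.array([lbp_histogram(im) for im in imgs])
labels = np.repeat([0, 1], 20)
clf = SVC(kernel="linear").fit(feats, labels)
```

In the actual system, regions the classifier labels as text would then be passed on to produce the extraction results.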
One of the most commonly used solvers is LIBSVM.
We used the LIBSVM implementation of SVMs for regression.
By using the software LIBSVM (Chang & Lin, 2007), we first found the best parameters C and γ with cross-validation, then used the best parameters C and γ to train on the whole training set, and finally tested on the testing set.
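The three steps above can be sketched with scikit-learn's `GridSearchCV` over a LIBSVM-backed RBF `SVC`; the exponentially spaced grid and the synthetic data are assumptions (an exponential grid over C and γ is the usual practice, but the paper's exact grid is not given):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Step 1: cross-validated search over (C, gamma) on the training set.
param_grid = {"C": [2.0**k for k in range(-2, 6)],
              "gamma": [2.0**k for k in range(-6, 2)]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5).fit(Xtr, ytr)

# Step 2: refit=True (the default) retrains the best (C, gamma) on the
# whole training set.  Step 3: evaluate on the held-out test set.
best = search.best_estimator_
print(search.best_params_, best.score(Xte, yte))
```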
Extensions such as parameter optimization, feature selection, enhanced cross-validation (CV) options, the one-versus-all training scheme, and report generation were implemented in a C library on top of LIBSVM.
Two classic SVM regression methods, nu-SVR and epsilon-SVR, are provided by LIBSVM.
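A minimal sketch of the two variants, using scikit-learn's `SVR` (epsilon-SVR) and `NuSVR` (nu-SVR), both of which wrap LIBSVM; the synthetic sine-curve data and the hyperparameter values are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR, NuSVR

# Noisy samples of a sine curve as a simple regression target.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=120)

eps_svr = SVR(kernel="rbf", C=10, epsilon=0.1).fit(X, y)  # epsilon-SVR
nu_svr = NuSVR(kernel="rbf", C=10, nu=0.5).fit(X, y)      # nu-SVR
print(eps_svr.score(X, y), nu_svr.score(X, y))            # R^2 scores
```

epsilon-SVR fixes the width of the insensitive tube directly, while nu-SVR controls it indirectly through the fraction of support vectors.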
These data are trained on LIBSVM with four methods, namely the grid search method, the bilinear search method, the improved bilinear search method, and the bilinear grid search method.