libsvm

Import trained SVM from scikit-learn to OpenCV

Submitted by 对着背影说爱祢 on 2020-01-02 04:58:05
Question: I'm porting an algorithm that uses a Support Vector Machine from Python (using scikit-learn) to C++ (using the machine learning library of OpenCV). I have access to the trained SVM in Python, and I can import SVM model parameters from an XML file into OpenCV. Since the SVM implementations of both scikit-learn and OpenCV are based on LibSVM, I think it should be possible to use the parameters of the trained scikit SVM in OpenCV. The example below shows an XML file which can be used to initialize
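A minimal sketch of the export side, assuming an already-fitted sklearn.svm.SVC; the attribute names are scikit-learn's public API, while the dictionary layout is only illustrative of what an OpenCV-oriented XML writer would need to contain:

# Sketch: collect the LibSVM-level parameters of a fitted scikit-learn SVC
# so they can be written into whatever XML layout the OpenCV loader expects.
# The data here is synthetic; only the attribute names matter.
import numpy as np
from sklearn.svm import SVC

X = np.random.rand(100, 4)
y = (X[:, 0] > 0.5).astype(int)
clf = SVC(kernel="rbf", C=1.0, gamma=0.5).fit(X, y)

params = {
    "svm_type": "C_SVC",
    "kernel": "RBF",
    "C": clf.C,
    "gamma": clf.gamma,
    "support_vectors": clf.support_vectors_,   # rows of the kernel expansion
    "dual_coef": clf.dual_coef_,               # alpha_i * y_i for each support vector
    "rho": -clf.intercept_,                    # LibSVM's rho is the negated intercept
}
print({k: getattr(v, "shape", v) for k, v in params.items()})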

libsvm Shrinking Heuristics

Submitted by 痞子三分冷 on 2020-01-01 01:53:09
Question: I'm using libsvm in C-SVC mode with a polynomial kernel of degree 2 and I'm required to train multiple SVMs. During training, I am getting either one or even both of these warnings for some of the SVMs that I train: WARNING: using -h 0 may be faster * WARNING: reaching max number of iterations optimization finished, #iter = 10000000 I've found the description of the -h parameter: -h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1) and I've tried to read the explanation
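For reference, a small sketch of passing -h 0, assuming libsvm's bundled Python interface (svmutil) and a placeholder training file:

# Sketch: train a degree-2 polynomial C-SVC with the shrinking heuristic
# disabled (-h 0), via libsvm's own Python wrapper. 'train.libsvm' is a placeholder.
from svmutil import svm_read_problem, svm_train, svm_predict

y, x = svm_read_problem('train.libsvm')

# -s 0: C-SVC, -t 1: polynomial kernel, -d 2: degree 2,
# -h 0: turn off shrinking (the option the first warning refers to)
model = svm_train(y, x, '-s 0 -t 1 -d 2 -c 1 -h 0')

p_labels, p_acc, p_vals = svm_predict(y, x, model)

Whether -h 0 is actually faster is data-dependent; shrinking only affects training speed, not the solution the solver converges to (up to the stopping tolerance).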

libsvm java implementation

Submitted by 不打扰是莪最后的温柔 on 2019-12-31 08:48:14
Question: I am trying to use the Java bindings for libsvm: http://www.csie.ntu.edu.tw/~cjlin/libsvm/ I have implemented a 'trivial' example which is easily linearly separable in y. The data is defined as: double[][] train = new double[1000][]; double[][] test = new double[10][]; for (int i = 0; i < train.length; i++){ if (i+1 > (train.length/2)){ // 50% positive double[] vals = {1,0,i+i}; train[i] = vals; } else { double[] vals = {0,0,i-i-i-2}; // 50% negative train[i] = vals; } } Where the first
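For comparison only (this is a Python analogue, not the Java API the question is about), the same toy problem can be expressed through libsvm's bundled Python interface, which keeps the label vector and the sparse feature dictionaries separate:

# Sketch: 50/50 linearly separable toy data, labels and feature dicts kept apart.
from svmutil import svm_train, svm_predict

y, x = [], []
for i in range(1000):
    if i + 1 > 1000 // 2:                    # second half: positive class
        y.append(1)
        x.append({1: 0.0, 2: float(i + i)})
    else:                                    # first half: negative class
        y.append(0)
        x.append({1: 0.0, 2: float(i - i - i - 2)})

model = svm_train(y, x, '-t 0 -c 1')         # a linear kernel separates this easily
test_x = [{1: 0.0, 2: 5000.0}, {1: 0.0, 2: -5000.0}]
p_labels, _, _ = svm_predict([1, 0], test_x, model)
print(p_labels)                              # should print [1.0, 0.0]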

Windows 7 64-bit libsvm and Python error: function 'svm_get_sv_indices' not found

Submitted by ε祈祈猫儿з on 2019-12-31 05:49:06
Question: I'm working on Windows 7 (64-bit). I've installed Python 2.7.3 (32-bit version) and libsvm-3.13. When I try to launch a simple .py file that imports svmutil, I get an error: C:\libsvm-3.13\python>python provade.py Traceback (most recent call last): File "provade.py", line 1, in <module> from svmutil import * File "C:\libsvm-3.13\python\svmutil.py", line 3, in <module> from svm import * File "C:\libsvm-3.13\python\svm.py", line 288, in <module> fillprototype(libsvm.svm_get_sv_indices, None, [POINTER
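A likely cause is that the libsvm DLL being loaded is older than the Python wrapper, so the svm_get_sv_indices export (added in newer libsvm releases) is simply not there. A hedged check, with the DLL path as a placeholder:

# Sketch: verify whether the DLL that ctypes loads actually exports
# svm_get_sv_indices. If it does not, the DLL was built from an older libsvm
# release and should be rebuilt/replaced to match svm.py. Path is a placeholder.
from ctypes import CDLL

lib = CDLL(r"C:\libsvm-3.13\windows\libsvm.dll")
try:
    lib.svm_get_sv_indices          # ctypes resolves the symbol on attribute access
    print("symbol found: DLL and Python wrapper match")
except AttributeError:
    print("symbol missing: libsvm.dll comes from an older libsvm build")

Note that a 32-bit Python also needs a 32-bit DLL, but an architecture mismatch usually fails earlier, when the DLL is loaded, not at symbol lookup.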

Matlab libsvm - how to find the w coefficients

Submitted by ぃ、小莉子 on 2019-12-29 17:47:30
Question: How can I find what the vector w is, i.e. the perpendicular to the separation plane? Answer 1: This is how I did it here. If I remember correctly, this is based on how the dual form of the SVM optimisation works out. model = svmtrain(...); w = (model.sv_coef' * full(model.SVs)); And the bias is (and I don't really remember why it's negative): bias = -model.rho; Then to do the classification (for a linear SVM), for an N-by-M dataset 'features' with N instances and M features, predictions = sign
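The same relation can be sanity-checked outside MATLAB. Here is a small sketch using scikit-learn's LibSVM-backed SVC as a stand-in: for a linear kernel, dual_coef_ times the support vectors reproduces coef_, and intercept_ plays the role of -rho.

# Sketch: verify w = sv_coef' * SVs and bias = -rho for a linear SVM,
# using scikit-learn (which wraps LibSVM) instead of the MATLAB interface.
import numpy as np
from sklearn.svm import SVC

X = np.random.randn(200, 5)
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

w = clf.dual_coef_ @ clf.support_vectors_      # same as model.sv_coef' * full(model.SVs)
bias = clf.intercept_                          # plays the role of -model.rho

print(np.allclose(w, clf.coef_))                                    # True
print(np.allclose(np.sign(X @ w.ravel() + bias), clf.predict(X)))   # True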

predict.svm does not predict new data

Submitted by 对着背影说爱祢 on 2019-12-29 06:59:16
Question: Unfortunately, I have problems using predict() in the following simple example: library(e1071) x <- c(1:10) y <- c(0,0,0,0,1,0,1,1,1,1) test <- c(11:15) mod <- svm(y ~ x, kernel = "linear", gamma = 1, cost = 2, type="C-classification") predict(mod, newdata = test) The result is as follows: > predict(mod, newdata = test) 1 2 3 4 <NA> <NA> <NA> <NA> <NA> <NA> 0 0 0 0 0 1 1 1 1 1 Can anybody explain why predict() only gives the fitted values of the training sample (x,y) and does not care about

Multi-class classification in libsvm [closed]

Submitted by 試著忘記壹切 on 2019-12-27 17:06:09
Question: I'm working with libsvm and I must implement the classification for multiple classes with one-versus-all. How can I do it? Does libsvm version 2011 use this? I think that my question is not very clear. if libsvm don
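LibSVM's built-in multi-class strategy is one-versus-one, so one-versus-all has to be assembled by hand: train one binary model per class on relabelled data and pick the class with the largest decision value. A hedged sketch using the bundled Python interface (the training file name is a placeholder):

# Sketch: one-versus-all built on top of libsvm's binary C-SVC via svmutil.
# Train one "class k vs. rest" model per class and classify by the largest
# decision value. 'train.libsvm' is a placeholder file in libsvm format.
from svmutil import svm_read_problem, svm_train, svm_predict

y, x = svm_read_problem('train.libsvm')
classes = sorted(set(y))

models = []
for k in classes:
    yk = [1 if label == k else -1 for label in y]   # relabel: k vs. rest
    models.append(svm_train(yk, x, '-s 0 -t 2 -c 1 -q'))

def predict_one_vs_all(sample):
    scores = []
    for m in models:
        _, _, vals = svm_predict([0], [sample], m)
        # libsvm's decision value is positive for the first label the model saw;
        # multiply by that label (+1 or -1) so the score is "positive for class k"
        scores.append(vals[0][0] * m.get_labels()[0])
    return classes[scores.index(max(scores))]

print(predict_one_vs_all(x[0]))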

Libsvm : Wrong input format at line 1

Submitted by 混江龙づ霸主 on 2019-12-25 08:27:14
Question: I am trying to use Libsvm and I got the following behaviour: root@bcfd88c873fa:/home/libsvm# ./svm-train myfile Wrong input format at line 1 root@bcfd88c873fa:/home/libsvm# head -n 5 myfile 2 0:0.00000 8:0.00193 2:0.00000 1:0.00000 10:0.00722 3 6:0.00235 2:0.00000 0:0.00000 1:0.00000 5:0.00155 4 0:0.00000 1:0.00000 2:0.00000 4:0.00187 3 6:0.00121 8:0.00211 1:0.00000 2:0.00000 0:0.00000 3 0:0.00000 2:0.00000 1:0.00000 Can you see anything wrong with the format? It works with other svm
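Nothing is wrong with the values themselves; the likely problem is the feature indices. In the standard libsvm format the indices on each line must be in strictly ascending order and, except for precomputed kernels, start from 1, whereas the lines above start at 0 and are unsorted (0, 8, 2, 1, 10), which is what typically triggers "Wrong input format". A sketch for rewriting such a file (the paths are placeholders):

# Sketch: rewrite a file whose feature indices are unsorted / 0-based into
# valid libsvm format (ascending, 1-based indices).
def fix_libsvm_file(src_path, dst_path):
    with open(src_path) as src, open(dst_path, 'w') as dst:
        for line in src:
            parts = line.split()
            if not parts:
                continue
            label, feats = parts[0], parts[1:]
            pairs = []
            for f in feats:
                idx, val = f.split(':')
                pairs.append((int(idx) + 1, val))   # shift 0-based -> 1-based
            pairs.sort(key=lambda p: p[0])          # libsvm requires ascending indices
            dst.write(label + ' ' + ' '.join('%d:%s' % (i, v) for i, v in pairs) + '\n')

fix_libsvm_file('myfile', 'myfile.fixed')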

Extract coefficients/weights from libsvm model file

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-25 05:04:29
Question: I am using libsvm to create a 2-class classifier. I wish to extract the coefficient/weight of each feature used by the model generated by ./svm-train training.training model.model The model.model file looks like: svm_type c_svc kernel_type rbf gamma 8 nr_class 2 total_sv 442 rho 21 label 1 -1 nr_sv 188 254 SV 7080.357768871263 0:0 1:0.00643 2:0.01046 3:0.00963 4:0.02777 5:0.04338 19:0.04468 528.7111702760092 0:0 1:0.00058 3:0.00086 6:0.01158 7:0.0028 9:0.08991 13:0.0096 ... 391
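One caveat first: the model shown uses kernel_type rbf, and with a non-linear kernel there is no per-feature weight vector in the input space; only a linear-kernel model yields w as the sum of sv_coef_i times SV_i (the same relation as in the MATLAB answer above). Assuming a linear-kernel model, a sketch that reads the weights through libsvm's Python interface instead of parsing the file by hand (the file name is a placeholder):

# Sketch: recover per-feature weights w from a *linear-kernel* libsvm model
# via the bundled Python interface. Not meaningful for the RBF model shown.
from svmutil import svm_load_model

model = svm_load_model('model.model')

sv_coef = model.get_sv_coef()        # per support vector: [alpha_i * y_i]
svs = model.get_SV()                 # per support vector: {index: value} dict

n_features = max(idx for sv in svs for idx in sv if idx > 0)
w = [0.0] * (n_features + 1)         # slot 0 unused; libsvm indices are 1-based
for coef, sv in zip(sv_coef, svs):
    for idx, val in sv.items():
        if idx > 0:                   # skip terminator / index-0 entries
            w[idx] += coef[0] * val

bias = -model.rho[0]                  # rho sits on the underlying C struct; bias is -rho
print(w[1:], bias)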