Question
I was developing a new algorithm that generates a modified kernel matrix for training an SVM and ran into a strange problem.
For testing purposes I was comparing the SVM models learned via the kernelMatrix interface and the normal kernel interface. For example:
library(kernlab)

# Model with the kernel matrix computed internally by ksvm
svp1 <- ksvm(x, y, type = "C-svc", kernel = vanilladot(), scaled = FALSE)

# Model with the kernel matrix computed outside ksvm
K <- kernelMatrix(vanilladot(), x)
svp2 <- ksvm(K, y, type = "C-svc")

# Compare the number of support vectors in the two models
identical(nSV(svp1), nSV(svp2))
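For reference, x and y above are simply the feature matrix and class labels of whatever dataset I am testing on; the glass0 data itself is not included here, so a built-in two-class subset can serve as a stand-in (whether the two models actually diverge depends on the dataset):

library(kernlab)
# Placeholder data: a two-class subset of iris standing in for glass0
data(iris)
two_class <- iris[iris$Species != "virginica", ]
x <- as.matrix(two_class[, 1:4])
y <- droplevels(two_class$Species)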
Note that I have turned scaling off, as I am not sure how to perform scaling on a kernel matrix.
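One workaround I considered (my own assumption, not something from the kernlab documentation) is to scale the raw features myself before computing the kernel matrix, which should roughly mirror what scaled=TRUE does to the predictors:

x_scaled <- scale(x)                              # center each column and scale to unit variance
K_scaled <- kernelMatrix(vanilladot(), x_scaled)  # linear kernel on the scaled features
svp2_scaled <- ksvm(K_scaled, y, type = "C-svc")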
From my understanding, both svp1 and svp2 should return the same model. However, I observed that this is not true for some datasets, for example glass0 from KEEL.
What am I missing here?
Answer 1:
I think this has to do with the same issue posted here. kernlab appears to treat the ksvm calculation differently when vanilladot() is passed explicitly, because its class is 'vanillakernel' instead of 'kernel'.
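You can see the class difference directly (this is just a quick check of my reading of the issue, not the internal ksvm code path itself):

library(kernlab)
class(vanilladot())         # "vanillakernel", the special-cased class
is(vanilladot(), "kernel")  # TRUE: it extends 'kernel', but ksvm appears to branch on the more specific class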
If you define your own vanilladot kernel with a class of 'kernel' instead of 'vanillakernel', the code becomes equivalent for both:
kfunction.k <- function() {
  k <- function(x, y) { crossprod(x, y) }  # plain dot product, i.e. a linear kernel
  class(k) <- "kernel"                     # generic 'kernel' class, not 'vanillakernel'
  k
}

l <- 0.1; C <- 1/(2*l)   # note: C is defined here but not passed to ksvm, so the default C = 1 is used

svp1 <- ksvm(x, y, type = "C-svc", kernel = kfunction.k(), scaled = FALSE)
K <- kernelMatrix(kfunction.k(), x)
svp2 <- ksvm(K, y, type = "C-svc", kernel = "matrix", scaled = FALSE)
identical(nSV(svp1), nSV(svp2))
It's worth noting that svp1 and svp2 are both different from their values in the original code because of this change.
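If you want a stronger check than the support-vector counts, you could also compare the fitted coefficients; alpha() and b() are the kernlab accessors, and a tolerance-based comparison is safer than identical() here because of possible floating-point differences (the orderings may still differ between the two fits, so treat this as a rough check):

all.equal(alpha(svp1), alpha(svp2))
all.equal(b(svp1), b(svp2))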
Source: https://stackoverflow.com/questions/27525011/kernel-matrix-computation-outside-svm-training-in-kernlab