Question
I'm currently using scikit-learn's KernelPCA to perform dimensionality reduction on my dataset. It provides the isotropic Gaussian (RBF) kernel, which has only a single gamma value. However, I want to implement an anisotropic Gaussian kernel, which has one gamma per input dimension.
I'm aware that KernelPCA has an option for a precomputed kernel, but I couldn't find any code example of it being used for dimensionality reduction.
Does anyone know how to implement a custom kernel in sklearn KPCA?
Answer 1:
I've found the solution to this problem.
First of all, you have to define your own kernel function that returns the gram matrix between the samples.
def customkernel(X1, X2, etc):
    k = yourkernelfunction(X1, X2, etc)
    return k
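For the anisotropic Gaussian case, one possible kernel function (a minimal sketch; the name anisotropic_rbf and the per-dimension gammas argument are just one way to set it up) computes k(x, y) = exp(-sum_d gammas[d] * (x_d - y_d)^2):

import numpy as np

def anisotropic_rbf(X1, X2, gammas):
    """Anisotropic Gaussian kernel with one bandwidth per dimension.

    X1: array of shape (n1, m), X2: array of shape (n2, m),
    gammas: array of shape (m,). Returns the (n1, n2) Gram matrix.
    """
    X1 = np.asarray(X1, dtype=float)
    X2 = np.asarray(X2, dtype=float)
    diff = X1[:, None, :] - X2[None, :, :]             # (n1, n2, m) pairwise differences
    sqdist = np.einsum('ijk,k->ij', diff**2, gammas)   # gamma-weighted squared distances
    return np.exp(-sqdist)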
If we want to fit a dataset x of shape (n, m) with our KernelPCA model and transform it to shape (n, n_princomp), what we need is:
from sklearn.decomposition import KernelPCA

KPCA = KernelPCA(n_components=n_princomp, kernel='precomputed')
gram_mat = customkernel(x, x)                 # (n, n) Gram matrix of the training data
transformed_x = KPCA.fit_transform(gram_mat)
Next, if we want to transform another dataset X of shape (N, m) into shape (N, n_princomp), what we have to do is compute a new Gram matrix with X as X1 and x as X2.
new_gram_mat = customkernel(X, x)             # (N, n) kernel between new data and training data
transformed_X = KPCA.transform(new_gram_mat)
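As a quick sanity check, when all entries of gammas are equal the anisotropic kernel reduces to the isotropic one, so the precomputed route should reproduce the builtin RBF KernelPCA (components can only differ in sign). A sketch, assuming the anisotropic_rbf function above:

import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.RandomState(0)
x = rng.rand(50, 4)                      # toy training data, shape (50, 4)
gammas = np.full(4, 0.5)                 # all gammas equal -> isotropic special case

kpca_rbf = KernelPCA(n_components=2, kernel='rbf', gamma=0.5)
kpca_pre = KernelPCA(n_components=2, kernel='precomputed')

z_rbf = kpca_rbf.fit_transform(x)
z_pre = kpca_pre.fit_transform(anisotropic_rbf(x, x, gammas))

# Should print True: same embedding up to the sign of each component
print(np.allclose(np.abs(z_rbf), np.abs(z_pre), atol=1e-6))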
Source: https://stackoverflow.com/questions/58197672/scikit-learns-kernel-pca-how-to-implement-an-anisotropic-gaussian-kernel-or-an