I'm getting an error when running kubectl on one machine (Windows).
The k8s cluster is running on CentOS 7, Kubernetes 1.7 (master and worker nodes).
Here's my
I got the same error while running $ kubectl get nodes as the root user. I fixed it by pointing the KUBECONFIG environment variable at kubelet.conf:
$ export KUBECONFIG=/etc/kubernetes/kubelet.conf
$ kubectl get nodes
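If you want this to persist across shell sessions, a minimal sketch (assuming a bash shell on the master node and the same kubelet.conf path as above) is to append the export to your shell profile:

$ echo 'export KUBECONFIG=/etc/kubernetes/kubelet.conf' >> ~/.bashrc   # persist for new shells
$ source ~/.bashrc                                                     # reload in the current shell
$ kubectl get nodes                                                    # verify kubectl can reach the cluster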
If the cluster was created with kops, in case of this error you should export the kubecfg, which contains the certs. Point kops at your state store first, then export the kubecfg for your cluster:

$ export KOPS_STATE_STORE=s3://<paste your S3 store>
$ kops export kubecfg <your cluster name>

Now you should be able to access and see the resources of your cluster.
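For example, with a hypothetical cluster named k8s.example.com and a hypothetical state-store bucket s3://example-kops-state, the sequence would look like:

$ export KOPS_STATE_STORE=s3://example-kops-state   # hypothetical bucket name
$ kops export kubecfg k8s.example.com               # hypothetical cluster name
$ kubectl get nodes                                 # verify the exported config works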
In my case I resolved this issue by copying the kubelet configuration to my home kube config:
cat /etc/kubernetes/kubelet.conf > ~/.kube/config
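A slightly safer variant, sketched under the assumptions that ~/.kube may not exist yet, that an existing config might be worth keeping, and that kubelet.conf is only readable by root:

$ mkdir -p ~/.kube                                          # make sure the directory exists
$ cp ~/.kube/config ~/.kube/config.bak 2>/dev/null || true  # back up any existing config
$ sudo cat /etc/kubernetes/kubelet.conf > ~/.kube/config    # redirection runs as your user, so the file stays owned by you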
On GCP:

Check that the gcloud CLI is installed:

$ gcloud version

Then fetch credentials for your cluster:

$ gcloud container clusters get-credentials <clusterName> --zone=<zoneName>

Get clusterName and zoneName from your console, here: https://console.cloud.google.com/kubernetes/list
(ref: x509 errors with Marketplace deployments on GCP / Kubernetes)
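For example, with a hypothetical cluster named my-cluster in the hypothetical zone us-central1-a:

$ gcloud container clusters get-credentials my-cluster --zone=us-central1-a   # hypothetical names
$ kubectl get nodes                                                           # confirm the new context works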
This was happening because my company's network does not allow self-signed certificates through (most likely a corporate proxy intercepting TLS and re-signing traffic with its own CA, which kubectl does not trust). Try switching to a different network.
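A quick way to check whether the network is interfering, sketched here with a placeholder API server address, is to inspect the certificate kubectl actually receives; if the issuer is a corporate proxy CA rather than your cluster CA, TLS is being intercepted:

$ openssl s_client -connect <api-server-host>:6443 -showcerts </dev/null 2>/dev/null | openssl x509 -noout -issuer -subject   # prints issuer and subject of the served certificate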