I can use Terraform to deploy a Kubernetes cluster in GKE. I have then set up the Kubernetes provider as follows:
provider "kubernetes" {
host = "${data.google_container_cluster.primary.endpoint}"
client_certificate = "${base64decode(data.google_container_cluster.primary.master_auth.0.client_certificate)}"
client_key = "${base64decode(data.google_container_cluster.primary.master_auth.0.client_key)}"
cluster_ca_certificate = "${base64decode(data.google_container_cluster.primary.master_auth.0.cluster_ca_certificate)}"
}
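For context, the provider above reads its credentials from a "google_container_cluster" data source along these lines (a sketch; the cluster name and zone are placeholders, not from the original question):

data "google_container_cluster" "primary" {
  name = "my-gke-cluster"  # placeholder: your cluster name
  zone = "europe-west1-b"  # placeholder: your cluster zone
}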
By default, Terraform interacts with Kubernetes as the user "client", which has no permission to create (for example) deployments. So I get this error when I try to apply my changes with Terraform:
Error: Error applying plan:
1 error(s) occurred:
* kubernetes_deployment.foo: 1 error(s) occurred:
* kubernetes_deployment.foo: Failed to create deployment: deployments.apps is forbidden: User "client" cannot create deployments.apps in the namespace "default"
I don't know how to proceed now. How should I grant these permissions to the "client" user?
If the following fields are added to the provider, I am able to create deployments, although after reading the documentation it seems these credentials are used for HTTP basic authentication with the cluster, which is insecure over the internet.
  username = "${data.google_container_cluster.primary.master_auth.0.username}"
  password = "${data.google_container_cluster.primary.master_auth.0.password}"
Is there a better way of doing this?
- You can use the service account that is running Terraform:
data "google_client_config" "default" {}
provider "kubernetes" {
host = "${google_container_cluster.default.endpoint}"
token = "${data.google_client_config.default.access_token}"
cluster_ca_certificate = "${base64decode(google_container_cluster.default.master_auth.0.cluster_ca_certificate)}"
load_config_file = false
}
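This works because authorization for the token is handled by Google IAM instead of the client certificate. As a sketch (not part of the original answer; the project ID and service account name are placeholders), the account running Terraform can be granted cluster access through IAM:

resource "google_project_iam_member" "terraform_gke_admin" {
  project = "my-project"                                                  # placeholder project ID
  role    = "roles/container.admin"                                       # full access to GKE clusters and their Kubernetes API
  member  = "serviceAccount:terraform@my-project.iam.gserviceaccount.com" # placeholder service account
}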
OR
- Give permissions to the default "client" user.
- But you need valid authentication on the GKE cluster for the provider to run this: oops, circular dependency here. The binding that would authorize "client" can only be created by an identity that is already authorized, for example the service account token from the first option.
resource "kubernetes_cluster_role_binding" "default" {
metadata {
name = "client-certificate-cluster-admin"
}
role_ref {
api_group = "rbac.authorization.k8s.io"
kind = "ClusterRole"
name = "cluster-admin"
}
subject {
kind = "User"
name = "client"
api_group = "rbac.authorization.k8s.io"
}
subject {
kind = "ServiceAccount"
name = "default"
namespace = "kube-system"
}
subject {
kind = "Group"
name = "system:masters"
api_group = "rbac.authorization.k8s.io"
}
}
It looks like the user you are using is missing the required RBAC role for creating deployments. Make sure that user has the correct verbs for the deployments resource. You can take a look at these Role examples to get an idea.
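For instance, a minimal sketch of a namespaced Role that only allows managing deployments, bound to the "client" user (resource names here are illustrative, not from the original answer):

resource "kubernetes_role" "deployment_manager" {
  metadata {
    name      = "deployment-manager"
    namespace = "default"
  }
  rule {
    api_groups = ["apps"]
    resources  = ["deployments"]
    verbs      = ["get", "list", "watch", "create", "update", "patch", "delete"]
  }
}

resource "kubernetes_role_binding" "client_deployment_manager" {
  metadata {
    name      = "client-deployment-manager"
    namespace = "default"
  }
  role_ref {
    api_group = "rbac.authorization.k8s.io"
    kind      = "Role"
    name      = "${kubernetes_role.deployment_manager.metadata.0.name}"
  }
  subject {
    kind      = "User"
    name      = "client"
    api_group = "rbac.authorization.k8s.io"
  }
}

This is narrower than binding cluster-admin, which is preferable if the user only needs to manage deployments.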
You need to provide both sets of credentials (username/password and the client certificate). Check this example of how to integrate the Kubernetes provider with the Google provider.
Example of how to configure the Kubernetes provider:
provider "kubernetes" {
host = "${var.host}"
username = "${var.username}"
password = "${var.password}"
client_certificate = "${base64decode(var.client_certificate)}"
client_key = "${base64decode(var.client_key)}"
cluster_ca_certificate = "${base64decode(var.cluster_ca_certificate)}"
}
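For this snippet to be self-contained, the referenced variables have to be declared and fed from the cluster's attributes, for example from the data source shown in the question (a sketch):

variable "host" {}
variable "username" {}
variable "password" {}
variable "client_certificate" {}
variable "client_key" {}
variable "cluster_ca_certificate" {}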
Source: https://stackoverflow.com/questions/54364515/managing-gke-and-its-deployments-with-terraform