
I'm trying to create a K8s cluster on Amazon EKS with Terraform. All the code is on GitHub: https://github.com/amorfis/aws-eks-terraform

The access_key and secret are configured for a user that has the necessary policy, as described in README.md.

I run terraform init, then terraform apply, and it fails with the following error:

module.eks.null_resource.update_config_map_aws_auth (local-exec): error: unable to recognize "aws_auth_configmap.yaml": Unauthorized

I also checked the module's code, and it looks like it should create two files, aws_auth_configmap.yaml and kube_config.yaml, but instead I see two differently named files created: kubeconfig_eks-cluster-created-with-tf and config-map-aws-auth_eks-cluster-created-with-tf.yaml.


1 Answer


The problem here seems to be that you authenticate with an assumed role (AssumedRole), but the module then runs a local-exec provisioner to apply the aws-auth ConfigMap, and that step cannot pick up the assumed-role credentials, which is why it fails with Unauthorized.

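Under the hood, that null_resource applies the generated ConfigMap with kubectl, conceptually something like the sketch below (not the module's exact command, which varies by version). This also explains the file names you saw: the module suffixes both generated files with the cluster name.

kubectl apply \
  -f config-map-aws-auth_eks-cluster-created-with-tf.yaml \
  --kubeconfig kubeconfig_eks-cluster-created-with-tf

If aws-iam-authenticator in that kubeconfig picks up your default credentials rather than the assumed role, the API server rejects the request as Unauthorized.
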
What you need is something like the following, taken from the official example, where you add kubeconfig_aws_authenticator_env_variables to the module:

module "my-cluster" {
  source       = "terraform-aws-modules/eks/aws"
  cluster_name = "my-cluster"

  # Make the generated kubeconfig use the named AWS profile
  kubeconfig_aws_authenticator_env_variables = {
    AWS_PROFILE = "NameOfProfile"
  }

  subnets      = ["subnet-abcde012", "subnet-bcde012a", "subnet-fghi345a"]
  vpc_id       = "vpc-1234556abcdef"

  worker_groups = [
    {
      instance_type = "m4.large"
      asg_max_size  = 5
    }
  ]

  tags = {
    environment = "test"
  }
}

Note: the following is the part that was added:

kubeconfig_aws_authenticator_env_variables = {
  AWS_PROFILE = "NameOfProfile"
}

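These environment variables are written into the generated kubeconfig, so every call that goes through it, including the module's own local-exec step, runs aws-iam-authenticator with AWS_PROFILE set. The user entry in that kubeconfig looks roughly like this simplified sketch (the exact output depends on the module version):

users:
- name: eks_my-cluster
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      command: aws-iam-authenticator
      args:
        - "token"
        - "-i"
        - "my-cluster"
      env:
        - name: AWS_PROFILE
          value: "NameOfProfile"
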
Replace NameOfProfile with whatever profile name you have defined in ~/.aws/config.
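
For reference, an assumed-role profile in ~/.aws/config typically looks like this (the profile name, role ARN, and region below are placeholders):

[profile NameOfProfile]
role_arn = arn:aws:iam::123456789012:role/eks-admin
source_profile = default
region = us-east-1

Here source_profile points at the profile whose access key and secret are allowed to assume the role.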