I'm using Terraform v0.14.2 and I'm trying to create an EKS cluster, but I'm having a problem when the nodes join the cluster. The node group status stays stuck in "Creating" until it fails with this error:
Error: error waiting for EKS Node Group (EKS_SmartSteps:EKS_SmartSteps-worker-node-uk) creation: NodeCreationFailure: Instances failed to join the kubernetes cluster. Resource IDs: [i-00c4bac08b3c42225]

My code to deploy is:
resource "aws_eks_node_group" "managed_workers" {
for_each = local.ob
cluster_name = aws_eks_cluster.cluster.name
node_group_name = "${var.cluster_name}-worker-node-${each.value}"
node_role_arn = aws_iam_role.managed_workers.arn
subnet_ids = aws_subnet.private.*.id
scaling_config {
desired_size = 1
max_size = 1
min_size = 1
}
launch_template {
id = aws_launch_template.worker-node[each.value].id
version = aws_launch_template.worker-node[each.value].latest_version
}
depends_on = [
kubernetes_config_map.aws_auth_configmap,
aws_iam_role_policy_attachment.eks-AmazonEKSWorkerNodePolicy,
aws_iam_role_policy_attachment.eks-AmazonEKS_CNI_Policy,
aws_iam_role_policy_attachment.eks-AmazonEC2ContainerRegistryReadOnly,
]
lifecycle {
create_before_destroy = true
ignore_changes = [scaling_config[0].desired_size, scaling_config[0].min_size]
}
}
resource "aws_launch_template" "worker-node" {
for_each = local.ob
image_id = data.aws_ssm_parameter.cluster.value
name = "${var.cluster_name}-worker-node-${each.value}"
instance_type = "t3.medium"
block_device_mappings {
device_name = "/dev/xvda"
ebs {
volume_size = 20
volume_type = "gp2"
}
}
tag_specifications {
resource_type = "instance"
tags = {
"Instance Name" = "${var.cluster_name}-node-${each.value}"
Name = "${var.cluster_name}-node-${each.value}"
}
}
}
In fact, I can see the instances in EC2, and the nodes appear attached to the EKS cluster, but with this status error:

"Instances failed to join the kubernetes cluster"

I can't work out where the problem is because the error messages don't give any more info.
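
One thing I suspect, but haven't confirmed, is the user data: since the launch template sets image_id explicitly, my understanding is that EKS won't merge its own bootstrap user data into the instances, so the node may never run /etc/eks/bootstrap.sh and therefore never registers with the cluster. This is roughly the user_data I'm considering adding inside the aws_launch_template block (just a sketch, untested; the cluster attribute references are my assumption, and it presumes the SSM parameter points at an EKS-optimized AMI that ships /etc/eks/bootstrap.sh):

resource "aws_launch_template" "worker-node" {
  # ... arguments as above ...

  # Explicitly bootstrap the node against the cluster, since supplying a
  # custom image_id apparently means EKS won't inject this user data itself.
  user_data = base64encode(<<-EOT
    #!/bin/bash
    set -ex
    /etc/eks/bootstrap.sh ${aws_eks_cluster.cluster.name} \
      --apiserver-endpoint '${aws_eks_cluster.cluster.endpoint}' \
      --b64-cluster-ca '${aws_eks_cluster.cluster.certificate_authority[0].data}'
  EOT
  )
}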
Any ideas?
Thanks!