I am deploying an Amazon Elastic Kubernetes Service (EKS) cluster on AWS. While deploying the cluster from my local machine I am running into a small issue, though it may not strictly be an error.
When I deploy the EKS cluster using my Terraform configuration from my local machine, Terraform provisions all of the required infrastructure on AWS. But when it reaches the step that has to talk to the cluster through kubectl, kubectl is not yet configured against the newly created cluster, so Terraform throws an error.
I can easily work around this by pointing kubectl at the newly created cluster with the command below, but I don't want to do it manually. Is there a way to have this configuration happen automatically as part of the deployment?
```
aws eks --region us-west-2 update-kubeconfig --name clustername
```
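One idea I had is to wrap that command in a null_resource with a local-exec provisioner, so it runs automatically once the cluster exists. A rough sketch of what I mean (aws_eks_cluster.my_cluster is just a placeholder for however the cluster is actually declared in my configuration):

```
# Placeholder resource name: replace aws_eks_cluster.my_cluster with the
# actual EKS cluster resource from the configuration.
resource "null_resource" "update_kubeconfig" {
  # Re-run the command if the cluster is ever replaced.
  triggers = {
    cluster_name = aws_eks_cluster.my_cluster.name
  }

  provisioner "local-exec" {
    # Same command as above, but it only runs after the cluster
    # has actually been created.
    command = "aws eks --region us-west-2 update-kubeconfig --name ${aws_eks_cluster.my_cluster.name}"
  }
}
```

But I'm not sure this is the cleanest approach, so I'd like to know if there is a better or more idiomatic way.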
FYI - I am using the AWS CLI.
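Alternatively, I have seen suggestions to skip the kubeconfig step entirely and point Terraform's kubernetes provider at the new cluster's outputs directly, roughly like the following (again, aws_eks_cluster.my_cluster is a placeholder for my actual cluster resource). Would something like this be the recommended way?

```
# Fetch a short-lived authentication token for the new cluster.
data "aws_eks_cluster_auth" "my_cluster" {
  name = aws_eks_cluster.my_cluster.name
}

# Configure the kubernetes provider from the cluster's own outputs,
# so no local kubeconfig is needed at all.
provider "kubernetes" {
  host                   = aws_eks_cluster.my_cluster.endpoint
  cluster_ca_certificate = base64decode(aws_eks_cluster.my_cluster.certificate_authority[0].data)
  token                  = data.aws_eks_cluster_auth.my_cluster.token
}
```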