This looks like an AWS authorization issue. At cluster creation, only the IAM user that created the cluster has admin rights on it, so you may need to add your own IAM user first.
1- Start by verifying the IAM user identity implicitly used by all your commands:
aws sts get-caller-identity
If your aws-cli is configured correctly, you will get an output similar to this:
{
    "UserId": "ABCDEFGHIJK",
    "Account": "12344455555",
    "Arn": "arn:aws:iam::12344455555:user/Toto"
}
We will refer to the value in Account as YOUR_AWS_ACCOUNT_ID in step 3 (in this example YOUR_AWS_ACCOUNT_ID="12344455555").
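If you prefer to grab that value directly (assuming a Bash-like shell), the AWS CLI can extract just the account id:

# Store the caller's AWS account id in a shell variable
YOUR_AWS_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
echo "$YOUR_AWS_ACCOUNT_ID"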
2- Once you have this identity, you have to grant it EKS permissions by mapping it in the cluster and binding it to a role, which is what steps 3 and 4 do.
3- You will need to edit the ConfigMap used by kubectl to add your user:
kubectl edit -n kube-system configmap/aws-auth
In the editor that opens, pick a username to refer to yourself on the cluster, YOUR_USER_NAME (for simplicity you may reuse your AWS user name, e.g. Toto from step 1); you will need it again in step 4. Also use the AWS account id you found in your identity info at step 1, YOUR_AWS_ACCOUNT_ID (don't forget to keep the quotes ""). Add them as follows in the mapUsers and mapAccounts sections:
mapUsers: |
  - userarn: arn:aws:iam::YOUR_AWS_ACCOUNT_ID:user/Toto
    username: YOUR_USER_NAME
    groups:
      - system:masters
mapAccounts: |
  - "YOUR_AWS_ACCOUNT_ID"
4- Finally, you need to create a role binding on the Kubernetes cluster for the user specified in the ConfigMap:
kubectl create clusterrolebinding cluster-admin-binding \
--clusterrole cluster-admin \
--user YOUR_USER_NAME
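At this point your IAM user should be able to use the cluster. A quick sanity check (assuming your kubeconfig already points at this cluster, e.g. via aws eks update-kubeconfig) is:

# Should list the worker nodes instead of returning an authorization error
kubectl get nodes
# Explicitly ask whether you now have full rights on the cluster
kubectl auth can-i '*' '*' --all-namespaces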