0
votes

I have created a kops cluster and am getting the below error when logging in to the cluster.

Error log :

INFO! KUBECONFIG env var set to /home/user/scripts/kube/kubeconfig.yaml
INFO! Testing kubectl connection....
error: You must be logged in to the server (Unauthorized)
ERROR! Test Failed, AWS role might not be recongized by cluster

I am using a script for IAM authentication and logged in to the server with the proper role before connecting. I am able to log in to another cluster in the same environment. I have also tried with a different Kubernetes version and a different configuration.

The KUBECONFIG doesn't have any problem; it has the same entries and token details as the other cluster. I can see the token with the 'aws-iam-authenticator' command.
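For reference, this is roughly how I check the token; the cluster name below is a placeholder, not the real cluster ID:

    # prints the ExecCredential JSON with the pre-signed STS token kubectl sends to the API server
    aws-iam-authenticator token -i my-cluster-name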

I went through most of the articles on this and they didn't help.

Did you read this answer? – mario

2 Answers

2
votes

It seems to be an AWS authorization issue. At cluster creation, only the IAM user who created the cluster has admin rights on it, so you may need to add your own IAM user first.

1- Start by verifying the IAM user identity used implicitly in all commands: aws sts get-caller-identity

If your aws-cli is configured correctly, you will get output similar to this:

{
    "UserId": "ABCDEFGHIJK",
    "Account": "12344455555",
    "Arn": "arn:aws:iam::1234577777:user/Toto"
}

We will refer to the value in Account as YOUR_AWS_ACCOUNT_ID in step 3 (in this example, YOUR_AWS_ACCOUNT_ID="12344455555").

2- Once you have this identity, you have to add it to the AWS role binding to get EKS permissions.

3- You will need to edit the ConfigMap used by kubectl to add your user: kubectl edit -n kube-system configmap/aws-auth. In the editor that opens, pick a username you want to use to refer to yourself within the cluster, YOUR_USER_NAME (for simplicity you may reuse your AWS user name, e.g. Toto from step 1); you will need it in step 4. Also use the AWS account id you found in your identity info at step 1, YOUR_AWS_ACCOUNT_ID (don't forget to keep the quotes ""), in the mapUsers and mapAccounts sections as follows:

  mapUsers: |
    - userarn: arn:aws:iam::YOUR_AWS_ACCOUNT_ID:user/Toto
      username: YOUR_USER_NAME
      groups:
        - system:masters
  mapAccounts: |
    - "YOUR_AWS_ACCOUNT_ID"

4- Finally, you need to create a role binding on the Kubernetes cluster for the user specified in the ConfigMap:

kubectl create clusterrolebinding cluster-admin-binding \
    --clusterrole cluster-admin \
    --user YOUR_USER_NAME
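
To sanity-check the result you can, for example, inspect the binding and use impersonation (YOUR_USER_NAME is the name chosen in step 3):

    # confirm the binding exists
    kubectl get clusterrolebinding cluster-admin-binding -o yaml

    # check what the mapped user may do, via impersonation
    kubectl auth can-i '*' '*' --as YOUR_USER_NAME

    # finally, retry the original connection test
    kubectl get nodes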
1
votes

With kops v1.19 you need to add --admin or --user when updating your Kubernetes cluster, and each time you log out of your server you have to export the cluster name and the state-store bucket again and then update the cluster. This will work; see the sketch below.
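
For example, a minimal sequence after logging back in might look like this (the cluster name and state-store bucket below are placeholders for your own values):

    export KOPS_CLUSTER_NAME=mycluster.example.com
    export KOPS_STATE_STORE=s3://my-kops-state-store

    # re-export an admin kubeconfig (credentials are no longer written by default since kops 1.19)
    kops export kubecfg --admin

    # or, when applying cluster changes:
    kops update cluster --yes --admin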