18
votes

How to SSH into a Kubernetes Node or Server hosted on AWS? I have set up a Kubernetes server and node on AWS, and I can see both from my local laptop with the kubectl get node command.

I need to create a persistent volume for my node, but I'm unable to SSH into it.

Is there any specific way to ssh into the node or server?

4
How do you normally SSH into your AWS VM? How did you create the Kubernetes node in the first place? - iamnat
This is not related to Google Kubernetes Engine; it is related to AWS. I have removed the GKE tag from this question. - Taher
ssh core@NODE_IP_ADDRESS -i ~/.ssh/CREDS_FILE_FOR_ACCOUNT - Persistent Plants

4 Answers

7
votes

Try this: ssh -i <path to the private key file> admin@<IP of the AWS Kubernetes instance>

The .pem (private key) file should be at $HOME/.ssh/kube_rsa
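
As a concrete sketch, assuming that key path and the admin login user (the actual user depends on the node's AMI, e.g. admin for Debian-based images, ec2-user for Amazon Linux, ubuntu for Ubuntu):

# Restrict the key's permissions, otherwise ssh refuses to use it
chmod 400 $HOME/.ssh/kube_rsa

# Connect as the AMI's default user (admin is typical for Debian-based images)
ssh -i $HOME/.ssh/kube_rsa admin@<node-public-ip>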

1
votes

Use kubectl ssh node NODE_NAME

This kubectl plugin comes from https://github.com/luksa/kubectl-plugins, and I have verified that it works. It behaves similarly to the oc command in OpenShift.
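
A rough install sketch, assuming the repo's kubectl-ssh script (kubectl picks up any executable named kubectl-<name> found on your PATH; the install directory below is only an example):

git clone https://github.com/luksa/kubectl-plugins
chmod +x kubectl-plugins/kubectl-ssh
# Any directory on PATH works; /usr/local/bin is just an example
sudo cp kubectl-plugins/kubectl-ssh /usr/local/bin/

# Then open a shell on a node by name (names come from: kubectl get nodes)
kubectl ssh node NODE_NAME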

0
votes

Kubernetes nodes can be accessed the same way you SSH into any other Linux machine. Just SSH to the external IP of the node and you can log in that way.
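
For example (the ubuntu user and key path are assumptions; use whatever default user and key pair your node's AMI was launched with):

# The EXTERNAL-IP column shows each node's public address
kubectl get nodes -o wide

# SSH to that address with the key pair attached to the EC2 instance
ssh -i ~/.ssh/<key-file> ubuntu@<external-ip>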

0
votes

If the worker nodes are in a private subnet, you can use a bastion host with SSH agent forwarding, as described in https://aws.amazon.com/blogs/security/securely-connect-to-linux-instances-running-in-a-private-amazon-vpc/
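
A minimal sketch of that pattern, assuming an ec2-user bastion and an admin worker user (both depend on the AMIs in use):

# Load the private key into your local ssh-agent; it is never copied to the bastion
ssh-add ~/.ssh/kube_rsa

# -A forwards the agent through the bastion
ssh -A ec2-user@<bastion-public-ip>

# From the bastion, hop to the worker over its private IP
ssh admin@<worker-private-ip>

With a recent OpenSSH, the same hop can be done in one command from your laptop: ssh -J ec2-user@<bastion-public-ip> admin@<worker-private-ip>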