So I've been struggling to expose any deployment in my EKS cluster.
I got down to this:
- The public endpoint of my LoadBalancer Service never responds
- Went to the load balancer section in my AWS console
- The load balancer is not working because my cluster node is not passing the health checks
- SSHed into my cluster node and found that the containers have no ports associated with them.
This makes the cluster node fail the health checks, so no traffic is forwarded that way.
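For reference, this is roughly the shape of Service I'd expect to work (names and labels here are placeholders, not my actual manifest). My understanding is that for a `type: LoadBalancer` Service, the ELB health-checks a NodePort on each node rather than the container ports themselves:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: nginx            # placeholder name
spec:
  type: LoadBalancer
  selector:
    app: nginx           # must match the Deployment's pod labels
  ports:
    - port: 80           # port the load balancer listens on
      targetPort: 80     # containerPort inside the pod
```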
I tried running a simple nginx container manually with docker run directly on the cluster node, bypassing kubectl:
docker run -p 80:80 nginx
and pasting the node's public IP into my browser. No luck.
Then I tried curling the nginx container directly from the cluster node over the SSH session:
curl localhost
And I'm getting this response: "curl: (7) Failed to connect to localhost port 80: Connection refused"
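In case it helps, here are the checks I'd run next on the node (assuming the node still uses the Docker runtime; I gather newer EKS AMIs use containerd, in which case `docker ps` would show no Kubernetes containers at all, which might also explain the "no ports" observation):

```shell
# Did the container actually start, or did it exit immediately
# (e.g. because port 80 on the host was already taken)?
docker ps -a --format '{{.Names}} {{.Status}} {{.Ports}}'

# Is anything listening on :80 on the host?
sudo ss -tlnp | grep ':80'

# If the runtime is containerd rather than Docker, list containers with:
sudo ctr -n k8s.io containers list
```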
- Why are the containers on the cluster node not showing any ports?
- How can I make the cluster node pass the load balancer health checks?
- Could it have something to do with the fact that I created a single-node cluster with eksctl?
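For context, my eksctl setup was roughly equivalent to this config (cluster name, region, and instance type are placeholders, not my exact values):

```yaml
apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig
metadata:
  name: my-cluster       # placeholder
  region: us-east-1      # placeholder
nodeGroups:
  - name: ng-1
    instanceType: t3.medium
    desiredCapacity: 1   # single node
```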
- What other options do I have to easily run a Kubernetes cluster on AWS?