5
votes

I have a GKE cluster with one node pool attached.

I want to make some changes to the node pool, though, like adding tags, etc.

So I created a new node pool with my new config and attached it to the cluster, so now the cluster has two node pools.
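For context, creating the new pool looked roughly like this (the pool, cluster, zone, and tag names are placeholders):

gcloud container node-pools create new-pool --cluster my-cluster --zone us-central1-a --tags my-new-tag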

At this point I want to move the pods to the new node pool and destroy the old one.

How is this process done? Am I doing this right?


2 Answers

7
votes

There are multiple ways to move your pods to the new node pool.

One way is to steer your pods to the new node pool using a label selector in your pod spec, as described in the "More fun with node pools" section of the Google blog post that announced node pools (with the caveat that you need to forcibly terminate the existing pods for them to be rescheduled). This leaves all nodes in your cluster functional, and you can easily shift the pods back and forth between pools using the labels on the node pools (GKE automatically adds the node pool name as a label to make this easier).
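For example, assuming a Deployment named my-app and a new pool named new-pool (both placeholder names), the selector can be added with a patch; since this changes the pod template, the Deployment rolls out replacement pods, which land on the new pool:

kubectl patch deployment my-app -p '{"spec":{"template":{"spec":{"nodeSelector":{"cloud.google.com/gke-nodepool":"new-pool"}}}}}'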

Another way is to follow the tutorial for Migrating workloads to different machine types, which describes how to cordon / drain nodes to shift workloads to the new node pool.
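A minimal sketch of that approach, assuming the old pool is named old-pool (a placeholder) and using the same gke-nodepool label to select its nodes:

# Stop new pods from being scheduled onto the old pool
for node in $(kubectl get nodes -l cloud.google.com/gke-nodepool=old-pool -o name); do
  kubectl cordon "$node"
done

# Evict the existing pods; they get rescheduled onto the new pool
for node in $(kubectl get nodes -l cloud.google.com/gke-nodepool=old-pool -o name); do
  kubectl drain "$node" --ignore-daemonsets
done

Add --delete-emptydir-data (or --delete-local-data on older kubectl versions) if your pods use emptyDir volumes.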

Finally, you can just use GKE to delete your old node pool. GKE will automatically drain nodes prior to deleting them, which will cause your workload to shift to the new pool without you needing to run any extra commands yourself.
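That is a single gcloud command; the pool, cluster, and zone names here are placeholders:

gcloud container node-pools delete old-pool --cluster my-cluster --zone us-central1-a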

0
votes

You can use:

kubectl drain <node_name>

to evict all pods from a specific node so they are rescheduled onto other nodes.