6 votes

I have multiple instances of MongoDB deployed inside my Kubernetes cluster through Helm charts. They are exposed as NodePort services. How do I connect to those MongoDB instances from outside the cluster with UI tools like MongoDB Compass and Robo 3T (RoboMongo)? Any help is appreciated.


3 Answers

10 votes

You can use kubectl port-forward to connect to MongoDB from outside the cluster.

Run kubectl port-forward << mongodb pod name >> --namespace << mongodb namespace >> 27018:27017 (MongoDB listens on 27017 inside the pod; 27018 is the local port you will connect to).
Now point your UI tool at localhost:27018 and kubectl will forward all connections to the pod inside the cluster.

Starting with Kubernetes 1.10 you can also use this syntax to connect to a service (you don't have to find a pod name first):
kubectl port-forward svc/<< mongodb service name >> 27018:27017 --namespace << mongodb namespace >>
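
For example, assuming the Helm release created a service named my-release-mongodb in the mongodb namespace (both names are made up here; check kubectl get svc --namespace << mongodb namespace >> for yours), the whole flow is:

kubectl port-forward svc/my-release-mongodb 27018:27017 --namespace mongodb

Then in MongoDB Compass or Robo 3T connect to mongodb://localhost:27018, plus whatever credentials your chart configured.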

1 vote

If it is not your production database, you can expose it through a NodePort service:

# find mongo pod name
kubectl get pods
kubectl expose pod <<pod name>> --type=NodePort
# find new mongo service
kubectl get services

The last command will output something like:

mongodb-0   10.0.0.45    <nodes>       27017:32151/TCP   30s

Now you can access your mongo instance with mongo <<node-ip>>:32151
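
If you prefer a declarative alternative to kubectl expose, you can write the NodePort Service yourself. The sketch below makes two assumptions you should adjust to your deployment: that your pod carries the label app: mongodb and that you want to pin the node port to 32151.

apiVersion: v1
kind: Service
metadata:
    name: mongodb-nodeport
spec:
    type: NodePort
    selector:
        app: mongodb            # must match your pod's labels
    ports:
        - port: 27017           # port inside the cluster
          targetPort: 27017     # container port mongod listens on
          nodePort: 32151       # must fall in the cluster's NodePort range (30000-32767 by default)

Apply it with kubectl apply -f mongodb-nodeport.yaml, find a node's external IP with kubectl get nodes -o wide, and connect to <<node-ip>>:32151.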

0 votes

If the above doesn't resolve it, expose your MongoDB workload as a load balancer and use the external IP provided by the service. Copy the load balancer IP and use it in Robo 3T. If it requires authentication, check my YAML file below:

apiVersion: apps/v1
kind: Deployment
metadata:
    name: mongodb
    labels:
        app: mongodb
spec:
    replicas: 1
    selector:
        matchLabels:
            app: mongodb
    template:
        metadata:
            labels:
                app: mongodb
        spec:
            containers:
                - name: mongodb
                  image: mongo
                  volumeMounts:
                      - name: data
                        mountPath: "/data/db"
                        subPath: "mongodb_data"
                  ports:
                      - containerPort: 27017
                        protocol: TCP
                  env:
                      # root credentials that the official mongo image creates on first init (xxxx are placeholders)
                      - name: MONGO_INITDB_ROOT_USERNAME
                        value: xxxx
                      - name: MONGO_INITDB_ROOT_PASSWORD
                        value: xxxx
            imagePullSecrets:
                - name: xxxx
            volumes:
                - name: data
                  persistentVolumeClaim:
                      claimName: xxx

Set the same credentials in the Authentication tab in Robo 3T.

NOTE: I haven't included the Service section in the YAML since I exposed the Deployment as a load balancer directly in the GCP console; a roughly equivalent Service manifest is sketched below.
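
For reference, a Service of type LoadBalancer that does the same thing declaratively might look like this sketch (the name mongodb-lb is made up; the selector assumes the app: mongodb label from the Deployment above):

apiVersion: v1
kind: Service
metadata:
    name: mongodb-lb
spec:
    type: LoadBalancer
    selector:
        app: mongodb
    ports:
        - port: 27017
          targetPort: 27017

Once kubectl get service mongodb-lb shows an EXTERNAL-IP, point Robo 3T or Compass at <<external-ip>>:27017 with the credentials from the Deployment.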