57 votes

Here's the result when I type docker ps: (screenshot of docker ps output)

I have three Docker containers: webapps, redis and rabbitmq. I want to link the webapps container to the redis and rabbitmq containers. In the non-Docker setup, my webapps app can send messages to rabbitmq and read/write redis.

I tried using command like this

docker run --name rabbitmq -p 8080:80 --link webapps:nimmis/apache-php7 -d rabbitmq

but it does not work.

Here is my config.php on webapps where I am trying to send messages via rabbitmq:

define('HOST', 'localhost');
define('PORT', 5672);

I tried changing localhost to the hostname:

define('HOST', 'rabbitmq');
define('PORT', 5672);

The error message says connection refused.

It seems that my three containers need to be configured on the same network.

I am also kind of new to Docker, but I am almost certain that using -p along with --link like that is useless. -p 8080:80 tells the Docker proxy to expose port 80 of your container as port 8080 on your host system. So if you also pass -p 6379:6379 for your redis container, your PHP app can use Redis just by connecting to localhost:6379. - vdegenne

6 Answers

128 votes

Linking is a legacy feature. Please use "user defined networks":

sudo docker network create mynetwork

Then rerun your containers using this network:

sudo docker run --name rabbitmq -p 8080:80 -d --network mynetwork rabbitmq

Do the same for other containers that you want connected with each other.

Using "user defined networks", you have an "internal name resolution" at your disposal (somewhat like domain name resolution when visiting websites). You can use the names of the container that you want to refer to, in order to resolve the IP addresses of containers, as long as they are running on the same "user defined network". With this, you can resolve the IP address of the rabbitmq container with its name, within other containers, on the same network.

All containers on the same user-defined network have network connectivity to each other. There is no need for legacy linking.
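
For the three containers from the question, the whole sequence could look roughly like this (a sketch; the image and container names are taken from the question, adjust them to your setup):

sudo docker network create mynetwork
sudo docker run -d --name redis --network mynetwork redis
sudo docker run -d --name rabbitmq --network mynetwork rabbitmq
sudo docker run -d --name webapps -p 8080:80 --network mynetwork nimmis/apache-php7

Inside webapps, config.php can then keep define('HOST', 'rabbitmq') and define('PORT', 5672), because the name rabbitmq now resolves to the rabbitmq container.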

23 votes

For inter-container dependencies and links, you'll want to use docker-compose where you can define the links between containers.

In the root directory where you keep your Docker files, create a new file called docker-compose.yml. In it you define your containers as services that rely on each other, like this:

version: '2'
services:
  webapps:
    build: .
    links:
      - "rabbitmq:rabmq"
      - "redis"

  rabbitmq:
    image: rabbitmq

  redis:
    image: redis

Here in the definition of the webapps service, you can see that it links to the other two services, rabbitmq and redis. This means that when the webapps container is started, entries are added to its hosts file so that the name redis resolves to the IP address of the actual redis container.

You can change the name by which a linked container is addressed using the service:alias notation, as I did by giving rabbitmq the alias rabmq inside the webapps container.

Now, to build and start your containers using docker-compose, just type:

docker-compose up -d

So connecting to another container is as simple as using this alias as the name of the host.

Since you are using docker-compose in this case, it automatically creates a Docker network to connect all the containers, so you shouldn't have to worry about that. For more information, have a look at the docs: https://docs.docker.com/compose/networking/#/specifying-custom-networks
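
If you want to sanity-check the name resolution after the containers are up, you could run something like this from the host (a rough check; it assumes the php CLI is available in the webapps image):

docker-compose exec webapps php -r "var_dump(gethostbyname('rabmq'));"
docker-compose exec webapps php -r "var_dump(gethostbyname('redis'));"

If the aliases resolve, each call prints the container's internal IP address instead of just echoing the name back.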

9 votes

You need to link rabbitmq and redis to your webapps container, not the other way around.


    #run redis container
    docker run --name some-redis -d redis

    #run rabbitmq container
    docker run -d --hostname my-rabbit --name some-rabbit rabbitmq

    #run webapps container
    docker run --name webapps -p 8080:80 --link some-redis:redis --link some-rabbit:rabbitmq nimmis/apache-php7

First run the redis and rabbitmq containers. Then run the webapps container with links to the other two.

Now, configuring the Redis host in webapps is easy: you can simply use the environment variable REDIS_PORT_6379_TCP_ADDR. Once a container is linked, its connection details are exposed as environment variables in the linking container, and the redis link provides that variable.
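
For example, you can list the variables the links injected into the webapps container (container name taken from the commands above; the exact addresses will differ on your machine):

    docker exec webapps env | grep -E 'REDIS|RABBITMQ'
    # e.g. REDIS_PORT_6379_TCP_ADDR=172.17.0.2
    #      REDIS_PORT_6379_TCP_PORT=6379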

Regarding the rabbitmq host, you can get its IP once the rabbit container is up with:


    RABBITMQ_IP=$(docker inspect --format '{{ .NetworkSettings.IPAddress }}' some-rabbit)
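
You could then hand that address to the webapps container when you start it, for instance through an environment variable (RABBITMQ_HOST below is just an illustrative name, not something the images define):

    docker run --name webapps -p 8080:80 --link some-redis:redis -e RABBITMQ_HOST=$RABBITMQ_IP -d nimmis/apache-php7

and read it in config.php with getenv('RABBITMQ_HOST') instead of hard-coding the host.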

4 votes

In my experience, working with declarative files such as docker-compose.yml is fine, but you can also simply link the already-running rabbitmq and redis containers when starting the webapps container:

docker run -d -P --link rabbitmq --link redis nimmis/apache-php7

2 votes

You can define your services to use a user-defined network in your docker-compose.yml

version: "3"

services:

  webapps:
    image: nimmis/apache-php7
    ports:
      - "8080:80"
    networks:
      - my-network

  rabbitmq:
    image: rabbitmq
    networks:
      - my-network

  redis:
    image: redis
    networks:
      - my-network

networks:
  my-network:
    driver: overlay

Then do:

docker swarm init
docker stack deploy -c docker-compose.yml my-stack
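
Once the stack is deployed, you can check that the three services came up (stack name my-stack from the command above):

docker stack services my-stack

Inside the stack, the webapps service can reach the others by their service names rabbitmq and redis over my-network.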

Check out the full example at https://docs.docker.com/get-started/part3/

0 votes

You could also use the internal IP address of the rabbitmq container directly.

Start rabbitmq and get its internal IP address:

docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' rabbitmq > .rabbitmq.ip 

Now you can put that internal IP address for rabbitmq into your Apache configuration when starting the webapps container, or simply add an entry to the Apache container's /etc/hosts like:

# the dynamic internal IP of rabbitmq is known once rabbitmq starts:
172.30.20.10 rabbitmq.redis.local
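
One way to wire that in without editing files by hand is Docker's --add-host flag, which writes such an entry into the container's /etc/hosts when it starts (a sketch, reusing the .rabbitmq.ip file and the hostname from the example above):

RABBITMQ_IP=$(cat .rabbitmq.ip)
docker run --name webapps -p 8080:80 --add-host rabbitmq.redis.local:$RABBITMQ_IP -d nimmis/apache-php7

config.php would then point HOST at rabbitmq.redis.local.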