
Versions:

Terraform==0.12

docker==19.03.8

python==3.8

postgresql==9.6

apache-airflow==1.10.10

Description:

The goal is to create a container for local Apache Airflow development. I'm using a Postgres database as the backend to store Airflow's metadata, since one is required to use the Airflow LocalExecutor. Terraform creates both the Airflow and Postgres containers.

Problem:

The error is raised within the dev_airflow container's entrypoint.sh. Specifically, airflow initdb raises an error related to the sql_alchemy_conn string, which is exported as an environment variable before running airflow initdb. The sql_alchemy_conn's values are valid and point to a live Postgres database.

I've tried these sql_alchemy_conn strings:

sql_alchemy_conn="postgresql://postgres_user:password@postgres:5432/postgres_db"
sql_alchemy_conn="postgresql+psycopg2://postgres_user:password@postgres:5432/postgres_db"

which resulted in the error:

sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not translate host name "postgres" to address: Name or service not known

I've tried these sql_alchemy_conn strings:

sql_alchemy_conn="postgresql://postgres_user:password@localhost:5432/postgres_db"
sql_alchemy_conn="postgresql+psycopg2://postgres_user:password@localhost:5432/postgres_db"

which resulted in the error:

sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection refused
        Is the server running on host "localhost" (127.0.0.1) and accepting
        TCP/IP connections on port 5432?
could not connect to server: Cannot assign requested address
        Is the server running on host "localhost" (::1) and accepting
        TCP/IP connections on port 5432?

Terraform main.tf used to compose docker containers:

provider "docker" {}

resource "docker_image" "postgres" {
  name     = "postgres:9.6"
}

resource "docker_container" "postgres" {
  image = docker_image.postgres.name
  name  = "postgres"

  env = [
    format("POSTGRES_USER=%s", var.postgres_user),
    format("POSTGRES_PASSWORD=%s", var.postgres_password),
    format("POSTGRES_DB=%s", var.postgres_db),
  ]

  ports {
    internal = var.postgres_port
  }
}

resource "docker_image" "local_airflow" {
  name         = "local_airflow:latest"
  keep_locally = false
}

resource "docker_container" "airflow" {
  image = docker_image.local_airflow.name
  name  = "dev_airflow"
  ports {
    internal = 8080
    external = 8080
  }

  env = [
    format("AIRFLOW__CORE__EXECUTOR=%s", lookup(var.executor, var.environment)), 
    format("AIRFLOW__CORE__LOAD_EXAMPLES=%s", var.load_examples),
    format("AIRFLOW__CORE__SQL_ALCHEMY_CONN=%s", var.sql_alchemy_conn)
  ]

  volumes {
      host_path = "/airflow/dags"
      container_path = "/usr/local/airflow/dags"
  }
  volumes {
    host_path = "/airflow/plugins"
    container_path = "/usr/local/airflow/plugins"
  }
  depends_on = [docker_container.postgres]

  command = ["webserver"]

  entrypoint = ["/entrypoint.sh"]

  healthcheck {
    test = ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
    interval = "30s"
    timeout = "30s"
    retries = 3
  }

  restart = "always"
}

Comment: Have you tried using the sql_alchemy_conn value as a base64 encoded representation? – pequetrefe

1 Answer


These resources are probably deployed on the default bridge network, where Docker's embedded DNS does not resolve container names.

Try creating a custom network:

https://registry.terraform.io/providers/kreuzwerker/docker/latest/docs/resources/network

and then attach both containers to it using the networks_advanced block of the docker_container resource (https://registry.terraform.io/providers/kreuzwerker/docker/latest/docs/resources/container). This block takes priority over the deprecated network_alias and network properties.

It's the same thing you would do with the Docker CLI:

docker network create foo
docker run --rm --network=foo bar
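
In Terraform terms, a minimal sketch of that fix might look like the following (the network name airflow_net is illustrative; the image/env/port arguments from your main.tf are elided for brevity and would stay as they are):

```hcl
# Create a user-defined bridge network so containers can reach
# each other by container name via Docker's embedded DNS.
resource "docker_network" "airflow_net" {
  name = "airflow_net"
}

resource "docker_container" "postgres" {
  image = docker_image.postgres.name
  name  = "postgres"

  # Attach the Postgres container to the custom network.
  networks_advanced {
    name = docker_network.airflow_net.name
  }
}

resource "docker_container" "airflow" {
  image = docker_image.local_airflow.name
  name  = "dev_airflow"

  # Attach the Airflow container to the same network; it can then
  # resolve the host name "postgres" used in sql_alchemy_conn.
  networks_advanced {
    name = docker_network.airflow_net.name
  }

  depends_on = [docker_container.postgres]
}
```

With both containers on the same user-defined network, the first pair of connection strings (host "postgres") should resolve; "localhost" will still fail, because inside a container it refers to the container itself.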