0 votes

I am using the puckel Airflow image and upgraded the Airflow version by editing the Dockerfile. When I built the container, I installed apache-airflow-providers-google 2.2.0 as a Python dependency and google as an Airflow extra, using

`docker build --rm --build-arg AIRFLOW_DEPS="google" --build-arg PYTHON_DEPS="httplib2==0.18.1 apache-airflow-providers-google==2.2.0" -t puckel/docker-airflow .`

I also added the Python packages to my requirements.txt file, listed below, but when I open Airflow on localhost it still says "Broken DAG: [/usr/local/airflow/dags/otherfolder/dag.py] No module named 'google'." How can I solve this?
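For what it's worth, this is how I can reproduce the missing module from the host (webserver is the service name from my compose file below):

# Run the failing import inside the webserver container
docker-compose exec webserver python -c "import google"
# ModuleNotFoundError: No module named 'google'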

The following is my Dockerfile, where I changed the version:

# VERSION 1.10.14
# AUTHOR: Matthieu "Puckel_" Roisil
# DESCRIPTION: Basic Airflow container
# BUILD: docker build --rm -t puckel/docker-airflow .
# SOURCE: https://github.com/puckel/docker-airflow

FROM python:3.7-slim-buster
LABEL maintainer="Puckel_"

# Never prompt the user for choices on installation/configuration of packages
ENV DEBIAN_FRONTEND noninteractive
ENV TERM linux

# Airflow
ARG AIRFLOW_VERSION=1.10.14
ARG AIRFLOW_USER_HOME=/usr/local/airflow
ARG AIRFLOW_DEPS=""
ARG PYTHON_DEPS=""
ENV AIRFLOW_HOME=${AIRFLOW_USER_HOME}

# Define en_US.
ENV LANGUAGE en_US.UTF-8
ENV LANG en_US.UTF-8
ENV LC_ALL en_US.UTF-8
ENV LC_CTYPE en_US.UTF-8
ENV LC_MESSAGES en_US.UTF-8

# Disable noisy "Handling signal" log messages:
# ENV GUNICORN_CMD_ARGS --log-level WARNING

RUN set -ex \
    && buildDeps=' \
        freetds-dev \
        libkrb5-dev \
        libsasl2-dev \
        libssl-dev \
        libffi-dev \
        libpq-dev \
        git \
    ' \
    && apt-get update -yqq \
    && apt-get upgrade -yqq \
    && apt-get install -yqq --no-install-recommends \
        $buildDeps \
        freetds-bin \
        build-essential \
        default-libmysqlclient-dev \
        apt-utils \
        curl \
        rsync \
        netcat \
        locales \
    && sed -i 's/^# en_US.UTF-8 UTF-8$/en_US.UTF-8 UTF-8/g' /etc/locale.gen \
    && locale-gen \
    && update-locale LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 \
    && useradd -ms /bin/bash -d ${AIRFLOW_USER_HOME} airflow \
    && pip install -U pip setuptools wheel \
    && pip install pytz \
    && pip install pyOpenSSL \
    && pip install ndg-httpsclient \
    && pip install pyasn1 \
    && pip install apache-airflow[crypto,celery,postgres,hive,jdbc,mysql,ssh${AIRFLOW_DEPS:+,}${AIRFLOW_DEPS}]==${AIRFLOW_VERSION} \
    && pip install 'redis==3.2' \
    && if [ -n "${PYTHON_DEPS}" ]; then pip install ${PYTHON_DEPS}; fi \
    && apt-get purge --auto-remove -yqq $buildDeps \
    && apt-get autoremove -yqq --purge \
    && apt-get clean \
    && rm -rf \
        /var/lib/apt/lists/* \
        /tmp/* \
        /var/tmp/* \
        /usr/share/man \
        /usr/share/doc \
        /usr/share/doc-base

COPY script/entrypoint.sh /entrypoint.sh
COPY config/airflow.cfg ${AIRFLOW_USER_HOME}/airflow.cfg
COPY requirements.txt /requirements.txt

RUN chown -R airflow: ${AIRFLOW_USER_HOME}

EXPOSE 8080 5555 8793

USER airflow
WORKDIR ${AIRFLOW_USER_HOME}
ENTRYPOINT ["/entrypoint.sh"]
CMD ["webserver"]

The following is my docker compose file:

version: '3.7'
services:
    postgres:
        image: postgres:9.6
        environment:
            - POSTGRES_USER=airflow
            - POSTGRES_PASSWORD=airflow
            - POSTGRES_DB=airflow
        logging:
            options:
                max-size: 10m
                max-file: "3"

    webserver:
        image: puckel/docker-airflow:1.10.9
        restart: always
        depends_on:
            - postgres
        environment:
            - LOAD_EX=n
            - EXECUTOR=Local
        logging:
            options:
                max-size: 10m
                max-file: "3"
        volumes:
            - ./dags:/usr/local/airflow/dags
            - ./requirements.txt:/requirements.txt
            # - ./plugins:/usr/local/airflow/plugins
        ports:
            - "8080:8080"
        command: webserver
        healthcheck:
            test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
            interval: 30s
            timeout: 30s
            retries: 3
    python:
      image: python:rc-buster
  

The following is my requirements.txt file:

httplib2==0.18.1
google==3.0.0
pandas==1.2.4
apache-airflow-providers-google==2.2.0

The following is the relevant code from the DAG:

import datetime
import logging
import dateutil
import pandas as pd
import numpy as np
from airflow import models, utils
from airflow.operators.python_operator import PythonOperator
from airflow.contrib.operators.bigquery_to_gcs import BigQueryToCloudStorageOperator
from airflow.operators.python_operator import BranchPythonOperator
from airflow.contrib.hooks.bigquery_hook import BigQueryHook
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator
from airflow.operators.dummy_operator import DummyOperator

from airflow.hooks.http_hook import HttpHook
from pandas.io.json import json_normalize
from pandas.errors import EmptyDataError
from pathlib import Path
from lz_warnings import *
from utils import task_fail_slack_alert
import warnings
Please provide a minimal reproducible example of your code and the exact error message. stackoverflow.com/help/how-to-ask - SergiyKolesnikov
I updated it, thank you! Also, I included the full code because I am not sure where I might be missing something. - Erika_Marsha
But where is your DAG code? The error is in the DAG. - SergiyKolesnikov
Updated again. The Airflow documentation shows that this operator is part of the gcp extra package (pip install 'apache-airflow[gcp]'), so I've tried adding it as a dependency at build time. I've also tried pip install, but I'm not exactly sure where in the directory to do it - from the airflow bash? - Erika_Marsha

1 Answer

0 votes

To use GCP-related features you have to install Airflow's google extras. For this, set the following in your Dockerfile:

ARG AIRFLOW_DEPS="google"

Then rebuild the Docker image and restart docker-compose. Also double-check which image the compose file actually runs: your docker build command tags the image as puckel/docker-airflow (i.e. :latest), while your docker-compose.yml references puckel/docker-airflow:1.10.9, so the webserver may still be running the old image without the google packages.
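A minimal sequence that should apply this, assuming you keep the :1.10.9 tag in docker-compose.yml (any tag works as long as the build tag and the compose image match):

# Rebuild with the google extra, tagged to match docker-compose.yml
docker build --rm --build-arg AIRFLOW_DEPS="google" -t puckel/docker-airflow:1.10.9 .

# Recreate the containers so they pick up the freshly built image
docker-compose down
docker-compose up -d

# Sanity check: this is the exact import the broken DAG trips over
docker-compose exec webserver python -c "from airflow.contrib.operators.bigquery_to_gcs import BigQueryToCloudStorageOperator"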