
I am pretty new to both geography data and Airflow, so please bear with me and ask for clarification if my question is unclear.

I am trying to run a DAG through Airflow (Google Cloud Composer) to read data from a table in a specific dataset, convert a specific column to a GEOGRAPHY type, and dump the result into another table:


import os
from datetime import datetime, timedelta

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

PROJECT = os.getenv("GCP_PROJECT")
DAG_ID = "full_dump_trees"  # hypothetical name, not shown in the original snippet

default_args = {
    "owner": "Airflow",
    "depends_on_past": False,
    "start_date": datetime(2020, 4, 1),
    "email": ["[email protected]"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 5,
    "retry_delay": timedelta(minutes=1),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}



dag = DAG(DAG_ID, default_args=default_args, schedule_interval='@once', catchup=False)



ingestion_query = f"""
            SELECT 
                id, 
                epsg, 
                SAFE.ST_GEOGFROMTEXT(geometry) as geometry, 
                CURRENT_TIMESTAMP() as ingested_at
            FROM dataset_raw.trees
            WHERE SAFE.ST_GEOGFROMTEXT(geometry) is not NULL
        """
with dag:
    etl_operator = BigQueryOperator(
        sql=ingestion_query,
        destination_dataset_table=f'{PROJECT}.dataset_clean.trees',
        write_disposition="WRITE_TRUNCATE",
        task_id="full_dump_trees",
    )

The ingestion query has been tested and works from the BigQuery console.
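
For reference, the console test can also be reproduced outside Airflow; a minimal sketch assuming the google-cloud-bigquery client library, which, like the console, runs standard SQL by default:

from google.cloud import bigquery

# Like the BigQuery console, the Python client uses standard SQL by default,
# so SAFE.ST_GEOGFROMTEXT resolves and the query returns rows without error.
client = bigquery.Client()
rows = client.query(
    'SELECT SAFE.ST_GEOGFROMTEXT("POINT(0 0)") AS geometry'
).result()
print(list(rows))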

However, when running the DAG, it fails with the following error message:

INFO - Job 150: Subtask full_dump_trees             SELECT 
INFO - Job 150: Subtask full_dump_trees                 id, 
INFO - Job 150: Subtask full_dump_trees                 epsg, 
INFO - Job 150: Subtask full_dump_trees                 SAFE.ST_GEOGFROMTEXT(geometry) as geometry, 
INFO - Job 150: Subtask full_dump_trees                 CURRENT_TIMESTAMP() as ingested_at
INFO - Job 150: Subtask full_dump_trees             FROM geodata_raw.trees
INFO - Job 150: Subtask full_dump_trees             WHERE SAFE.ST_GEOGFROMTEXT(geometry) is not NULL
[...]
ERROR - BigQuery job failed. Final error was:
{'reason': 'invalidQuery', 'location': 'query',
 'message': '5.37 - 5.46: Unrecognized function safe.st_geogfromtext\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]'}.
The job was:
{'kind': 'bigquery#job', 'etag': 'jBrdHGpAuAGJ48Z5A/ALIA==',
 'id': 'strange-terra-273917:EU.job_oDUB9c3kz0NE1JKJ5tCaeHmuGN3d',
 'selfLink': 'https://bigquery.googleapis.com/bigquery/v2/projects/strange-terra-273917/jobs/job_oDUB9c3kz0NE1JKJ5tCaeHmuGN3d?location=EU',
 'user_email': '[email protected]',
 'configuration': {'query': {'query': '\n            SELECT \n                id, \n                epsg, \n                SAFE.ST_GEOGFROMTEXT(geometry) as geometry, \n                CURRENT_TIMESTAMP() as ingested_at\n            FROM geodata_raw.trees\n            WHERE SAFE.ST_GEOGFROMTEXT(geometry) is not NULL\n        ',
                             'destinationTable': {'projectId': 'strange-terra-273917', 'datasetId': 'geodata_clean', 'tableId': 'trees'},
                             'createDisposition': 'CREATE_IF_NEEDED',
                             'writeDisposition': 'WRITE_TRUNCATE',
                             'priority': 'INTERACTIVE',
                             'allowLargeResults': False,
                             'useLegacySql': True},
                   'jobType': 'QUERY'},
 'jobReference': {'projectId': 'strange-terra-273917', 'jobId': 'job_oDUB9c3kz0NE1JKJ5tCaeHmuGN3d', 'location': 'EU'},
 'statistics': {'creationTime': '1586798059340', 'startTime': '1586798059356', 'endTime': '1586798059356'},
 'status': {'errorResult': {'reason': 'invalidQuery', 'location': 'query', 'message': '5.37 - 5.46: Unrecognized function safe.st_geogfromtext\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]'},
            'errors': [{'reason': 'invalidQuery', 'location': 'query', 'message': '5.37 - 5.46: Unrecognized function safe.st_geogfromtext\n[Try using standard SQL (https://cloud.google.com/bigquery/docs/reference/standard-sql/enabling-standard-sql)]'}],
            'state': 'DONE'}}
Traceback (most recent call last):
  File "/usr/local/lib/airflow/airflow/models/taskinstance.py", line 930, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/airflow/airflow/contrib/operators/bigquery_operator.py", line 246, in execute
    encryption_configuration=self.encryption_configuration)
  File "/usr/local/lib/airflow/airflow/contrib/hooks/bigquery_hook.py", line 913, in run_query
    return self.run_with_configuration(configuration)
  File "/usr/local/lib/airflow/airflow/contrib/hooks/bigquery_hook.py", line 1344, in run_with_configuration
    format(job['status']['errorResult'], job))
Exception: BigQuery job failed. Final error was: ...

The error message points to a link that seems to be deprecated, and the documentation it redirects to says that geography functions are part of standard SQL, so I am at a loss as to why this does not work.

Is this a known limitation of Airflow's BigQuery operators?

EDIT: As per the documentation, the function ST_GEOGFROMTEXT is part of what Google calls standard SQL for BigQuery.

The keyword here is "Try using standard SQL" - have you tried? - Mikhail Berlyant
I was going to update my question: the function I want to use is indeed in standard SQL as per the documentation - LoicM
So you must explicitly state this in your code: either add #standardSQL at the beginning of your query (make sure it is on a separate, first row of your SQL script), or set this within the BigQueryOperator - Mikhail Berlyant

1 Answer


You must explicitly state that you are using standard SQL in your code: either add #standardSQL at the beginning of your query (make sure it is on a separate, first row of your SQL script), or set this within the BigQueryOperator via its use_legacy_sql parameter. Both options are sketched below.
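
A sketch of the first option, reusing the ingestion query from the question; the only change is the #standardSQL marker, which must sit on the very first row of the query text:

# Option 1: force standard SQL from inside the script itself.
# The marker has to be the first line of the query string.
ingestion_query = """#standardSQL
SELECT
    id,
    epsg,
    SAFE.ST_GEOGFROMTEXT(geometry) AS geometry,
    CURRENT_TIMESTAMP() AS ingested_at
FROM dataset_raw.trees
WHERE SAFE.ST_GEOGFROMTEXT(geometry) IS NOT NULL
"""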

use_legacy_sql (bool) – Whether to use legacy SQL (true) or standard SQL (false).
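
And a sketch of the second option; the contrib BigQueryOperator defaults to use_legacy_sql=True (which is why the job dump above shows 'useLegacySql': True), so passing False is the only change to the question's task:

# Option 2: tell the operator to submit the job with standard SQL.
with dag:
    etl_operator = BigQueryOperator(
        sql=ingestion_query,
        destination_dataset_table=f'{PROJECT}.dataset_clean.trees',
        write_disposition="WRITE_TRUNCATE",
        use_legacy_sql=False,  # the operator defaults to legacy SQL
        task_id="full_dump_trees",
    )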