I have installed Airflow and written a DAG to integrate MySQL data with BigQuery.
When I ran the Python script, I got the following error:
ImportError: cannot import name GbqConnector
I followed the instructions to downgrade pandas to an older version, but when I did so I got a different error:
ImportError: cannot import name _test_google_api_imports
Edit: the advice from x97Core worked.
I now have a different problem. I am getting the following warnings:
/usr/local/lib/python2.7/dist-packages/airflow/models.py:1927: PendingDeprecationWarning: Invalid arguments were passed to MySqlToGoogleCloudStorageOperator. Support for passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'google_cloud_storage_connn_id': 'podioGCPConnection'} category=PendingDeprecationWarning
/usr/local/lib/python2.7/dist-packages/airflow/models.py:1927: PendingDeprecationWarning: Invalid arguments were passed to GoogleCloudStorageToBigQueryOperator. Support for passing such arguments will be dropped in Airflow 2.0. Invalid arguments were:
*args: ()
**kwargs: {'project_id': 'podio-data'} category=PendingDeprecationWarning
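
For context, here is a stripped-down sketch of how my DAG wires the two operators. The connection id 'podioGCPConnection' and the project 'podio-data' are the real ones from the warnings above; the query, bucket, file, and table names are placeholders rather than my exact values, and the two keyword arguments the warnings flag are marked with comments.

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

dag = DAG(
    dag_id='mysql_to_bigquery',            # placeholder DAG id
    start_date=datetime(2017, 1, 1),
    schedule_interval='@daily',
)

# Dump a MySQL table to GCS as newline-delimited JSON, plus a schema file.
extract = MySqlToGoogleCloudStorageOperator(
    task_id='extract_mysql',
    mysql_conn_id='mysql_default',                        # placeholder connection id
    google_cloud_storage_connn_id='podioGCPConnection',   # argument flagged by the first warning
    sql='SELECT * FROM my_table',                         # placeholder query
    bucket='my-bucket',                                   # placeholder bucket
    filename='podio/my_table_{}.json',
    schema_filename='podio/schemas/my_table.json',
    dag=dag,
)

# Load the exported files from GCS into BigQuery.
load = GoogleCloudStorageToBigQueryOperator(
    task_id='load_bigquery',
    google_cloud_storage_conn_id='podioGCPConnection',
    bigquery_conn_id='podioGCPConnection',
    bucket='my-bucket',
    source_objects=['podio/my_table_*.json'],
    schema_object='podio/schemas/my_table.json',
    destination_project_dataset_table='podio-data.my_dataset.my_table',
    project_id='podio-data',                              # argument flagged by the second warning
    source_format='NEWLINE_DELIMITED_JSON',
    write_disposition='WRITE_TRUNCATE',
    dag=dag,
)

extract >> load
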
According to this link, the issue is with Airflow's compatibility between Python 2 and Python 3: Airflow mysql to gcp Dag error. I have tried running the code under both, but the same warnings still come up.
Does anyone know if there is a solution for this?