
I'm trying to upload files to Azure Data Lake from Google Cloud Storage using Python, following this guide: https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-data-operations-python#create-filesystem-client. The issue is that the files live in Google Cloud Storage, not on my local machine, so instead of a local file path I tried passing the file's public GCS URL, which looks like "https://storage.googleapis.com//<****filename******>.csv". But I get an error saying it couldn't open the file.


Has anyone done this before? Is this the right way to do it, or is there another way?

Regards


1 Answer


I assume the API you are using requires `lpath` to refer to a local path, not a path in another cloud. So I see two options:

  1. Download the file from Google Cloud Storage to a local location, then upload it from there using the ADLS API.
  2. Alternatively, use something like Azure Data Factory to orchestrate the data movement without making a local copy.