
I am trying to learn how to use Azure Data Factory to copy data (a collection of CSV files in a folder structure) from an Azure File Share to a Cosmos DB instance.

In Azure Data Factory I'm creating a "Copy data" activity and trying to set my file share as the source, using the following host:

mystorageaccount.file.core.windows.net\\mystoragefilesharename

When trying to test the connection, I get the following error:

[{"code":9059,"message":"File path 'E:\\approot\\mscissstorage.file.core.windows.net\\mystoragefilesharename' is not supported. Check the configuration to make sure the path is valid."}]

Should I move the data to another storage type, like a blob, or am I not entering the correct host URL?


3 Answers


You'll need to specify the host as "\\myserver\share" in the JSON file if you create the pipeline with JSON directly, or set the host URL as "\myserver\share" if you're using the UI to set up the pipeline.

Here is more info: https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions
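For anyone configuring this through JSON, a minimal linked service sketch in the spirit of that docs page could look roughly like the one below. The name and credential placeholders are illustrative, not taken from the question, and every backslash in the host is doubled because this is the JSON-escaped form:

{
    "name": "FileShareLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "\\\\myserver\\share",
            "userId": "<domain>\\<user>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}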


I believe that when you created the file linked service, you might have chosen the public IR. If you choose the public IR, a local path (e.g. C:\xxx, D:\xxx) is not allowed, because the machine that runs your job is managed by us and does not contain any customer data. Please use a self-hosted IR to copy your local files.
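If you do switch to a self-hosted IR, the linked service also has to reference it explicitly through connectVia; a rough sketch, where the runtime name is a placeholder:

"connectVia": {
    "referenceName": "<name of your self-hosted integration runtime>",
    "type": "IntegrationRuntimeReference"
}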


Based on the link posted by Nicolas Zhang (https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system#sample-linked-service-and-dataset-definitions) and the examples provided therein, I was able to solve it and successfully create the copy activity. I had two errors (I'm configuring via the Data Factory UI and not the JSON directly):

  1. In the host path, the correct value is: \\mystorageaccount.file.core.windows.net\mystoragefilesharename\myfolderpath
  2. The username and password must be the ones corresponding to the storage account, not those of the actual user account, which I was erroneously using (see the sketch after this list).
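For completeness, here is roughly how those two fixes would map onto the linked service's typeProperties in the JSON form (backslashes doubled because of JSON escaping). Treating the storage account name as the username and the account access key as the password is my reading of point 2, so double-check it against your own account:

"typeProperties": {
    "host": "\\\\mystorageaccount.file.core.windows.net\\mystoragefilesharename\\myfolderpath",
    "userId": "mystorageaccount",
    "password": {
        "type": "SecureString",
        "value": "<storage account access key>"
    }
}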