
I have created a pipeline in Azure Data Factory (V1). It is a copy pipeline that has an AzureSqlTable dataset as input and an AzureBlob dataset as output. The AzureSqlTable dataset that I use as input is created as the output of another pipeline. In this pipeline I launch a procedure that copies one table entry to a blob CSV file. I get the following error when launching the pipeline:

Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.

How can I solve this?
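For reference, here is a trimmed-down sketch of the copy activity JSON involved (the dataset and activity names are placeholders, and the copyBehavior value is just an example, since per the error any copyBehavior on the sink is rejected when the source is tabular):

    {
        "name": "CopySqlToBlob",
        "type": "Copy",
        "inputs": [ { "name": "AzureSqlTableDataset" } ],
        "outputs": [ { "name": "AzureBlobDataset" } ],
        "typeProperties": {
            "source": { "type": "SqlSource" },
            "sink": {
                "type": "BlobSink",
                "copyBehavior": "PreserveHierarchy"
            }
        }
    }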

Try creating a custom activity instead. docs.microsoft.com/en-us/azure/data-factory/… – Alberto Morillo
@AlbertoMorillo the problem is that with our subscription we have no rights to create a batch service, so a custom activity is impossible. – KateHamster

2 Answers

1 vote

According to the error message, this action is not supported by Azure Data Factory. However, using an Azure SQL table as input and an Azure Blob dataset as output should be supported.

I also ran a demo test in the Azure portal. You can follow the detailed steps below to do the same.

1. Click Copy data in the Azure portal.


2. Set the copy properties.


3. Select the source data store.


4. Select the destination data store.


5. Complete the deployment.


6. Check the result in Azure SQL and Blob storage.

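The deployment from the wizard produces a pipeline along these lines. This is only a sketch with placeholder names and schedule values, but it shows the important point: the BlobSink carries no copyBehavior property when the source is a SQL table.

    {
        "name": "CopySqlToBlobPipeline",
        "properties": {
            "activities": [
                {
                    "name": "SqlToBlobCopy",
                    "type": "Copy",
                    "inputs": [ { "name": "AzureSqlTableDataset" } ],
                    "outputs": [ { "name": "AzureBlobDataset" } ],
                    "typeProperties": {
                        "source": { "type": "SqlSource" },
                        "sink": { "type": "BlobSink" }
                    },
                    "policy": { "concurrency": 1, "retry": 3 },
                    "scheduler": { "frequency": "Day", "interval": 1 }
                }
            ],
            "start": "2018-01-01T00:00:00Z",
            "end": "2018-01-02T00:00:00Z"
        }
    }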

Update:

If we want to use an existing dataset, we can choose [From Existing Connections] in the wizard.


Update 2:

In the Data Factory (V1) copy activity settings, only an existing Azure Blob storage or Azure Data Lake Store dataset is supported. For more detailed information, please refer to this link.


If using Data Factory (V2) is acceptable, we could use an existing Azure SQL dataset.

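For comparison, a V2 copy activity references existing datasets by name, so the SQL dataset produced by the other pipeline could be reused directly. A minimal sketch, again with placeholder names:

    {
        "name": "CopySqlToBlob",
        "type": "Copy",
        "inputs": [ { "referenceName": "AzureSqlTableDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "AzureBlobDataset", "type": "DatasetReference" } ],
        "typeProperties": {
            "source": { "type": "SqlSource" },
            "sink": { "type": "BlobSink" }
        }
    }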

0 votes

So, actually, if we don't use this awful "Copy data (PREVIEW)" action and instead add a copy activity to an existing pipeline rather than creating a new one, everything works. The solution is to add a copy activity manually into the existing pipeline.
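In other words, you edit the existing pipeline's JSON and append the copy activity to its activities array by hand. A rough V1 sketch with placeholder names for the datasets and the stored procedure (policy, scheduler, start and end settings omitted); note there is no copyBehavior on the sink:

    {
        "name": "ExistingPipeline",
        "properties": {
            "activities": [
                {
                    "name": "RunProcedure",
                    "type": "SqlServerStoredProcedure",
                    "outputs": [ { "name": "AzureSqlTableDataset" } ],
                    "typeProperties": { "storedProcedureName": "usp_CopyEntryToTable" }
                },
                {
                    "name": "CopyToBlobCsv",
                    "type": "Copy",
                    "inputs": [ { "name": "AzureSqlTableDataset" } ],
                    "outputs": [ { "name": "AzureBlobDataset" } ],
                    "typeProperties": {
                        "source": { "type": "SqlSource" },
                        "sink": { "type": "BlobSink" }
                    }
                }
            ]
        }
    }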