
I am trying to load multiple files from Azure Blob to Azure SQL DW by using Azure Data Factory. Below is my code, and I am facing the highlighted error. Could anyone suggest a fix? I am pasting my ADF dataset JSON here.

I am getting the error highlighted below at the Copy activity stage.

{
    "name": "DelimitedText11",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorage2",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "FileName": {
                "type": "string"
            }
        },
        "annotations": [],
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "fileName": {
                    "value": "@dataset().FileName",
                    "type": "Expression"
                },
                "container": "input"
            },
            "columnDelimiter": ",",
            "escapeChar": "",
            "firstRowAsHeader": true,
            "quoteChar": ""
        },
        "schema": []
    },
    "type": "Microsoft.DataFactory/factories/datasets"
}

Error:

{
    "errorCode": "2200",
    "message": "ErrorCode=UserErrorMissingPropertyInPayload,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Required property 'fileName' is missing in payload.,Source=Microsoft.DataTransfer.ClientLibrary,'",
    "failureType": "UserError",
    "target": "Copy data1",
    "details": []
}
Hi Imran, could you post some more information? Data Factory is primarily a visual environment, so some screenshots may be useful. A best guess is that the fileName for the dataset is not being set properly prior to this activity. – Joel Cochran
Hi Joel, I wish to add a screenshot, but the community is not allowing me to paste screenshots of my whole code. If you know another way, please tell me; I am happy to share screenshots. – Imran
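
For context on Joel's guess: the dataset above declares a FileName parameter, so the Copy activity that references it has to supply a value for that parameter. A minimal sketch of such a dataset reference inside the pipeline JSON, with a purely illustrative literal file name:

"inputs": [
    {
        "referenceName": "DelimitedText11",
        "type": "DatasetReference",
        "parameters": {
            "FileName": "example.csv"
        }
    }
]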

1 Answer


If you want to load multiple files from Azure Blob to Azure SQL Data Warehouse, you must set the Wildcard file path in the Copy activity's source settings. Otherwise you will always get error 2200:

{
    "errorCode": "2200",
    "message": "ErrorCode=UserErrorMissingPropertyInPayload,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Required property 'fileName' is missing in payload.,Source=Microsoft.DataTransfer.ClientLibrary,'",
    "failureType": "UserError",
    "target": "Copy data1",
    "details": []
}

For example, I have two csv files with the same schema and load them into my Azure SQL Data Warehouse table test.

CSV files:


Source Dataset:

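Roughly, the source dataset only needs to point at the container; no FileName parameter is required when a wildcard is used. A sketch of such a dataset, reusing the names from the question (details may differ from the screenshot):

{
    "name": "DelimitedText11",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorage2",
            "type": "LinkedServiceReference"
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        },
        "schema": []
    }
}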

Source setting: choose all the csv files in source container:

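In the pipeline JSON this corresponds to the wildcard properties on the copy source's store settings. A minimal sketch, assuming every .csv file in the input container should be copied:

"source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "recursive": true,
        "wildcardFileName": "*.csv"
    },
    "formatSettings": {
        "type": "DelimitedTextReadSettings"
    }
}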

Sink dataset:

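The sink dataset points at the target table. A sketch, assuming a SQL DW linked service named AzureSqlDW1 (hypothetical) and the table dbo.test:

{
    "name": "AzureSqlDWTable1",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureSqlDW1",
            "type": "LinkedServiceReference"
        },
        "type": "AzureSqlDWTable",
        "typeProperties": {
            "schema": "dbo",
            "table": "test"
        }
    }
}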

Sink settings:

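On the Copy activity's sink side the type is SqlDWSink, and PolyBase can be enabled for bulk loading. A sketch, assuming PolyBase with default reject settings:

"sink": {
    "type": "SqlDWSink",
    "allowPolyBase": true,
    "polyBaseSettings": {
        "rejectValue": 0,
        "rejectType": "value",
        "useTypeDefault": true
    }
}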

Mapping:
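If an explicit mapping is needed, it is expressed with a TabularTranslator; when the csv columns and the test table columns share the same names, the mapping can usually be left empty so columns map by name. A sketch with purely hypothetical column names Col1 and Col2:

"translator": {
    "type": "TabularTranslator",
    "mappings": [
        { "source": { "name": "Col1" }, "sink": { "name": "Col1" } },
        { "source": { "name": "Col2" }, "sink": { "name": "Col2" } }
    ]
}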

Settings:

Execute the pipeline and check the data in the ADW.


Hope this helps.