0 votes

I have a data factory with input from ADLS Gen2 (only this is compliant in our company). It works fine. The screenshot below shows the settings of the 'Copy data' activity. As shown there, for storing logs (data of skipped rows) we are forced to use Blob Storage or Gen1 Data Lake. How can we use ADLS Gen2 for this? It looks like a bottleneck. We will have compliance issues if such data is stored outside Gen2.

[Screenshot: 'Copy data' activity settings showing the log storage options]

Why do you face this? On my side there is no problem. - Bowman Zhu
Can you show the explanation popup from your side? - Blue Clouds
i.stack.imgur.com/lurf1.png This linked service is linked to ADLS Gen2, which is named 0730bowmanwindow. - Bowman Zhu
You seem to have faced a similar problem: stackoverflow.com/questions/63207259/… He also cannot select it, but I can. - Bowman Zhu
What type of linked service did you use? - Blue Clouds

1 Answer

0 votes

On my side there is no problem. Please try directly editing the JSON definition of your activity:

This is my json:

{
    "name": "pipeline3",
    "properties": {
        "activities": [
            {
                "name": "Copy data1",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "source": {
                        "type": "BinarySource",
                        "storeSettings": {
                            "type": "AzureBlobFSReadSettings",
                            "recursive": true
                        },
                        "formatSettings": {
                            "type": "BinaryReadSettings"
                        }
                    },
                    "sink": {
                        "type": "BinarySink",
                        "storeSettings": {
                            "type": "AzureBlobFSWriteSettings"
                        }
                    },
                    "enableStaging": false,
                    "logStorageSettings": {
                        "linkedServiceName": {
                            "referenceName": "AzureDataLakeStorage1",
                            "type": "LinkedServiceReference"
                        }
                    },
                    "validateDataConsistency": false
                },
                "inputs": [
                    {
                        "referenceName": "Binary1",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "Binary2",
                        "type": "DatasetReference"
                    }
                ]
            }
        ],
        "annotations": []
    }
}
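The relevant part is the logStorageSettings block above, which references a linked service by name. For the session log to land in ADLS Gen2, that referenced linked service must be of the Gen2 type (AzureBlobFS). A minimal sketch of such a linked service definition, assuming account-key authentication and a hypothetical storage account name yourgen2account (swap in your own account and credential method):

```json
{
    "name": "AzureDataLakeStorage1",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://yourgen2account.dfs.core.windows.net",
            "accountKey": {
                "type": "SecureString",
                "value": "<your account key>"
            }
        }
    }
}
```

With a linked service of type AzureBlobFS defined like this, the referenceName in logStorageSettings can point at it even if the authoring UI does not offer it in the dropdown.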
