
Another Azure Data Factory question.

I'm trying to use a 'Copy Data' activity within a ForEach, setting the sink's destination folder from the current item of the ForEach.

My setup is as follows:

  • A Lookup activity ('Read json config') to read a JSON file.

The format of the json file:

{
    "OutputFolders":[
     {   
        "Source": "aaa/bb1/Output",
        "Destination": "Dest002/bin"
     },
     {   
        "Source": "aaa/bbb2/Output",
        "Destination": "Dest002/bin"
     },
     {   
        "Source": "aaa/bb3/Output",
        "Destination": "Dest002/bin"
     }
    ]
}
  • Foreach activity with items set to @activity('Read json config').output.value[0].OutputFolders
  • Within the ForEach activity a 'Copy Data' activity (sketched below)
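
In pipeline JSON the ForEach and the Copy activity inside it look roughly like this (a trimmed sketch, not my exact definition; the dataset names, the DestinationFolder parameter and the Binary source/sink types are just placeholders):

{
    "name": "ForEach output folder",
    "type": "ForEach",
    "dependsOn": [
        { "activity": "Read json config", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "items": {
            "value": "@activity('Read json config').output.value[0].OutputFolders",
            "type": "Expression"
        },
        "activities": [
            {
                "name": "Copy output files",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "SftpSourceDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    {
                        "referenceName": "SftpSinkDataset",
                        "type": "DatasetReference",
                        "parameters": {
                            "DestinationFolder": {
                                "value": "@item().Destination",
                                "type": "Expression"
                            }
                        }
                    }
                ],
                "typeProperties": {
                    "source": { "type": "BinarySource" },
                    "sink": { "type": "BinarySink" }
                }
            }
        ]
    }
}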

The Copy Data activity's sink uses the following sink dataset:

[Screenshot of the sink dataset configuration]
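
The idea on the dataset side is a parameterized folder path, roughly along these lines (again a sketch; the dataset, linked service and parameter names are illustrative and the SFTP connection details are omitted):

{
    "name": "SftpSinkDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "SftpLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "DestinationFolder": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "SftpLocation",
                "folderPath": {
                    "value": "@dataset().DestinationFolder",
                    "type": "Expression"
                }
            }
        }
    }
}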

When I run this pipeline however I get the following error message:

{
    "errorCode": "2200",
    "message": "Failure happened on 'Sink' side. ErrorCode=SftpPermissionDenied,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Permission denied to access '/@item().Destination'.,Source=Microsoft.DataTransfer.ClientLibrary.SftpConnector,''Type=Renci.SshNet.Common.SftpPermissionDeniedException,Message=Permission denied,Source=Renci.SshNet,'",
    "failureType": "UserError",
    "target": "Copy output files",
    "details": []
}

So Message=Permission denied to access '/@item().Destination' seems to indicate that the destination folder expression is not resolved. Since that literal folder does not exist, I get an SftpPermissionDenied.

I used the same method to copy files to a file share and there it seemed to work.

Does somebody have an idea how to make this destination resolve correctly?


1 Answer


OK, I tried some more and apparently it works if I wrap the expression in a concat function.

So @concat(item().Destination)

I do get a warning that 'item' is not a recognized function, but it does the trick.

Not very straightforward and I wonder why the initial approach doesn't work.
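
For anyone hitting the same thing, this is roughly where the expression ends up in the pipeline JSON (a sketch; DestinationFolder is just what I call the parameter on my sink dataset):

"outputs": [
    {
        "referenceName": "SftpSinkDataset",
        "type": "DatasetReference",
        "parameters": {
            "DestinationFolder": {
                "value": "@concat(item().Destination)",
                "type": "Expression"
            }
        }
    }
]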
