
My aim is to use Azure Data Factory to copy data from one place to another via a REST API. The first part of the copy uses a ForEach activity to select parameters from a nested JSON array. I have been struggling with this nested JSON because of various error messages in the ForEach activity.

My JSON is of the following form:

{
    "key_1": "value_1",
    "key_2": [
        "value_2_1",
        "value_2_2"
    ]
}

and first I apply the @json conversion function to it:

@json(
    '{
        "key_1": "value_1",
        "key_2": [
            "value_2_1",
            "value_2_2"
        ]
    }'
)

Here you can see the Execute Pipeline object and its parameters:

I am setting this JSON as a parameter on the Execute Pipeline activity, with the parameter type set to "Array". (For the record, I have seen similar error messages when changing the parameter type to "String" or "Object".)

The ForEach activity is used to select an item from the nested JSON, which is written into a parameter.

This immediately produces the following error in the Execute Pipeline activity:

Operation on target... ...failed: The function 'length' expects its parameter to be an array or a string. The provided value is of type 'Object'.

So, even though I set the JSON parameter type to "Array", it is treated as "Object" when I debug the pipeline. See the error below:
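From that error, I gather that @json applied to an object literal evaluates to a single Object rather than an Array. If that is right, I would presumably need square brackets around the object to get an actual array, something like:

```
@json('[{"key_1": "value_1", "key_2": ["value_2_1", "value_2_2"]}]')
```

(I am only guessing at this bracket form based on the error message.)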

Next, I tried to use the ADF @createArray function before my JSON text.

@createArray(
    '{
        "key_1": "value_1",
        "key_2": [
            "value_2_1",
            "value_2_2"
        ]
    }'
)

When debugging, the ForEach activity throws an error in the first Copy data activity:

The expression 'concat(item().SELECTING_key_1_FROM_MY_JSON))' cannot be evaluated because property 'key_1' cannot be selected. Property selection is not supported on values of type 'String'.
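If I read this second error correctly, @createArray does not parse its string argument at all: it just builds an array whose single element is the literal String. A minimal version of the same failure would be:

```
@createArray('{"key_1": "value_1"}')
```

This yields a one-element array containing a String, so item().key_1 inside the ForEach cannot select a property, matching the "Property selection is not supported on values of type 'String'" message.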

Please help me: what am I doing wrong in my attempts to convert the JSON to an array? What should I change in my code?

So far I have tried changing the parameter type and using various functions in the JSON dynamic content, but with no luck.


2 Answers


Update: if you want to change the filename in Copy Data -> Sink, you can key in the dynamic content @concat(pipeline().parameters.Pip_Object.key_1,'.json') to rename the file. (screenshots)

Please correct me if I understand you wrong.


  1. First, we should use a Parameter to store the JSON array, because Variables do not support storing a JSON array; Variables only support simple data types such as ["1","2","3", ...].
    (screenshot)

  2. The JSON array format should be as follows:
    [{"key_1": "value_1"},{"key_2": ["value_2_1","value_2_2"]}]

  3. I created a simple test here, then iterated over the JSON array with ForEach:
    (screenshot)

  4. Inside the ForEach1 activity, pass @item() to the object-type parameter Pip_Object.
    (screenshot)

  5. The Input is as follows:
    (screenshot)
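Putting the steps above together, the key dynamic-content expressions are as follows (a sketch; JsonArrayParam is an assumed name for the parameter created in step 1):

```
ForEach1 -> Settings -> Items:
    @pipeline().parameters.JsonArrayParam

Execute Pipeline (inside ForEach1) -> parameter Pip_Object:
    @item()

Child pipeline, e.g. selecting key_1:
    @pipeline().parameters.Pip_Object.key_1
```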


We ended up using two pipelines: a generic one and a specific one.

The generic one has the JSON as a parameter (JSON_PARAMETER), and we set its default value in the following form:

[{"key_1":"value_1","key_2":["value_2"]},{"key_1":"value_3","key_2":["value_2"]}, ...etc...]

The generic pipeline has a ForEach loop, in which the mentioned JSON parameter is referenced in Settings -> Items:

@JSON(pipeline().parameters.JSON_PARAMETER)

In the specific pipeline, there is an Execute Pipeline activity, where JSON_PARAMETER appears under Settings -> Parameters. If the default value of JSON_PARAMETER is to be used, the field is left blank. If we want to override the parameter, we put a Set Variable activity before Execute Pipeline and set its Variables -> Value to:

@concat('
[
    {"key_1":"value_1",
     "key_2":
            ["value_2",
             "value_3"
            ]
     },
     {"key_1":"value_3",

    ...and so on...

    }
]
')
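The Set Variable value above is just a JSON string; the variable is then supplied as the JSON_PARAMETER value on the Execute Pipeline activity, and the @json(...) expression in the generic pipeline's Items setting parses it back into an array. Assuming the variable is named JSON_STRING (the actual name depends on your pipeline), the parameter field contains:

```
@variables('JSON_STRING')
```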