
[Screenshot: ADF pipeline]

Here is what my ADF pipeline looks like. In the Data Flow, I read data from a source, perform a filter and a join, and write the result to a sink. My plan was to use Azure Table Storage as the sink. However, according to https://github.com/MicrosoftDocs/azure-docs/issues/34981, ADF Data Flow does not support Azure Table Storage as a sink. Is there an alternative way to use Azure Table Storage as the sink of a Data Flow?

No, that is impossible. Have a look at this doc: docs.microsoft.com/en-us/azure/data-factory/… – Bowman Zhu

2 Answers


No, it is impossible. Azure Table Storage cannot be the sink of a Data Flow.

Only these six dataset types are allowed:

[Screenshot: supported sink dataset types]

These are not the only limits. As the sink of a Data Flow, Azure Blob Storage and Azure Data Lake Storage Gen1 & Gen2 support only four formats: JSON, Avro, delimited text, and Parquet.

At least for now, your idea is not a viable solution.

For more information, have a look at this official doc:

https://docs.microsoft.com/en-us/azure/data-factory/data-flow-sink#supported-sink-connectors-in-mapping-data-flow


Even today it isn't possible. One option (we are currently solving a similar case this way) is to use Blob Storage as a temporary destination:

  1. The data flow stores its result in Blob Storage. The source data is processed by all the transformations in the data flow and prepared for Table Storage, e.g. PartitionKey, RowKey, and all other columns are in place.

  2. A subsequent Copy activity then moves the data from Blob Storage into Table Storage.
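The two steps above could be sketched as ADF pipeline JSON roughly like the following. This is only an outline, not the exact pipeline from the screenshot: the data flow and dataset reference names (`FullOrdersDataFlow`, `StagingBlobDataset`, `OrdersTableDataset`) are hypothetical placeholders, and you would adjust the sink options to your table schema.

```json
{
    "name": "FullOrdersToTableStorage",
    "properties": {
        "activities": [
            {
                "name": "Full Orders",
                "type": "ExecuteDataFlow",
                "typeProperties": {
                    "dataflow": {
                        "referenceName": "FullOrdersDataFlow",
                        "type": "DataFlowReference"
                    }
                }
            },
            {
                "name": "to Table Storage",
                "type": "Copy",
                "dependsOn": [
                    {
                        "activity": "Full Orders",
                        "dependencyConditions": [ "Succeeded" ]
                    }
                ],
                "inputs": [
                    { "referenceName": "StagingBlobDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "OrdersTableDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "DelimitedTextSource" },
                    "sink": {
                        "type": "AzureTableSink",
                        "azureTableInsertType": "merge",
                        "azureTablePartitionKeyName": "PartitionKey",
                        "azureTableRowKeyName": "RowKey"
                    }
                }
            }
        ]
    }
}
```

The `dependsOn` condition makes the Copy activity run only after the data flow has successfully written the staging blob, so the table never sees partial results.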

The marked part of the pipeline does exactly this:

  • Full Orders runs the data flow
  • to Table Storage is the Copy activity that moves the data into Table Storage

[Screenshot: pipeline]