
I am trying to migrate a table from database A to database B on the same Azure SQL server. I successfully migrated data from Azure Blob Storage to an Azure SQL database using Azure Data Factory, but I don't understand how to modify my ADF code to migrate data across databases.

I've learned how to migrate data with the help of this Stack Overflow link, but I am looking to migrate the table using a Copy Activity.

My pipeline for Azure Blob to Azure SQL is below, along with the complete Data Factory code. Please suggest the modifications that would migrate data from one Azure SQL database to another, or point me to a resource that covers this. Thanks in advance.

Azure SQL Linked Service

{
    "name": "AzureSqlLinkedService",
    "properties": {
        "description": "",
        "hubName": "dalete_hub",
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Data Source=tcp:server.database.windows.net,1433;Initial Catalog=DB;Integrated Security=False;User [email protected];Password=**********;Connect Timeout=30;Encrypt=True"
        }
    }
}

Azure Storage Linked Service

{
    "name": "AzureStorageLinkedService",
    "properties": {
        "description": "",
        "hubName": "dalete_hub",
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=newstorageaccount;AccountKey=**********"
        }
    }
}

Input Dataset

{
    "name": "InputDataset",
    "properties": {
        "structure": [
            {
                "name": "Region",
                "type": "String"
            },
            {
                "name": "Sales",
                "type": "String"
            }
        ],
        "published": false,
        "type": "AzureBlob",
        "linkedServiceName": "AzureStorageLinkedService",
        "typeProperties": {
            "fileName": "data.txt",
            "folderPath": "adfpoc/",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ","
            }
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        },
        "external": true,
        "policy": {}
    }
}

Output Dataset

{
    "name": "OutputDataset",
    "properties": {
        "structure": [
            {
                "name": "Region",
                "type": "String"
            },
            {
                "name": "Sales",
                "type": "String"
            }
        ],
        "published": false,
        "type": "AzureSqlTable",
        "linkedServiceName": "AzureSqlLinkedService",
        "typeProperties": {
            "tableName": "data"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}

ADF Pipeline

{
    "name": "ADFTutorialPipeline",
    "properties": {
        "description": "Copy data from a blob to Azure SQL table",
        "activities": [
            {
                "type": "Copy",
                "typeProperties": {
                    "source": {
                        "type": "BlobSource" },
                    "sink": {
                        "type": "SqlSink",
                        "writeBatchSize": 10000,
                        "writeBatchTimeout": "60.00:00:00"
                    }
                },
                "inputs": [
                    {
                        "name": "InputDataset"}
                ],
                "outputs": [
                    {
                        "name": "OutputDataset"
                    }
                ],
                "policy": {
                    "timeout": "01:00:00",
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst"
                },
                "scheduler": {
                    "frequency": "Hour",
                    "interval": 1
                },
                "name": "CopyFromBlobToSQL"
            }
        ],
        "start": "2019-03-11T00:00:00Z",
        "end": "2019-03-12T00:00:00Z",
        "isPaused": false,
        "hubName": "dalete_hub",
        "pipelineMode": "Scheduled"
    }
}

2 Answers


I tried the same operation, and I migrated my table successfully in Data Factory.

For example, I have a table table1 in database dbleon, and I want to migrate table1 to another database dbleon1 with a Copy Activity.

I created a new table table1 in dbleon1 which has the same schema as table1 in dbleon.

Here is my ADF code:

{
    "name": "CopyPipeline_0oh",
    "properties": {
        "activities": [
            {
                "name": "Copy_0oh",
                "type": "Copy",
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [
                    {
                        "name": "Source",
                        "value": "[dbo].[table1]"
                    },
                    {
                        "name": "Destination",
                        "value": "[dbo].[table1]"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "SqlSource"
                    },
                    "sink": {
                        "type": "SqlSink",
                        "writeBatchSize": 10000
                    },
                    "enableStaging": false,
                    "translator": {
                        "type": "TabularTranslator",
                        "columnMappings": {
                            "id": "id",
                            "name": "name"
                        }
                    }
                },
                "inputs": [
                    {
                        "referenceName": "SourceDataset_0oh",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "DestinationDataset_0oh",
                        "type": "DatasetReference"
                    }
                ]
            }
        ]
    },
    "type": "Microsoft.DataFactory/factories/pipelines"
}

Source Dataset:

{
    "name": "SourceDataset_0oh",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureSqlDatabase1",
            "type": "LinkedServiceReference"
        },
        "type": "AzureSqlTable",
        "structure": [
            {
                "name": "id",
                "type": "Int32"
            },
            {
                "name": "name",
                "type": "String"
            }
        ],
        "typeProperties": {
            "tableName": "[dbo].[table1]"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
}

Destination Dataset:

{
    "name": "DestinationDataset_0oh",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureSqlDatabase2",
            "type": "LinkedServiceReference"
        },
        "type": "AzureSqlTable",
        "structure": [
            {
                "name": "id",
                "type": "Int32",
                "precision": 10
            },
            {
                "name": "name",
                "type": "String"
            }
        ],
        "typeProperties": {
            "tableName": "[dbo].[table1]"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
}
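
The two linked services referenced by these datasets (AzureSqlDatabase1 and AzureSqlDatabase2) are not shown in the answer. A minimal sketch of what the destination one might look like in the same ADF v2 JSON format, with placeholder server, database, and user values:

{
    "name": "AzureSqlDatabase2",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "Server=tcp:myserver.database.windows.net,1433;Database=dbleon1;User ID=myuser;Password=**********;Encrypt=True;Connection Timeout=30"
            }
        }
    },
    "type": "Microsoft.DataFactory/factories/linkedservices"
}

AzureSqlDatabase1 would be identical apart from its name and a connection string pointing at dbleon.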

Hope this helps.


Moving from one SQL database to another will be similar to what you have already done, but this time the source will be a SQL table, like your previous sink (the output).

So what you should do is create a new linked service for the second database, then create a dataset to be the input, with the same structure as your previous output dataset but with the linked service names changed so that each dataset points at the right database (the source for the input, the destination for the output). A sketch of both is below.
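
For example, the new linked service and the SQL input dataset could look like the following sketch, written in the same ADF v1 format as the question. AzureSqlLinkedServiceB, SqlInputDataset, and the connection-string values are placeholder names, not anything ADF requires; the input dataset reuses the existing AzureSqlLinkedService because that is the database holding the source table, and the existing OutputDataset would be re-pointed at AzureSqlLinkedServiceB:

{
    "name": "AzureSqlLinkedServiceB",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Data Source=tcp:server.database.windows.net,1433;Initial Catalog=DB_B;Integrated Security=False;User ID=user;Password=**********;Connect Timeout=30;Encrypt=True"
        }
    }
}

{
    "name": "SqlInputDataset",
    "properties": {
        "structure": [
            {
                "name": "Region",
                "type": "String"
            },
            {
                "name": "Sales",
                "type": "String"
            }
        ],
        "type": "AzureSqlTable",
        "linkedServiceName": "AzureSqlLinkedService",
        "typeProperties": {
            "tableName": "data"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        },
        "external": true,
        "policy": {}
    }
}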

Finally, create a pipeline and configure the input and output datasets properly, using a SqlSource instead of a BlobSource, as in the sketch below.
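
A sketch of that pipeline, reusing the placeholder dataset names above and otherwise mirroring the question's Copy Activity:

{
    "name": "SqlToSqlPipeline",
    "properties": {
        "description": "Copy data from one Azure SQL table to another",
        "activities": [
            {
                "type": "Copy",
                "typeProperties": {
                    "source": {
                        "type": "SqlSource"
                    },
                    "sink": {
                        "type": "SqlSink",
                        "writeBatchSize": 10000,
                        "writeBatchTimeout": "60.00:00:00"
                    }
                },
                "inputs": [
                    {
                        "name": "SqlInputDataset"
                    }
                ],
                "outputs": [
                    {
                        "name": "OutputDataset"
                    }
                ],
                "policy": {
                    "timeout": "01:00:00",
                    "concurrency": 1
                },
                "scheduler": {
                    "frequency": "Hour",
                    "interval": 1
                },
                "name": "CopyFromSqlToSql"
            }
        ],
        "start": "2019-03-11T00:00:00Z",
        "end": "2019-03-12T00:00:00Z",
        "pipelineMode": "Scheduled"
    }
}

If you only need a subset of rows, the SqlSource also accepts a sqlReaderQuery property.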

Hope this helped!