5
votes

I'm trying to migrate a SQL Server table using AWS DMS to a DynamoDB target. The table structure is as follows:

|SourceTableID|Title      |Status|Display|LongDescription|
|-------------|-----------|------|-------|---------------|
|VARCHAR(100) |VARCHAR(50)|INT   |BIT    |NVARCHAR(MAX)  |

Every field is being migrated without errors and is present in my target DynamoDB table except for the LongDescription column. This is because it is an NVARCHAR(MAX) column.

According to the documentation:

The following limitations apply when using DynamoDB as a target:

  • AWS DMS doesn't support LOB data unless it is a CLOB. AWS DMS converts CLOB data into a DynamoDB string when migrating data.

https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.DynamoDB.html

Source Data Types for SQL Server

|SQL Server Data Types|AWS DMS Data Types|
|---------------------|------------------|
|NVARCHAR (max)       |NCLOB, TEXT       |

https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.SQLServer.html

Depending on my task configuration the following two scenarios occur:

  • Limited LOB mode: The LongDescription column is migrated to DynamoDB, but the text is truncated
  • Full LOB mode: The LongDescription column is NOT migrated to DynamoDB

How can I correctly migrate an NVARCHAR(MAX) column to DynamoDB without losing any data? Thanks!

Progress Report

  1. I have already tried migrating to an S3 target. However, it looks like S3 doesn't support Full LOB mode:

    Limitations to Using Amazon S3 as a Target

    • Full LOB mode is not supported.

https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.S3.html

  2. I cannot use the COMPRESS T-SQL function to store the LongDescription column as binary, since it was introduced in SQL Server 2016 and my version is SQL Server 2014

  3. I ran the migration task in Limited LOB mode with the maximum byte size as the limit. My maximum LOB size is 45,155,996 bytes, so I set the limit to 46,000 KB. This results in the following error:

Failed to put item into table 'TestMigration_4' with data record with source PK column 'SourceTableID' and value '123456'
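For context on the numbers above: the 46,000 KB limit does cover the largest value, so the truncation setting isn't the problem. The 400 KB figure below is DynamoDB's documented per-item size quota; a quick sanity check, as a sketch:

```python
# Sanity-check the Limited LOB setting against DynamoDB's per-item quota.
max_lob_bytes = 45_155_996            # largest LongDescription value (from above)
lob_limit_kb = 46_000                 # Limited LOB "max LOB size" task setting

print(max_lob_bytes <= lob_limit_kb * 1024)   # True: the LOB limit covers the data

# DynamoDB, however, caps every item at 400 KB, so the put still fails.
DYNAMODB_ITEM_LIMIT = 400 * 1024
print(max_lob_bytes <= DYNAMODB_ITEM_LIMIT)   # False: roughly 110x over the cap
```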

1 Answer

4
votes

You might want to check AWS's best practices page for storing large items and attributes in DynamoDB:

If your application needs to store more data in an item than the DynamoDB size limit permits, you can try compressing one or more large attributes or breaking the item into multiple items (efficiently indexed by sort keys). You can also store the item as an object in Amazon Simple Storage Service (Amazon S3) and store the Amazon S3 object identifier in your DynamoDB item.
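As a minimal illustration of the compression option the quote mentions (gzip is my assumption here; DMS won't compress for you, so this would have to happen in application code, and the ratio depends entirely on how compressible your text is):

```python
import gzip

# Compress a large (here: highly repetitive, so very compressible) text
# attribute before writing it as a DynamoDB Binary value.
long_description = "some very long description text " * 50_000   # ~1.6 MB raw
compressed = gzip.compress(long_description.encode("utf-8"))

print(len(compressed) < 400 * 1024)   # True for this input: it now fits in an item
```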

I actually like the idea of saving your LongDescription in S3 and referencing its identifier in DynamoDB. I haven't tried it myself, but one approach could be to use DMS's ability to create multiple migration tasks to perform this, or, as a last resort, to build a small ETL solution on top of DMS's CDC capability. You might want to get in touch with AWS support to confirm it works.
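A minimal sketch of that S3-pointer pattern, keeping the large LongDescription in S3 and storing only its object key in the DynamoDB item (the bucket name, key layout, and helper function are illustrative assumptions, not anything DMS generates for you):

```python
# Split a source row into a small DynamoDB item and an S3 payload.
# The actual writes would use boto3's s3.put_object / table.put_item.

def to_dynamodb_item(row, bucket="my-lob-bucket"):
    """Replace LongDescription with a pointer to an S3 object."""
    s3_key = f"long-descriptions/{row['SourceTableID']}.txt"
    item = {k: v for k, v in row.items() if k != "LongDescription"}
    item["LongDescriptionS3Bucket"] = bucket   # pointer, not the blob
    item["LongDescriptionS3Key"] = s3_key
    return item, row["LongDescription"]        # payload for s3.put_object(...)

item, payload = to_dynamodb_item({
    "SourceTableID": "123456", "Title": "Example", "Status": 1,
    "Display": True, "LongDescription": "x" * 1_000_000,
})
print(item["LongDescriptionS3Key"])   # long-descriptions/123456.txt
print("LongDescription" in item)      # False: the blob itself never hits DynamoDB
```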

Hope it helps!