2 votes

What is the easiest approach to copy data from a DDB table in one account to another? Easiest here means low service-maintenance overhead; I would prefer a serverless approach if possible (so no scheduled jobs using Data Pipeline).

I was exploring the possibility of using DynamoDB Streams, but this old answer mentions that it is not possible. However, I could not find recent documentation confirming or disproving this. Is that still the case?

Another option I was considering: update the Firehose transform Lambda that manipulates and then inserts data into the DynamoDB table so that it also publishes to a Kinesis stream with cross-account delivery enabled, triggering a Lambda in the destination account that processes the data further as required.
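For illustration, a rough sketch of what that extended transform Lambda could look like. The stream name, region, and record shape are assumptions, and true cross-account delivery would additionally require either a resource policy on the stream or assuming a role in the destination account:

import base64
import json
import boto3

# Hypothetical cross-account stream; for cross-account writes this client would
# need credentials from sts:AssumeRole or the stream would need a resource policy.
kinesis = boto3.client('kinesis', region_name='<region>')

def handler(event, context):
    output = []
    for record in event['records']:
        payload = json.loads(base64.b64decode(record['data']))
        # ... existing transformation / DynamoDB insert logic goes here ...

        # Fan the transformed record out to the Kinesis stream as well.
        kinesis.put_record(StreamName='cross-account-replication',  # assumed name
                           Data=json.dumps(payload),
                           PartitionKey=record['recordId'])

        # Return the transformed record to Firehose in the expected format.
        output.append({'recordId': record['recordId'],
                       'result': 'Ok',
                       'data': base64.b64encode(json.dumps(payload).encode()).decode()})
    return {'records': output}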


2 Answers

4 votes

This should be possible:

  • configure the DynamoDB table in the source account with Streams enabled
  • create a Lambda function in the same (source) account and integrate it with the DDB stream
  • create a cross-account role, e.g. DynamoDBCrossAccountRole, in the destination account with permissions to perform the necessary operations on the destination DDB table (this role and the destination DDB table are in the same account)
  • add sts:AssumeRole permissions to your Lambda function's execution role, in addition to the CloudWatch Logs permissions, so that it can assume the cross-account role
  • call sts:AssumeRole from within your Lambda function and configure the DynamoDB client with the returned credentials, for example:
import boto3

# Get temporary credentials for the cross-account role in the destination account.
client = boto3.client('sts')
sts_response = client.assume_role(RoleArn='arn:aws:iam::<999999999999>:role/DynamoDBCrossAccountRole',
                                  RoleSessionName='AssumePocRole', DurationSeconds=900)

# Build a DynamoDB resource that uses those temporary credentials.
dynamodb = boto3.resource(service_name='dynamodb', region_name='<region>',
                          aws_access_key_id=sts_response['Credentials']['AccessKeyId'],
                          aws_secret_access_key=sts_response['Credentials']['SecretAccessKey'],
                          aws_session_token=sts_response['Credentials']['SessionToken'])
  • now your Lambda function should be able to operate on the DynamoDB table in the destination account from the source account
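Putting the steps together, a minimal sketch of the stream-handler Lambda might look like the following. The destination table name is hypothetical, and this assumes the source table's stream view type includes new images (NEW_IMAGE or NEW_AND_OLD_IMAGES):

import boto3
from boto3.dynamodb.types import TypeDeserializer

ROLE_ARN = 'arn:aws:iam::<999999999999>:role/DynamoDBCrossAccountRole'
DEST_TABLE = 'ReplicaTable'  # hypothetical destination table name

deserializer = TypeDeserializer()

def deserialize(image):
    # Convert DynamoDB-typed attributes ({'S': ...}, {'N': ...}) to plain Python values.
    return {k: deserializer.deserialize(v) for k, v in image.items()}

def get_dest_table():
    # Assume the cross-account role and build a Table resource in the destination account.
    creds = boto3.client('sts').assume_role(RoleArn=ROLE_ARN,
                                            RoleSessionName='DdbReplication',
                                            DurationSeconds=900)['Credentials']
    dynamodb = boto3.resource('dynamodb', region_name='<region>',
                              aws_access_key_id=creds['AccessKeyId'],
                              aws_secret_access_key=creds['SecretAccessKey'],
                              aws_session_token=creds['SessionToken'])
    return dynamodb.Table(DEST_TABLE)

def handler(event, context):
    table = get_dest_table()
    for record in event['Records']:
        # Replicate inserts/updates; mirror deletes by key.
        if record['eventName'] in ('INSERT', 'MODIFY'):
            table.put_item(Item=deserialize(record['dynamodb']['NewImage']))
        elif record['eventName'] == 'REMOVE':
            table.delete_item(Key=deserialize(record['dynamodb']['Keys']))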
0 votes

We built a kind of cross-account replication system using DynamoDB Streams and Lambda for a hackathon task.
You might see some delay in the records, though, because of Lambda's cold-start issue. There are ways to tackle this problem too, depending on how busy you are going to keep the Lambda; here is the link.

We actually created a CloudFormation template and a JAR that anyone internal to our organisation can use to start replication on any table. I won't be able to share it due to security concerns.

Please check out this link for more details.