
I am new to AWS Data Pipeline, and I need to create a job that copies a DynamoDB table to S3. I am using the template from here, from awslabs, but it creates the backup file in my bucket with an arbitrary hash value as the name. I want the file to be named MyTableName.json instead. I tried to set filePath in the EmrActivity step as shown below:

s3://dynamodb-emr-#{myDDBRegion}/emr-ddb-storage-handler/2.1.0/emr-ddb-2.1.0.jar,org.apache.hadoop.dynamodb.tools.DynamoDbExport,#{output.filePath},#{input.tableName},#{input.readThroughputPercent}

But I am still getting output files named with hash values.

How can I change this? Please advise.

Thanks.


1 Answer


That is the default behaviour of the DynamoDbExport class you are using: it writes its output files with generated names and also saves a manifest file in the same directory listing those filenames. You can add a ShellCommandActivity to the Data Pipeline, scheduled to run after the DynamoDB backup finishes, and rename the files to whatever you prefer.
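A minimal sketch of the rename logic such a ShellCommandActivity could run. The paths and table name here (OUTPUT_DIR, MyTableName) are placeholders, not from your pipeline; the script below operates on a local directory to illustrate the idea, and against a real export you would do the equivalent with the AWS CLI, e.g. `aws s3 mv "s3://my-bucket/backup/<hash>" "s3://my-bucket/backup/MyTableName.json"` after listing the output prefix.

```shell
#!/bin/sh
set -e

# Placeholders: the pipeline's output directory and the desired name.
OUTPUT_DIR="./export-output"
TABLE_NAME="MyTableName"

# Simulate an export result: one hash-named data file plus the manifest
# that DynamoDbExport writes alongside it.
mkdir -p "$OUTPUT_DIR"
echo '{"Item":{}}' > "$OUTPUT_DIR/ab12cd34-5678-90ef"
echo 'manifest'    > "$OUTPUT_DIR/manifest"

# Rename every non-manifest file. With a single export file this yields
# exactly TABLE_NAME.json; extra part files get a numeric suffix.
i=0
for f in "$OUTPUT_DIR"/*; do
  base=$(basename "$f")
  [ "$base" = "manifest" ] && continue
  if [ "$i" -eq 0 ]; then
    mv "$f" "$OUTPUT_DIR/$TABLE_NAME.json"
  else
    mv "$f" "$OUTPUT_DIR/$TABLE_NAME-$i.json"
  fi
  i=$((i + 1))
done

ls "$OUTPUT_DIR"
```

Note that if the export produces multiple part files, you either need a naming scheme like the suffix above or a concatenation step, since they cannot all become one MyTableName.json by renaming alone.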