1
votes

I have 7+ million records stored in a CSV file hosted in an AWS S3 bucket, and I want to load them into a DynamoDB table. I've tried the AWS Data Pipeline service, but the job always fails because the service doesn't support importing the CSV format. So I first need to convert the CSV data into a format that DynamoDB can understand. Is there any way to make this conversion?

1
Hi @Jaco or anyone... do you have an answer to this question? – Andrew Duffy
Would a custom Python script be a suitable solution? I'm not sure there is an off-the-shelf solution for this. – Alex
This is just data translation. A simple-to-write script should take care of this pretty easily. – Garet Jax
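
Following the comments' suggestion of a custom script, here is a minimal sketch of one way to do the translation with boto3. It assumes the target table already exists, the CSV has a header row whose column names match the table's attribute names, and uses hypothetical names ("my-bucket", "records.csv", "MyTable"); it is an illustration of the batch-write approach, not a definitive solution.

    # Minimal sketch of the "custom Python script" idea from the comments.
    # Assumptions (all hypothetical): bucket "my-bucket", key "records.csv",
    # an existing DynamoDB table "MyTable", AWS credentials configured, and
    # a CSV header row whose columns match the table's attribute names.
    import csv

    import boto3

    BUCKET = "my-bucket"      # hypothetical S3 bucket
    KEY = "records.csv"       # hypothetical object key
    TABLE = "MyTable"         # hypothetical DynamoDB table

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table(TABLE)

    # Download the CSV locally first; for 7M+ rows you may prefer to
    # stream the object instead of writing it to disk.
    local_path = "/tmp/records.csv"
    s3.download_file(BUCKET, KEY, local_path)

    # batch_writer() groups puts into 25-item BatchWriteItem requests and
    # automatically retries any unprocessed items.
    with open(local_path, newline="") as f:
        with table.batch_writer() as batch:
            for row in csv.DictReader(f):
                # DynamoDB does not accept empty-string attribute values in
                # all cases, so drop them; convert numeric columns explicitly
                # if the table schema expects Number types.
                item = {k: v for k, v in row.items() if v != ""}
                batch.put_item(Item=item)

All values are written as strings here; for 7+ million rows you would also want error handling and possibly parallel workers, but the batch writer alone keeps the script short and avoids hand-rolling BatchWriteItem retries.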

1 Answer

-2
votes

The AWS Data Pipeline service supports CSV import into DynamoDB. You can create a pipeline from the AWS Data Pipeline console and choose the "Import DynamoDB backup data from S3" template to import the CSV stored in S3 into DynamoDB.

See also

http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBPipeline.html#DataPipelineExportImport.Importing