2
votes

I am very new to Amazon Kinesis and Amazon DynamoDB. I have purchased AWS services for my RDS MySQL database and DynamoDB data. Now I want to make use of my DynamoDB data.

Using the AWS Console, I want to push DynamoDB's newly inserted records to S3, periodically and in real time. I have read about Kinesis, but I am not sure how to set up this pipeline without writing an application. If it is possible, kindly guide me through it. Thank you.

Edit: I have read about Kinesis Firehose and wonder whether DynamoDB can publish to AWS Kinesis Firehose?

1
You can write a Lambda function (in the console) that will read from the DynamoDB Stream and write to Kinesis Firehose, and then configure Firehose to deliver to your S3 bucket. – Guy
Lambda's description says "Run code in response to events", so I am wondering: does this also use Java code? – Samhash
Lambda now supports JavaScript, Java, and Python. You can write JavaScript or Python code in the editor on the web console, or upload zip files containing your Java (and other languages') jar files. – Guy
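The Stream-to-Firehose path Guy describes can be sketched as a small Python Lambda handler. This is a minimal sketch, not a definitive implementation: the delivery stream name `my-delivery-stream` is a hypothetical placeholder, and the boto3 import is deferred into the handler so the record-filtering helper can be exercised without AWS credentials.

```python
import json


def extract_inserts(event):
    """Return Firehose-ready records for each INSERT in a DynamoDB Stream event."""
    out = []
    for record in event.get("Records", []):
        # Only forward newly inserted items; skip MODIFY and REMOVE events
        if record.get("eventName") == "INSERT":
            image = record["dynamodb"]["NewImage"]
            out.append({"Data": json.dumps(image) + "\n"})
    return out


def handler(event, context):
    """Lambda entry point: forward new items to a Kinesis Firehose delivery stream."""
    import boto3  # deferred so the helper above stays testable offline

    records = extract_inserts(event)
    if records:
        boto3.client("firehose").put_record_batch(
            DeliveryStreamName="my-delivery-stream",  # hypothetical stream name
            Records=records,
        )
    return {"forwarded": len(records)}
```

Firehose then buffers these records and flushes them to the configured S3 bucket on its own schedule, so no extra code is needed for the S3 side.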

1 Answer

5
votes

To do all of this via the console, I suggest you use DynamoDB Streams (different from Kinesis Streams) and Lambda. The AWS announcement for this process is described in this blog post.

The high level steps will be:

  1. Turn on DynamoDB Streams for your table. This step defines the producer for all of your item changes.
  2. Write the Lambda function that will be the consumer. You mention that you only want newly inserted records. This article guides you through developing a simple Hello World in Lambda using the AWS console, which is a good starting point. Once you feel confident, revisit the announcement and follow the steps there for consuming DynamoDB Streams.
  3. Create an S3 bucket and append logic to the Lambda function to store the new item data in S3.
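The steps above can be sketched as a single Python Lambda consumer that writes each inserted item straight to S3. This is a minimal sketch under stated assumptions: the bucket name `my-dynamodb-backup` is a hypothetical placeholder, the object key is derived from the stream record's sequence number, and the boto3 import is deferred so the filtering helper can be tested without AWS access.

```python
import json


def new_items(event):
    """Pull the NewImage out of each INSERT record in a DynamoDB Stream event."""
    return [
        r["dynamodb"]["NewImage"]
        for r in event.get("Records", [])
        if r.get("eventName") == "INSERT"
    ]


def handler(event, context):
    """Lambda entry point: store each newly inserted item as a JSON object in S3."""
    import boto3  # deferred import keeps new_items() testable offline

    s3 = boto3.client("s3")
    stored = 0
    for record in event.get("Records", []):
        if record.get("eventName") != "INSERT":
            continue
        # The stream's sequence number gives a unique, ordered object key
        seq = record["dynamodb"]["SequenceNumber"]
        s3.put_object(
            Bucket="my-dynamodb-backup",  # hypothetical bucket name
            Key=f"inserts/{seq}.json",
            Body=json.dumps(record["dynamodb"]["NewImage"]),
        )
        stored += 1
    return {"stored": stored}
```

With the Lambda function subscribed to the table's stream, each batch of item changes triggers the handler automatically, so the pipeline runs in near real time without any scheduler.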

This is a powerful process that mimics traditional relational database triggers but for your NoSQL tables in DynamoDB. All from the AWS console.