1
votes

There is an S3 bucket owned by a different AWS account which contains a number of files. I need to copy these files to my own S3 bucket. I would like to do two things in order to achieve this:

  1. Add an S3 bucket event in the other account that will trigger a Lambda function to copy the files to my AWS account.
  2. My Lambda function should be granted permission (possibly through an assumed role) to copy the files.

What are the steps that I must perform in order to achieve 1 and 2?

3
To clarify: Do you wish to copy the files as they arrive in future, or do you just wish to copy the existing files? - John Rotenstein
I would like to copy the files as they arrive - Punter Vicky

3 Answers

6
votes

The base requirement of copying files is straightforward:

  • Create an event on the source S3 bucket that triggers a Lambda function
  • The Lambda function copies the object to the other bucket
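
As a minimal sketch, a copy function along these lines might look like the following in Python (boto3); the destination bucket name is a placeholder, not anything from the question:

```python
import urllib.parse

import boto3

s3 = boto3.client('s3')

DESTINATION_BUCKET = 'my-destination-bucket'  # placeholder name


def lambda_handler(event, context):
    # Each record describes one object that triggered the event.
    for record in event['Records']:
        source_bucket = record['s3']['bucket']['name']
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])

        # Server-side copy; the object data never passes through the Lambda function.
        s3.copy_object(
            Bucket=DESTINATION_BUCKET,
            Key=key,
            CopySource={'Bucket': source_bucket, 'Key': key},
        )
```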

The complicating factor is the need for cross-account copying.

Two scenarios are possible:

  • Option 1 ("Pull"): Bucket in Account-A triggers Lambda in Account-B. This can be done with Resource-Based Policies for AWS Lambda (Lambda Function Policies) - AWS Lambda. You'll need to configure the trigger via the command-line, not the management console. Then, a Bucket policy on the bucket in Account-A needs to allow GetObject access by the IAM Role used by the Lambda function in Account-B. (See the sketch after this list.)
  • Option 2 ("Push"): Bucket in Account-A triggers Lambda in Account-A (same account). The Bucket policy on the bucket in Account-B needs to allow PutObject access by the IAM Role used by the Lambda function in Account-A. Make sure it saves the object with an ACL of bucket-owner-full-control so that Account-B 'owns' the copied object.
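
If it helps, here is a rough sketch of the Pull wiring using boto3 rather than the AWS CLI. All account IDs, bucket, function and role names below are placeholders; each helper has to be run with credentials for the account named in its comment.

```python
import json

import boto3

# Placeholder names/ARNs for illustration only.
SOURCE_BUCKET = 'source-bucket-in-account-a'
SOURCE_ACCOUNT_ID = '111111111111'                                        # Account-A
FUNCTION_NAME = 'copy-files'                                              # Account-B
LAMBDA_ARN = 'arn:aws:lambda:us-east-1:222222222222:function:copy-files'  # Account-B
LAMBDA_ROLE_ARN = 'arn:aws:iam::222222222222:role/copy-files-role'        # Account-B


def configure_account_b():
    """Run with Account-B credentials: let S3 in Account-A invoke the function."""
    boto3.client('lambda').add_permission(
        FunctionName=FUNCTION_NAME,
        StatementId='AllowInvokeFromAccountABucket',
        Action='lambda:InvokeFunction',
        Principal='s3.amazonaws.com',
        SourceArn=f'arn:aws:s3:::{SOURCE_BUCKET}',
        SourceAccount=SOURCE_ACCOUNT_ID,
    )


def configure_account_a():
    """Run with Account-A credentials: wire up the trigger and allow GetObject."""
    s3 = boto3.client('s3')

    # Event notification on the source bucket pointing at the Lambda in Account-B.
    s3.put_bucket_notification_configuration(
        Bucket=SOURCE_BUCKET,
        NotificationConfiguration={
            'LambdaFunctionConfigurations': [{
                'LambdaFunctionArn': LAMBDA_ARN,
                'Events': ['s3:ObjectCreated:*'],
            }],
        },
    )

    # Bucket policy allowing the Lambda's role in Account-B to read the objects.
    policy = {
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Principal': {'AWS': LAMBDA_ROLE_ARN},
            'Action': 's3:GetObject',
            'Resource': f'arn:aws:s3:::{SOURCE_BUCKET}/*',
        }],
    }
    s3.put_bucket_policy(Bucket=SOURCE_BUCKET, Policy=json.dumps(policy))
```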

If possible, I would recommend the Push option because everything is in one account (aside from the Bucket Policy).
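
For the Push option, the cross-account piece is essentially the bucket policy on the destination bucket in Account-B plus the ACL on the copy call. A rough sketch with placeholder names:

```python
import json

import boto3

# Placeholder names/ARNs for illustration only.
DESTINATION_BUCKET = 'destination-bucket-in-account-b'
ACCOUNT_A_LAMBDA_ROLE = 'arn:aws:iam::111111111111:role/copy-files-role'

# Run with Account-B credentials: allow the Account-A Lambda role to write objects
# (and to set the ACL on them) in the destination bucket.
policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'AllowCopyFromAccountALambda',
        'Effect': 'Allow',
        'Principal': {'AWS': ACCOUNT_A_LAMBDA_ROLE},
        'Action': ['s3:PutObject', 's3:PutObjectAcl'],
        'Resource': f'arn:aws:s3:::{DESTINATION_BUCKET}/*',
    }],
}
boto3.client('s3').put_bucket_policy(
    Bucket=DESTINATION_BUCKET,
    Policy=json.dumps(policy),
)

# In the Account-A Lambda function, the copy call then passes the ACL so that
# Account-B ends up owning the copied object:
#
#   s3.copy_object(
#       Bucket=DESTINATION_BUCKET,
#       Key=key,
#       CopySource={'Bucket': source_bucket, 'Key': key},
#       ACL='bucket-owner-full-control',
#   )
```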

2
votes

There is an easier way of doing it without Lambda. AWS allows you to set up replication on an S3 bucket (including cross-region and cross-account replication). Once you set up replication, all new objects get copied to the replica bucket. For the existing objects, use the AWS CLI to copy each object onto itself in the same bucket so that it gets replicated to the target bucket. Once all the existing objects have been copied, you can turn off replication if you don't wish future objects to be replicated. Here AWS does the heavy lifting for you :) https://docs.aws.amazon.com/AmazonS3/latest/dev/crr.html
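
For what it's worth, the "copy each existing object onto itself" step can also be done with boto3 rather than the CLI. A rough sketch with a placeholder bucket name (note that replication requires versioning on both buckets, and a single copy_object call only handles objects up to 5 GB):

```python
import boto3

s3 = boto3.client('s3')
SOURCE_BUCKET = 'my-source-bucket'  # placeholder

# Re-copy every existing object onto itself so the new PUT is picked up by replication.
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket=SOURCE_BUCKET):
    for obj in page.get('Contents', []):
        s3.copy_object(
            Bucket=SOURCE_BUCKET,
            Key=obj['Key'],
            CopySource={'Bucket': SOURCE_BUCKET, 'Key': obj['Key']},
            # An in-place copy must change something; replacing the metadata
            # satisfies that requirement without altering the object data.
            MetadataDirective='REPLACE',
        )
```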

1
votes

There are a few ways to achieve this.

You could use an SNS notification and cross-account IAM to trigger the Lambda. Read this: cross-account-s3-data-copy-using-lambda-function, which explains pretty well what you are trying to achieve.

Another approach is to deploy the Lambda function and all the required resources in the account that holds the files. You would need to create an S3 notification that triggers a Lambda function which copies the files to your account, or a CloudWatch schedule (a bit like a cron job) that triggers the Lambda, as sketched below. In this case the Lambda function and the trigger have to exist in the account that holds the files.
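
For the scheduled variant, a minimal sketch of wiring a CloudWatch Events (EventBridge) rule to the Lambda function with boto3; the function name, ARN and schedule below are placeholders:

```python
import boto3

# Placeholder names/ARNs for illustration only.
FUNCTION_NAME = 'copy-files'
LAMBDA_ARN = 'arn:aws:lambda:us-east-1:111111111111:function:copy-files'
RULE_NAME = 'copy-files-schedule'

events = boto3.client('events')

# Rule that fires every 15 minutes (rate or cron expressions both work).
rule_arn = events.put_rule(
    Name=RULE_NAME,
    ScheduleExpression='rate(15 minutes)',
)['RuleArn']

# Point the rule at the Lambda function.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{'Id': 'copy-files-target', 'Arn': LAMBDA_ARN}],
)

# Allow CloudWatch Events to invoke the function.
boto3.client('lambda').add_permission(
    FunctionName=FUNCTION_NAME,
    StatementId='AllowInvokeFromSchedule',
    Action='lambda:InvokeFunction',
    Principal='events.amazonaws.com',
    SourceArn=rule_arn,
)
```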

In both scenarios, the minimal IAM permissions the Lambda function needs are: read and write access to the relevant S3 buckets, permission to call STS in order to assume the cross-account role, and CloudWatch Logs permissions so that it can write its logs. The rest of the required IAM permissions will depend on the approach you take.
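
As an illustration of the STS part, a minimal sketch of assuming a role from inside the Lambda function; the role ARN is a placeholder, and the role in the other account must trust the Lambda's execution role:

```python
import boto3

# Placeholder ARN of a role in the other account that trusts this Lambda's execution role.
CROSS_ACCOUNT_ROLE_ARN = 'arn:aws:iam::111111111111:role/cross-account-s3-access'


def cross_account_s3_client():
    """Return an S3 client that uses temporary credentials from the assumed role."""
    creds = boto3.client('sts').assume_role(
        RoleArn=CROSS_ACCOUNT_ROLE_ARN,
        RoleSessionName='s3-copy-lambda',
    )['Credentials']

    return boto3.client(
        's3',
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'],
    )
```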