Can someone help me with how to copy multiple files from one S3 folder to another using an AWS CLI Lambda invoke?
Updated question
Requirement:
- Multiple files are loaded into the source bucket/folder. These files have a retention of 1 day.
- So I need to copy them to an archive folder in another bucket.
- Target folder structure: bucket/folder/yyyy-mm-dd/
Existing process: an S3 event is configured on the source bucket with a prefix filter for the specific folder. The Lambda code (pasted below) does the work.
Issue: if I ever have to invoke the Lambda function manually, how do I pass the key, since there are multiple files (multiple keys)?
Existing code for single file:
aws lambda invoke --function-name abcdeffe_job --payload '{"Records":[{"eventTime": "2020-03-07T23:38:16.762Z","s3":{"bucket": {"name": "xxxx-test"},"object": {"key": "lambda-test/account.csv"}}}]}' abc.txt
I have got multiple account files, such as account_1.csv, account_2.csv, and so on. How can I pass the key here?
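One option (a sketch only, not tested) is to skip hand-crafting the payload and instead list the objects under the source prefix with boto3, then invoke the function once per key, building the same kind of synthetic S3 event for each file. The bucket name, prefix, function name, and event time below are taken from the example command above and are placeholders for your environment:

import json
import boto3

s3 = boto3.client('s3')
lambda_client = boto3.client('lambda')

# Placeholders copied from the sample invoke command; adjust to your environment.
SOURCE_BUCKET = 'xxxx-test'
SOURCE_PREFIX = 'lambda-test/'
FUNCTION_NAME = 'abcdeffe_job'
EVENT_TIME = '2020-03-07T23:38:16.762Z'   # the handler uses this to build the yyyy-mm-dd folder

def invoke_for_each_object():
    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=SOURCE_BUCKET, Prefix=SOURCE_PREFIX):
        for obj in page.get('Contents', []):
            key = obj['Key']
            if key.endswith('/'):
                continue  # skip "folder" placeholder objects
            event = {
                'Records': [{
                    'eventTime': EVENT_TIME,
                    's3': {
                        'bucket': {'name': SOURCE_BUCKET},
                        'object': {'key': key},
                    },
                }],
            }
            # One invocation per file, mirroring what the S3 trigger does.
            response = lambda_client.invoke(
                FunctionName=FUNCTION_NAME,
                Payload=json.dumps(event),
            )
            print(key, response['StatusCode'])

if __name__ == '__main__':
    invoke_for_each_object()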
Lambda code:
import json
import boto3
import urllib.parse
import time
s3_client = boto3.client('s3')

def lambda_handler(event, context):
    file_event_time = event['Records'][0]['eventTime']
    print("file_event_time :", file_event_time)
    ts = time.strptime(file_event_time[:19], "%Y-%m-%dT%H:%M:%S")
    dt = time.strftime("%Y-%m-%d", ts)
    # Bucket Name where file was uploaded
    source_bucket_name = event['Records'][0]['s3']['bucket']['name']
    print("source_bucket_name : ", source_bucket_name)
    # Bucket Name where file to be uploaded
    destination_bucket_name = 'test'
    # Filename of object (with path)
    #file_key_name = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    file_key_name = event['Records'][0]['s3']['object']['key']
    print("file_key_name : ", file_key_name)
    file_name = file_key_name.split("/")[1]
    print("file_name : ", file_name)
    # Copy Source Object
    copy_source_object = {'Bucket': source_bucket_name, 'Key': file_key_name}
    print(" The target folder to be created for :", dt)
    destination_file_path = "art_jobs" + "/" + dt + "/" + file_name
    print("destination_file_path : ", destination_file_path)
    try:
        response = s3_client.get_object(Bucket=source_bucket_name, Key=file_key_name)
        print("response :", response)
        s3_client.copy_object(CopySource=copy_source_object, Bucket=destination_bucket_name, Key=destination_file_path)
    except Exception as e:
        print(e)
        raise e
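If you prefer a single manual invoke carrying several files, the S3 event format does allow more than one entry in Records, but the handler above only reads Records[0]. A minimal sketch of the same logic, iterated over every record (untested, same placeholder destination bucket as above), would look like this:

import time
import boto3

s3_client = boto3.client('s3')
destination_bucket_name = 'test'

def lambda_handler(event, context):
    # Process every record in the event, not just the first one.
    for record in event['Records']:
        ts = time.strptime(record['eventTime'][:19], "%Y-%m-%dT%H:%M:%S")
        dt = time.strftime("%Y-%m-%d", ts)
        source_bucket_name = record['s3']['bucket']['name']
        file_key_name = record['s3']['object']['key']
        file_name = file_key_name.split("/")[1]
        destination_file_path = "art_jobs/" + dt + "/" + file_name
        s3_client.copy_object(
            CopySource={'Bucket': source_bucket_name, 'Key': file_key_name},
            Bucket=destination_bucket_name,
            Key=destination_file_path,
        )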
abc.txt at the end of your sample invoke command? Basically, please describe the end-to-end process you would like to implement. – John Rotenstein