I have logs in CloudWatch that I want to export to S3 every day, and I am using AWS Lambda to do this.
I created a function in AWS Lambda and use a CloudWatch Events rule as the trigger; setting this up created an event rule in CloudWatch. When I execute the Lambda function, it runs successfully and a file named 'aws-log-write-test' is created in the S3 bucket, but there is no other data or file in the bucket. The file contains the text 'Permission Check Successful'.
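For reference, the destination bucket's policy (which I assume is what that permission check verifies) was set up roughly like this, following the CloudWatch Logs export-to-S3 documentation; the bucket name and region are the ones from my setup:

import json
import boto3

bucket = 'abc-logs'

# Allow CloudWatch Logs to read the bucket ACL and write exported objects
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "logs.us-west-2.amazonaws.com"},
            "Action": "s3:GetBucketAcl",
            "Resource": "arn:aws:s3:::{}".format(bucket)
        },
        {
            "Effect": "Allow",
            "Principal": {"Service": "logs.us-west-2.amazonaws.com"},
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::{}/*".format(bucket),
            "Condition": {"StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}}
        }
    ]
}

boto3.client('s3').put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))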
This is my Lambda function:
import boto3
import collections
from datetime import datetime, date, time, timedelta

region = 'us-west-2'

def lambda_handler(event, context):
    # Export window: midnight yesterday to midnight today (UTC)
    yesterday = datetime.combine(date.today() - timedelta(1), time())
    today = datetime.combine(date.today(), time())
    unix_start = datetime(1970, 1, 1)

    client = boto3.client('logs')
    # Ask CloudWatch Logs to export the log group to the S3 bucket,
    # with the time range expressed as seconds since the Unix epoch
    response = client.create_export_task(
        taskName='export_cw_to_s3',
        logGroupName='ABC',
        logStreamNamePrefix='ABCStream',
        fromTime=int((yesterday - unix_start).total_seconds()),
        to=int((today - unix_start).total_seconds()),
        destination='abc-logs',
        destinationPrefix='abc-logs-{}'.format(yesterday.strftime("%Y-%m-%d"))
    )
    return 'Response from export task at {} :\n{}'.format(datetime.now().isoformat(), response)
This is the response when I execute the Lambda function:
Response from export task at 2018-01-05T10:57:42.441844 :\n{'ResponseMetadata': {'RetryAttempts': 0, 'HTTPStatusCode': 200, 'RequestId': 'xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx', 'HTTPHeaders': {'x-amzn-requestid': 'xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx', 'date': 'Fri, 05 Jan 2018 10:57:41 GMT', 'content-length': '49', 'content-type': 'application/x-amz-json-1.1'}}, u'taskId': u'xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx'}
START RequestId: xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx Version: $LATEST
END RequestId: xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx
REPORT RequestId: xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx Duration: 1418.13 ms Billed Duration: 1500 ms Memory Size: 128 MB Max Memory Used: 36 MB
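Since create_export_task only queues the export, a rough way I can check what happened to the task afterwards is describe_export_tasks with the taskId from the response above (the ID below is just a placeholder):

import boto3

client = boto3.client('logs', region_name='us-west-2')

# Look up the export task by the taskId returned from create_export_task
# and print its status code (e.g. PENDING, RUNNING, COMPLETED, FAILED)
resp = client.describe_export_tasks(taskId='xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx')
for task in resp['exportTasks']:
    print(task['taskName'], task['status']['code'], task['status'].get('message', ''))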