You can't specify a file path there. You must put the function code itself inline, and it is limited to 4096 bytes. If your code is bigger, you need to upload it to S3 first and use `S3Bucket` and `S3Key`.
Example:
```yaml
mastertestingLambdaDataDigestor:
  Properties:
    Code:
      ZipFile: |
        def handler(event, context):
            pass
    FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
    Handler: index.handler
    Role: SOMEROLE
    Runtime: python3.6
  Type: AWS::Lambda::Function
```

Note that with inline `ZipFile` code, CloudFormation places the source in a file named `index.py`, so the handler has to be `index.<function_name>` rather than `handler.kinesis_to_dynamodb`.
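If your code is over the 4096-byte limit, the same resource can instead point at a zip you have uploaded to S3 yourself. A minimal sketch, assuming you've already uploaded the archive (the bucket name and key below are placeholders):

```yaml
mastertestingLambdaDataDigestor:
  Properties:
    Code:
      S3Bucket: my-deployment-bucket   # placeholder: bucket holding the uploaded zip
      S3Key: lambda/handler.zip        # placeholder: key of the uploaded zip
    FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
    Handler: handler.kinesis_to_dynamodb
    Role: SOMEROLE
    Runtime: python3.6
  Type: AWS::Lambda::Function
```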
Another option is using `aws cloudformation package`. It will upload the zip file for you and transform your template into one with the correct paths. For this you'll have to put the zip file path directly in `Code`. For example:
```yaml
Resources:
  mastertestingLambdaDataDigestor:
    Properties:
      Code: /home/dariobenitez/Proyectos/dataflow/templates/lambda_template.zip
      FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
      Handler: handler.kinesis_to_dynamodb
      Role: SOMEROLE
      Runtime: python3.6
    Type: AWS::Lambda::Function
```
Then run:

```
aws cloudformation package --template-file my-template.yaml --s3-bucket my-bucket
```
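If you'd rather have the transformed template written to a file instead of printed to stdout, `package` also accepts `--output-template-file`; for example (the output file name here is just an example):

```
aws cloudformation package \
  --template-file my-template.yaml \
  --s3-bucket my-bucket \
  --output-template-file packaged.yaml
```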
It should output something like:
```yaml
Resources:
  mastertestingLambdaDataDigestor:
    Properties:
      Code:
        S3Bucket: my-bucket
        S3Key: fjklsdu903490f349034g
      FunctionName: mastertesting_Kinesis2DynamoDB_Datapipeline
      Handler: handler.kinesis_to_dynamodb
      Role: SOMEROLE
      Runtime: python3.6
    Type: AWS::Lambda::Function
```
You should then use this template to deploy your stack.
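For example, assuming the packaged template was written to packaged.yaml as above (the stack name is a placeholder):

```
# Creates the stack if it doesn't exist yet, otherwise updates it via a change set
aws cloudformation deploy \
  --template-file packaged.yaml \
  --stack-name my-data-pipeline
```

Add `--capabilities CAPABILITY_IAM` if your template also creates IAM resources; here it only references an existing role, so it isn't required.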