I also ran into a similar issue. My requirement was a bit different, so I will explain the issue and the solution we used in case it helps someone else. I started with the approach explained by @marianogg9.
In our case, I was using GitHub as the source and trying to deploy a StackSet that contains a Lambda function, so I had a Lambda Python file and a YAML template file in GitHub. To deploy, I was using the steps below, and ran into the issue that the Lambda function was not getting updated even though the StackSet was deployed every time:
- Get the files from GitHub.
- Zip the Lambda Python file and upload it to an existing S3 bucket.
- The YAML template has this bucket path and S3 key predefined (see the sketch after the quote below), so every time the pipeline ran we were simply overwriting the Lambda zip in S3. But as per the AWS documentation:
> Changes to a deployment package in Amazon S3 are not detected automatically during stack updates. To update the function code, change the object key or version in the template.
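For reference, this is roughly what the `Code` section of our template looked like (resource and bucket names here are just placeholders): the bucket and key never change between runs, so CloudFormation sees nothing to update even though the object behind that key has been overwritten.

```yaml
# Illustrative sketch of the original template (names are made up)
Resources:
  MyLambdaFunction:
    Type: AWS::Lambda::Function
    Properties:
      Handler: lambda_function.lambda_handler
      Runtime: python3.9
      Role: arn:aws:iam::111111111111:role/my-lambda-role  # placeholder role ARN
      Code:
        S3Bucket: my-lambda-artifacts   # existing bucket, fixed
        S3Key: lambda.zip               # same key every run, so no code update is detected
```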
This means you need to either use a new object key or a new object version every time. So first I enabled versioning on the S3 bucket, then I added a `versionId` parameter to the template. The idea was to upload the zip file to S3, use a CLI command to get the latest version ID of that object, and then override the `versionId` parameter through the parameter-overrides option of the `create-stack-instances` CLI command.

Since I had to create a StackSet, and CodePipeline does not support parameter overrides when the action provider is CloudFormation StackSet, I used another CodeBuild stage instead and ran CLI commands to create the StackSet and then create the stack instances. Using the CLI for the stack instances let me pass the parameter override, so I could update the version ID parameter used in the template, which in turn updates the Lambda function once the stack is deployed. Roughly, it ended up looking like the sketches below.
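First, a minimal sketch of the template change, with the object version exposed as a parameter and wired into `S3ObjectVersion` (names are illustrative, not our actual ones):

```yaml
# Illustrative template change: expose the S3 object version as a parameter
Parameters:
  versionId:
    Type: String
    Description: S3 object version of the Lambda deployment package

Resources:
  MyLambdaFunction:
    Type: AWS::Lambda::Function
    Properties:
      Handler: lambda_function.lambda_handler
      Runtime: python3.9
      Role: arn:aws:iam::111111111111:role/my-lambda-role  # placeholder role ARN
      Code:
        S3Bucket: my-lambda-artifacts
        S3Key: lambda.zip
        S3ObjectVersion: !Ref versionId   # changes every run, so the function code gets updated
```

And the CodeBuild stage runs something along these lines (a minimal buildspec sketch under my assumptions; account IDs, regions and names are placeholders, and on later runs you would use `update-stack-set` / `update-stack-instances` instead of the create commands):

```yaml
# Minimal buildspec.yml sketch for the CodeBuild stage (all names/IDs are placeholders)
version: 0.2

phases:
  build:
    commands:
      # Package the Lambda source and upload it to the versioned bucket
      - zip lambda.zip lambda_function.py
      - aws s3 cp lambda.zip s3://my-lambda-artifacts/lambda.zip
      # Get the version ID of the object we just uploaded
      - VERSION_ID=$(aws s3api head-object --bucket my-lambda-artifacts --key lambda.zip --query VersionId --output text)
      # Create the StackSet from the template (use update-stack-set on subsequent runs)
      - aws cloudformation create-stack-set --stack-set-name my-lambda-stackset --template-body file://template.yml --capabilities CAPABILITY_NAMED_IAM
      # Create stack instances, overriding versionId so the new package version is picked up
      - aws cloudformation create-stack-instances --stack-set-name my-lambda-stackset --accounts 111111111111 --regions us-east-1 --parameter-overrides ParameterKey=versionId,ParameterValue=$VERSION_ID
```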