I'm trying to hook my GitHub repo up to S3 so that every time there's a commit, AWS CodePipeline deploys the ./<path>/public folder to a specified S3 bucket.
So far in my pipeline, the Source stage works (it's hooked to GitHub and picks up new commits), but the Deploy stage fails with:
Action execution failed: BundleType must be either YAML or JSON.
This is how I set them up:
CodePipeline
- Action name: Source
- Action provider: GitHub
- Repository: account/repo
- Branch: master
- GitHub webhooks
CodeDeploy
- Compute type: AWS Lambda
- Service role: myRole
- Deployment settings: CodeDeployDefault.LambdaAllAtOnce
IAM Role: myRole
- AWS Service
- Choose the service that will use this role: Lambda / CodeDeploy
- Select your use case: CodeDeploy
- Policies: AWSCodeDeployRole
I understand that there must be a buildspec.yml file in the root folder of the repo. I've tried a few example files I could find, but they don't seem to work. What did I do wrong, or how should I edit the buildspec file to do what I want?
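For context, this is the general shape I've gathered a buildspec.yml should have for a static site that needs no real build step (a sketch only; the base-directory value uses the same <path> placeholder as above and is not confirmed to work yet):

```yaml
# Minimal buildspec.yml sketch for passing a static site folder through
# CodeBuild unchanged. <path> is a placeholder for the real folder name.
version: 0.2

phases:
  build:
    commands:
      # No compilation needed; the files are already built.
      - echo "Static site, nothing to build"

artifacts:
  # Everything under base-directory becomes the output artifact.
  files:
    - '**/*'
  base-directory: <path>/public
```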
Update
Thanks to @Milan Cermak. I understand I need to do:
CodePipeline:
- Stage 1: Source: hooked to the GitHub repo. This one works.
- Stage 2: Build: use CodeBuild to grab only the wanted folder, using a buildspec.yml file in the root folder of the repo.
- Stage 3: Deploy:
  - Action provider: S3
  - Input artifacts: OutputArtifacts (the result of Stage 2)
  - Bucket: the bucket that hosts the static website
CodePipeline works now. However, the output contains only the files (.html) at the top level, not the folders nested inside the public folder. I've checked this and figured out how to strip the path prefix of a nested folder with discard-paths: yes, but that flattens everything, and I'm unable to get all the sub-folders inside the ./<path>/public folder into the bucket. Any suggestions?
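For what it's worth, the artifacts section I'm experimenting with now replaces discard-paths with a recursive glob plus base-directory (my reading of the buildspec artifacts reference: paths are then kept relative to base-directory, so the folder structure should survive):

```yaml
artifacts:
  files:
    # '**/*' matches files in sub-folders recursively,
    # unlike discard-paths: yes, which flattens them.
    - '**/*'
  # Paths in the artifact are relative to this directory,
  # so public/css/site.css arrives as css/site.css.
  base-directory: <path>/public
```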