7
votes

I set up AWS CodeBuild to write out a large number of artifact files to S3. The CodeBuild buildspec yaml file defines which files to include, and the CodeBuild project settings define the target S3 bucket. On its own, this all works correctly.
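For reference, the relevant part of the buildspec looks something like this (the build command and file paths are illustrative, not my actual project):

```yaml
version: 0.2

phases:
  build:
    commands:
      - ./build.sh            # placeholder for the real build step

artifacts:
  files:
    - 'dist/**/*'             # illustrative output paths
    - 'config/*.json'
  discard-paths: no
```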

It appears that when you trigger AWS CodeBuild from AWS CodePipeline, CodePipeline ignores CodeBuild's artifact settings and instead forces the artifacts into a zip file in a CodePipeline S3 bucket.

Is there a way to use CodePipeline but have it respect AWS CodeBuild's artifact settings?

1
I'm looking for an answer too. Let me know if you figured out a way - manikawnth
Though this was 4 months ago and services change quickly, the short answer at the time was no, you couldn't. We had to add aws-cli calls to the build script to push the files we wanted to S3 directly, outside of the CodeBuild artifact config. - Mike Biglan MS
OK. Is aws-cli available as part of the CodeBuild environment? I was planning to use a REST API in the build step to push to S3; aws-cli would be a much better option. - manikawnth
We went the Docker route, so it was an extra install. I don't know about the containers that AWS provides, though. - Mike Biglan MS

1 Answer

1
votes

CodeBuild also gives you access to aws-cli.

You can edit the buildspec.yaml file and upload the artifacts to S3 yourself with aws-cli commands (see the sketch below). Alternatively, you can create a .sh file, give it execute permissions, and have the build run that shell script to upload the artifacts to the S3 bucket.
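For example, a minimal buildspec sketch of this approach (the bucket name, prefix, and output directory are placeholders):

```yaml
version: 0.2

phases:
  build:
    commands:
      - ./build.sh                  # your existing build step
  post_build:
    commands:
      # Push the build output to S3 directly, bypassing CodePipeline's
      # zipped artifact handling. Bucket and prefix are placeholders.
      - aws s3 sync dist/ s3://my-artifact-bucket/my-app/ --delete
```

The same `aws s3 sync` (or `aws s3 cp`) call could instead live in a separate deploy.sh that you `chmod +x` and invoke from the post_build phase.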

You will also need to grant the CodeBuild service role the right permissions on the S3 bucket.
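The policy statement attached to the service role would look roughly like this (the bucket name is a placeholder; `s3:ListBucket` and `s3:DeleteObject` are only needed if you use `aws s3 sync --delete`):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-artifact-bucket",
        "arn:aws:s3:::my-artifact-bucket/*"
      ]
    }
  ]
}
```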