Question
I've set up a CodePipeline with the end goal of having a core service reside on S3 as a private Maven repo for other pipelines to rely on. When the core service is updated and pushed to AWS CodeCommit, the pipeline should run, test it, build a jar using a Maven Docker image, then push the resulting jar to S3, where it can be accessed by other applications as needed.
Unfortunately, while the CodeBuild project works exactly how I want it to, uploading XYZCore.jar to /release on the bucket, the automated pipeline itself does not. Instead, it uploads to an "XYZCorePipeline" folder, which contains the input and output artifacts of the build, and the output artifact itself is a zip file whose name is a random string of characters. I checked the pipeline and it's invoking the build project correctly, but pipeline-triggered builds always output there, while standalone runs of the CodeBuild project output the way I'd like, letting me take advantage of things like versioning. What's the best way to fix the settings so the two builds match?
Answer 1:
Unfortunately, CodePipeline does not support this use case.
As a workaround, you could upload the artifact to S3 yourself by invoking the AWS CLI (aws s3 cp ...) from the post_build phase of your buildspec.yml.
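For reference, a minimal buildspec.yml sketch of that workaround. The bucket name (my-maven-repo-bucket), the /release prefix, and the target/XYZCore.jar path are placeholders taken from the question or assumed from a standard Maven layout; adjust them to your project.

    # buildspec.yml (sketch) -- bucket name and jar path are placeholders
    version: 0.2

    phases:
      build:
        commands:
          # Run tests and package the jar with Maven
          - mvn clean package
      post_build:
        commands:
          # Copy the built jar straight to the release prefix on the bucket,
          # independently of CodePipeline's zipped output artifact
          - aws s3 cp target/XYZCore.jar s3://my-maven-repo-bucket/release/XYZCore.jar

    artifacts:
      files:
        - target/XYZCore.jar

Note that the CodeBuild service role will need s3:PutObject permission on that bucket for the copy to succeed; the pipeline's own artifact store permissions don't cover an arbitrary destination bucket.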
Source: https://stackoverflow.com/questions/43131439/aws-codepipeline-adding-artifacts-to-s3-in-less-useful-format-than-running-steps