Challenges with Using the Artifacts Plugin for Build and Test Jobs
I attempted to use the Artifacts Plugin for my CI/CD pipeline but ran into several issues that suggest it might not be suitable for my use case. Below is an overview of my workflow and the challenges I encountered:
Workflow:

Setup Phase:
- Create a `setup.tgz` archive with bundle dependencies.
- Upload `setup.tgz` to the artifact storage.

Build Phase:
- Download `setup.tgz` in a subsequent build job.
- Produce a `build.tgz` archive (containing only files with specific extensions like `*.xctestrun`, `*.app`, `*.xctest`).
- Upload `build.tgz`.

Test Phase:
- Download both `setup.tgz` and `build.tgz` in a test job.
- Produce a `test.tgz` archive and upload it.
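For context, here is a rough sketch of how the pipeline is wired today, with compression and decompression handled inside the scripts themselves (the step labels and script paths other than the test script are illustrative, not my real ones):

```yaml
steps:
  - label: "Setup"
    command: ".buildkite/scripts/setup.sh"   # creates setup.tgz
    plugins:
      - artifacts#v1.9.3:
          upload: setup.tgz

  - wait

  - label: "Build"
    command: ".buildkite/scripts/build.sh"   # extracts setup.tgz, creates build.tgz
    plugins:
      - artifacts#v1.9.3:
          download: setup.tgz
          upload: build.tgz

  - wait

  - label: "Test (unit)"
    command: ".buildkite/scripts/test.sh --type unit"   # extracts both archives, creates test.tgz
    plugins:
      - artifacts#v1.9.3:
          download:
            - setup.tgz
            - build.tgz
          upload: test.tgz
```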
Challenges:
Shared `compressed` Option:
- The `compressed` option seems to be shared between the `upload` and `download` steps, preventing the use of distinct archive names.
- Using the same `compressed` value (e.g., `compressed: artifacts.tgz`) causes an error during the test job, as the agent cannot distinguish between different artifacts that already exist in the storage.
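To illustrate, this is roughly what the test step would have to look like; the paths below are placeholders, and as far as I can tell the single `compressed` key applies to both the upload and the downloads of the step:

```yaml
# Illustration only: one `compressed` value per plugin instance, so the archive
# being uploaded and the archives being downloaded cannot have different names here.
- label: "Test (unit)"
  command: ".buildkite/scripts/test.sh --type unit"
  plugins:
    - artifacts#v1.9.3:
        download:
          - setup-files/**/*        # placeholder path
          - build-products/**/*     # placeholder path
        upload: test-results/**/*   # placeholder path
        compressed: artifacts.tgz   # shared by upload and download
```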
Manual Compression Issues:
Currently, I have to compress and decompress the artifacts manually. When doing the compression manually, the `upload` option does not inherit the value of environment variables defined within the step, so the variable has to be set in the global pipeline environment or hardcoded:
```yaml
env:
  BUNDLE_ARTIFACTS_COMPRESSED: setup.tgz
  BUILD_FOR_TESTING_ARTIFACTS_COMPRESSED: build.tgz

unit_test: &unit_test
  label: "Test (unit)"
  command: ".buildkite/scripts/test.sh --type unit"
  env:
    TEST_ARTIFACTS_COMPRESSED: test.tgz
  plugins:
    - artifacts#v1.9.3:
        upload: $TEST_ARTIFACTS_COMPRESSED # This value will not be received by the plugin
        download:
          - $BUILD_FOR_TESTING_ARTIFACTS_COMPRESSED # These values will be received by the plugin
          - $BUNDLE_ARTIFACTS_COMPRESSED
```
Performance Concerns:
Compressing large outputs (e.g., `.app` and `.xctest` bundles) takes approximately 5-7 minutes, which seems inefficient for an input directory of ~17 GiB. I'm currently running `tar -czf <archive> [files...]` to do the compression, and I'm curious whether there is a faster way to perform it.
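One alternative I've been considering is a parallel compressor instead of single-threaded gzip. A minimal sketch, assuming `pigz` (or `zstd`) is installed on the agents; the product path below is a placeholder:

```yaml
# Sketch only: pipe the tar stream through pigz (parallel gzip) so the output is
# still a .tgz; pigz uses all available cores by default, and -1 trades ratio for speed.
- label: "Build"
  command: |
    tar -cf - DerivedData/Build/Products | pigz -1 > build.tgz
    # Alternative, if a .tar.zst is acceptable downstream:
    # tar -cf - DerivedData/Build/Products | zstd -T0 -o build.tar.zst
  plugins:
    - artifacts#v1.9.3:
        upload: build.tgz
```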
Request for Assistance:
- Is there a way to distinguish between upload and download artifacts when using the `compressed` option?
- Can the artifact upload plugin inherit environment variables defined within the step instead of requiring global scope?
- Any tips on optimizing the compression of large directories (~17 GiB) to improve efficiency?