Can't upload single file using wildcard
Orrison opened this issue · 6 comments
Unable to upload a single file generated in the job using a wildcard. I set it as SOURCE_DIR
and thought it would work, since the documentation for SOURCE_DIR
states "The local directory (or file) you wish to sync/upload to S3."
I have to use a wildcard for part of the file name because I can't always be sure what the version is, e.g.:
```yaml
SOURCE_DIR: 'package-name-*.zip'
```
But it always fails with the error:
```
warning: Skipping file /github/workspace/package-name-1.0.0.zip/. File does not exist.
```
Obviously it sees the file, or else it wouldn't have been able to grab that 1.0.0 version number, so it is strange that it can't find it. I do see that the path ends in `/.`,
so I wonder if it is treating it as a folder? Is there some way I can make it understand it is a single file and not a folder?
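A likely explanation (an assumption from reading the error, not confirmed by the action's author): the action passes SOURCE_DIR to `aws s3 sync`, whose source must be a directory, so a file path gets `/.` appended and is skipped. The wildcard itself is expanded by the shell before the CLI ever runs, which is why the warning already shows the resolved `1.0.0` filename. A local demonstration, no AWS needed:

```shell
# Create a scratch directory with a file matching the pattern.
mkdir -p /tmp/wildcard-demo
cd /tmp/wildcard-demo
touch package-name-1.0.0.zip

# The shell expands the glob before any command sees it, so `aws s3 sync`
# would receive the resolved file path (and then treat it as a directory):
echo package-name-*.zip   # → package-name-1.0.0.zip
```

For a single known file, `aws s3 cp <file> s3://<bucket>/<dest>/` works directly, since `cp` (unlike `sync`) accepts a file as its source.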
I was able to get this to work by not including SOURCE_DIR
and instead running it with these arguments:
```yaml
with:
  args: --exclude '*' --include '/package-name-*.zip'
```
Leaving this open to get confirmation that this is the best way to do it.
I run into the same issue even without a wildcard. I just want to upload a single file, not a directory. I expected the following workflow to work:
```yaml
on:
  push:
    branches:
      - s3upload

jobs:
  test-s3-upload:
    runs-on: ubuntu-latest
    name: Upload artifact to S3
    steps:
      - name: Prepare artifact
        run: |
          echo "Hello World" > README.txt
          touch "$(date).txt"
          printenv | grep GITHUB_ > env.txt
          tar cf archive.tar *.txt
          ls -l
          pwd
      - name: Upload to S3
        uses: jakejarvis/s3-sync-action@7ed8b11
        with:
          args: --acl public-read
        env:
          AWS_S3_BUCKET: ${{ secrets.TEST_AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.TEST_AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.TEST_AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: 'eu-central-1'
          SOURCE_DIR: 'archive.tar'
          DEST_DIR: 'test'
```
It fails with the following error:
```
warning: Skipping file /github/workspace/archive.tar/. File does not exist.
```
It says "warning", but the step is marked with ❌ and the job fails.
I get the expected result with the workaround proposed by @Orrison:
```yaml
- name: Upload to S3
  uses: jakejarvis/s3-sync-action@7ed8b11
  with:
    args: --acl public-read --exclude '*' --include 'archive.tar'
  env:
    AWS_S3_BUCKET: ${{ secrets.TEST_AWS_S3_BUCKET }}
    AWS_ACCESS_KEY_ID: ${{ secrets.TEST_AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.TEST_AWS_SECRET_ACCESS_KEY }}
    AWS_REGION: 'eu-central-1'
    DEST_DIR: 'test'
```
I've faced the same issue.
@jakejarvis, maybe add a new environment variable, SOURCE_FILE,
and when it's set, use the command suggested in #26 (comment)?
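To sketch the idea (SOURCE_FILE is a hypothetical input, not something this action currently supports, and the bucket/paths below are placeholders): the entrypoint could translate a single-file input into the exclude/include form. A minimal sketch that only prints the command it would run:

```shell
#!/bin/sh
# Hypothetical sketch: build the `aws s3 sync` command the action's
# entrypoint could run. When SOURCE_FILE (an imagined input) is set, the
# file's directory becomes the sync source and its basename becomes the
# --include filter. The command is printed here, not executed.
build_sync_cmd() {
  if [ -n "${SOURCE_FILE:-}" ]; then
    printf "aws s3 sync %s s3://%s/%s --exclude '*' --include '%s'\n" \
      "$(dirname "$SOURCE_FILE")" "$AWS_S3_BUCKET" "$DEST_DIR" \
      "$(basename "$SOURCE_FILE")"
  else
    printf "aws s3 sync %s s3://%s/%s\n" \
      "${SOURCE_DIR:-.}" "$AWS_S3_BUCKET" "$DEST_DIR"
  fi
}

AWS_S3_BUCKET=my-bucket
DEST_DIR=test
SOURCE_FILE=dist/archive.tar
build_sync_cmd
# → aws s3 sync dist s3://my-bucket/test --exclude '*' --include 'archive.tar'
```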
I faced the same issue. Right now I'm solving it by using this solution: #26 (comment)
Ditto, it doesn't work for me either.
Here is the console message; the bucket name has been changed to a generic one.
```shell
root@processor:~# aws s3 sync /root/data-processed/*.csv s3://bucket/sub
warning: Skipping file /root/data-processed/2023-04-22T07:51:48.595750.out.csv/. File does not exist.
```
So, it clearly knows the file exists but doesn't do anything about it.
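The same directory-source rule likely applies here: the shell expands `/root/data-processed/*.csv` before `aws` runs, so `sync` receives one or more literal file paths, never the pattern itself. A local demonstration, with the working filter form (using the generic bucket name from the comment above, illustrative only) noted in a comment:

```shell
# The working form keeps the directory as the source and filters by name:
#   aws s3 sync /root/data-processed s3://bucket/sub --exclude '*' --include '*.csv'

# Demonstration that the glob is expanded by the shell, not by aws:
rm -rf /tmp/csv-demo
mkdir -p /tmp/csv-demo
touch /tmp/csv-demo/a.csv /tmp/csv-demo/b.csv
set -- /tmp/csv-demo/*.csv
echo "$#"   # number of arguments the glob expanded into → 2
```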