tclahr/uac

Feature Request - Direct stream to S3

dfirhoze opened this issue · 6 comments

Hello - wondering if there is anything on your roadmap to add an option for sending an artifact collection directly to S3 via a pre-signed URL? Thanks!

Yes, there is. I will keep this issue open for tracking it. Thanks for the suggestion.

Merged into develop branch via PR #41

@dfirhoze I have merged the code into the develop branch. Can you test it, please?
You can use --s3-presigned-url URL to transfer the output file, and --s3-presigned-url-log-file URL to transfer the log file.
Also, I recommend enclosing the URL in single quotes so the shell does not interpret special characters (such as &) in the query string.

Wow, thank you for the fast turnaround on this!

I did test it with the below syntax, but it ends up just printing the help page to the console. Am I missing something?
./uac -p ir_triage --s3-presigned-url 'S3PUTURL'

I then ran curl with the same URL to rule the URL out as the issue, and I was able to send a file to the bucket:
curl -X PUT -T file 'S3PUTURL'
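For anyone pre-validating their URL the same way, a small wrapper sketch around that curl call may help; it mirrors what the tool does after acquisition (PUT the tarball, then the log). The file names, URL placeholders, and the DRY_RUN switch here are all my own, not part of UAC:

```shell
#!/bin/sh
# Sketch: PUT a UAC output file and its log to pre-signed URLs via curl.
# DRY_RUN=1 (the default here) prints the commands instead of hitting the network.
set -eu

TARBALL="uac-hostname-linux-20240101.tar.gz"   # hypothetical file names
LOGFILE="uac-hostname-linux-20240101.log"
OUT_URL="${OUT_URL:-S3PUTURL}"                 # placeholder pre-signed URLs
LOG_URL="${LOG_URL:-S3LOGPUTURL}"

put_file() {
    if [ "${DRY_RUN:-1}" = "1" ]; then
        echo "curl -f -X PUT -T $1 [pre-signed URL]"
    else
        # -f makes curl exit non-zero on an HTTP error,
        # e.g. 403 when the URL has expired or the signature is wrong
        curl -f -s -X PUT -T "$1" "$2"
    fi
}

put_file "$TARBALL" "$OUT_URL"
put_file "$LOGFILE" "$LOG_URL"
```

The -f flag is the useful part: a plain `curl -X PUT` exits 0 even when S3 rejects the upload, so a script cannot tell success from failure without it.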

Again, thanks so much for working on this so quickly.

Hi,
You still need to set a destination. UAC performs the acquisition, stores the .tar.gz and .log files in the destination folder, and then transfers them to S3. You can use --delete-local-on-successful-transfer to delete both the .tar.gz and .log files after a successful transfer.

./uac -p ir_triage /tmp --s3-presigned-url 'S3PUTURL'
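Putting the flags from this thread together, a full invocation would look something like the following (both URLs are placeholders for your own pre-signed PUT URLs):

```shell
# Acquire the ir_triage profile, stage the output in /tmp, PUT both the
# tarball and the log to S3, and remove the local copies on success.
./uac -p ir_triage /tmp \
  --s3-presigned-url 'S3PUTURL' \
  --s3-presigned-url-log-file 'S3LOGPUTURL' \
  --delete-local-on-successful-transfer
```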

Ahh right! Yes I have tested and confirmed it works! Thank you!