This repo holds a simple Caddy server v2 upload handler.
**Warning:** You have to secure the upload URL, as the nature of this handler/plugin is to save uploaded artifacts to disk.
The following modules are used together with this handler:
- templates
- file_server
- github.com/caddyserver/jsonc-adapter
- upload :-)
Name | Description |
---|---|
dest_dir | The directory into which the files will be uploaded. |
file_field_name | The field name for the multipart form upload. |
response_template | The response template used after a successful upload. |
max_form_buffer | The maximum buffer size for ParseMultipartForm. It accepts all size values supported by go-humanize. A placeholder can also be used here. The default size is 1G. |
max_form_buffer_int | The maximum buffer size for ParseMultipartForm, given as an integer value. |
max_filesize | The maximum size allowed for the upload. It accepts all size values supported by go-humanize. Reads of more bytes will return an error with HTTP status 413. A placeholder can also be used here. The default size is 1G. |
max_filesize_int | The maximum size in bytes allowed for the upload, given as an integer value. Reads of more bytes will return an error with HTTP status 413. |
notify_url | This URL will be called after a successful upload. The only supported scheme is https. |
notify_method | The HTTP method used to call the notify_url. |
insecure | This boolean flag configures whether TLS certificate verification is skipped for the notify_url call. |
capath | The parameter where you can define the CA file name for the notify_url. |
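As a minimal sketch (not part of this module, just an illustration under the assumption stated in the table that sizes are parsed with go-humanize), here is how strings like the ones used for max_form_buffer and max_filesize map to byte counts:
---
package main

import (
	"fmt"

	"github.com/dustin/go-humanize"
)

func main() {
	// go-humanize converts human readable size strings into byte counts;
	// these are the kinds of values max_filesize and max_form_buffer accept.
	for _, s := range []string{"1G", "5GB", "4MB"} {
		n, err := humanize.ParseBytes(s)
		if err != nil {
			fmt.Println(s, "->", err)
			continue
		}
		fmt.Printf("%s -> %d bytes\n", s, n)
	}
}
---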
Because I prefer the JSON config, the configuration snippet here is written in JSON syntax.
"handle": [
{
"MyTlsSetting": {},
"dest_dir": "upload", (1)
"handler": "upload",
"max_filesize": "5GB", (2)
"max_form_buffer_int": 1000000, (3)
"response_template": "upload-resp.txt" (4)
}
]
1. Destination directory on the server side
2. Maximum possible upload size
3. Maximum buffer for uploading
4. The response template which will be used for the response after the upload
A full working example is in
docker-files/opt/webroot/config/Caddyfile-upload.json
Here is an example Caddyfile which expects that the environment variable APPPORT is set.
{
order upload before file_server
log {
level DEBUG
}
}
{$APPPORT} {
root .
file_server browse
templates
@mypost method POST
upload @mypost {
dest_dir tmp-upl
max_form_buffer 1G
max_filesize 4MB
response_template templates/upload-resp-template.txt
}
log {
output file access.log
}
}
You can get this image from Docker Hub.
The default listen port must be defined with this variable
APPPORT=:2011
---
podman run --rm --network host --name caddy-test \
--env APPPORT=:8888 -it \
docker.io/me2digital/caddyv2-upload:latest
# or
docker run --name caddy-test --rm \
docker.io/me2digital/caddyv2-upload:latest
---
When you run the image with port 8888, you can use curl or any other tool to post (upload) files.
It’s not necessary to use -X POST, as explained in the blog post
UNNECESSARY USE OF CURL -X
Here is an example call with curl:
curl -v --form myFile=@README.adoc http://localhost:8888/templates/upload-template.html
* Trying 127.0.0.1:8888...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 8888 (#0)
> POST /templates/upload-template.html HTTP/1.1
> Host: localhost:8888
> User-Agent: curl/7.68.0
> Accept: */*
> Content-Length: 2492
> Content-Type: multipart/form-data; boundary=------------------------58b770bc61c0e691
> Expect: 100-continue
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 100 Continue
* We are completely uploaded and fine
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Accept-Ranges: bytes
< Content-Length: 299
< Etag: "rbb1gx8b"
< Last-Modified: Tue, 03 May 2022 11:34:09 GMT
< Server: Caddy
< Date: Thu, 19 May 2022 21:45:07 GMT
<
http.request.uri.path: {{placeholder "http.request.uri.path"}}
http.request.uuid {{placeholder "http.request.uuid" }}
http.request.host {{placeholder "http.request.host" }}
http.upload.filename: {{placeholder "http.upload.filename"}}
http.upload.filesize: {{placeholder "http.upload.filesize"}}
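Besides curl, the upload can also be scripted. Here is a minimal Go sketch that posts one file as a multipart form, reusing the myFile field name and the local test URL from the curl example above (adjust the URL, field name, and file path to your own setup):
---
package main

import (
	"bytes"
	"fmt"
	"io"
	"mime/multipart"
	"net/http"
	"os"
)

func main() {
	// Build a multipart/form-data body with a single file part named
	// "myFile", matching the curl --form example above.
	var body bytes.Buffer
	mw := multipart.NewWriter(&body)
	part, err := mw.CreateFormFile("myFile", "README.adoc")
	if err != nil {
		panic(err)
	}
	f, err := os.Open("README.adoc")
	if err != nil {
		panic(err)
	}
	defer f.Close()
	if _, err := io.Copy(part, f); err != nil {
		panic(err)
	}
	mw.Close() // finalize the multipart body (writes the closing boundary)

	resp, err := http.Post("http://localhost:8888/templates/upload-template.html",
		mw.FormDataContentType(), &body)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(out))
}
---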
The max_form_buffer parameter is passed directly to the readForm function and determines whether the uploaded file is written to a temporary file on disk or kept in memory. This has a direct impact on the performance and disk usage of this module. Keep in mind that when this parameter is low and the upload is a big file, there will be a lot of disk I/O.
INFO: The observation from https://github.com/etherwvlf in the issue "Memory issues on large uploads" was that the initial memory usage is 7-8 times higher than the configured max_form_buffer size.
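For illustration only, here is the Go standard library behavior that such a memory limit feeds into (this is not this module's code, just a sketch with a hypothetical handler and the myFile field name from the examples above): ParseMultipartForm keeps each part in memory up to the given limit and spills the rest to temporary files on disk, which is why a low max_form_buffer combined with large uploads produces a lot of disk I/O.
---
package main

import (
	"fmt"
	"log"
	"net/http"
)

// handleUpload is a hypothetical handler that sketches the standard library
// behavior behind max_form_buffer: ParseMultipartForm keeps up to maxMemory
// bytes of the form in RAM and writes anything beyond that to temp files.
func handleUpload(w http.ResponseWriter, r *http.Request) {
	const maxMemory = 1 << 20 // 1 MiB kept in memory, larger parts spill to disk
	if err := r.ParseMultipartForm(maxMemory); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	file, header, err := r.FormFile("myFile")
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	defer file.Close()
	fmt.Fprintf(w, "received %s (%d bytes)\n", header.Filename, header.Size)
}

func main() {
	http.HandleFunc("/upload", handleUpload)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
---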