S3 Uploads

Lightweight "drop-in" for storing WordPress uploads on Amazon S3 instead of the local filesystem.
|
A Human Made project. Maintained by @joehoyle. |
S3 Uploads is a WordPress plugin to store uploads on S3. It aims to be a lightweight "drop-in" for storing uploads on Amazon S3 instead of the local filesystem.
It focuses on providing a robust S3 interface with no bells and whistles: no WP-Admin UI and little else. It comes with helpful WP-CLI commands for generating IAM users, listing files on S3, and migrating your existing library to S3.
- PHP >= 7.1
- WordPress >= 5.3
composer require humanmade/s3-uploads
Note: Composer's autoloader must be loaded before S3 Uploads is loaded. We recommend loading it in your wp-config.php before wp-settings.php is loaded, as shown below.
require_once __DIR__ . '/vendor/autoload.php';
If you do not use Composer to manage plugins or other dependencies, you can install the plugin manually. Download the manual-install.zip file from the Releases page and extract the ZIP file to your plugins directory.
You can also git clone this repository and run composer install in the plugin folder to pull in its dependencies.
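For example, a manual git-based install might look like this (assuming the standard wp-content/plugins location and the humanmade/S3-Uploads GitHub repository):

```shell
# Clone the plugin into the plugins directory and install its dependencies.
cd wp-content/plugins
git clone https://github.com/humanmade/S3-Uploads.git
cd S3-Uploads
composer install
```

The resulting directory name (S3-Uploads) is what you will later pass to wp plugin activate.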
Once you've installed the plugin, add the following constants to your wp-config.php:
define( 'S3_UPLOADS_BUCKET', 'my-bucket' );
define( 'S3_UPLOADS_REGION', '' ); // the S3 bucket region slug (e.g. us-east-1), not the full endpoint URL
// You can set key and secret directly:
define( 'S3_UPLOADS_KEY', '' );
define( 'S3_UPLOADS_SECRET', '' );
// Or if using IAM instance profiles, you can use the instance's credentials:
define( 'S3_UPLOADS_USE_INSTANCE_PROFILE', true );
Refer to the AWS region list (http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region) for valid S3_UPLOADS_REGION values.
An optional path prefix after the bucket name is also supported. For example, if you want to upload all files to 'my-folder' inside a bucket called 'my-bucket', you can use:
define( 'S3_UPLOADS_BUCKET', 'my-bucket/my-folder' );
You must then enable the plugin. To do this via WP-CLI, run:
wp plugin activate S3-Uploads
The plugin name must match the directory you have cloned S3 Uploads into. If you're using Composer, use:
wp plugin activate s3-uploads
Next, verify your setup using the verify command:
wp s3-uploads verify
You will need to create your IAM user yourself or attach the necessary permissions to an existing user. You can output the required policy via wp s3-uploads generate-iam-policy.
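As a sketch, the generated policy could be saved to a file and attached to an existing IAM user with the AWS CLI; the user name and policy name below are placeholders, not anything the plugin requires:

```shell
# Write the generated policy JSON to a file.
wp s3-uploads generate-iam-policy > s3-uploads-policy.json

# Attach it to an existing IAM user (replace "my-s3-uploads-user").
aws iam put-user-policy \
    --user-name my-s3-uploads-user \
    --policy-name s3-uploads \
    --policy-document file://s3-uploads-policy.json
```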
S3 Uploads comes with a WP-CLI command for listing files in the S3 bucket, which is useful for debugging:
wp s3-uploads ls [<path>]
If you have an existing media library with attachment files, use the command below to copy them all from local disk to S3.
wp s3-uploads upload-directory <from> <to> [--verbose]
For example, to migrate your whole uploads directory to S3, you'd run:
wp s3-uploads upload-directory /path/to/uploads/ uploads
There is also an all-purpose cp command for arbitrary copying to and from S3.
wp s3-uploads cp <from> <to>
Note: as either <from> or <to> can be an S3 or local location, you must specify the full S3 location via s3://mybucket/mydirectory, for example cp ./test.txt s3://mybucket/test.txt.
WordPress's (and therefore S3 Uploads') default behaviour is that all uploaded media files are publicly accessible. In certain cases this may not be desirable. S3 Uploads supports setting S3 objects to a private ACL and providing temporarily signed URLs for all files that are marked as private.
S3 Uploads does not make assumptions or provide UI for marking attachments as private; instead you should use the s3_uploads_is_attachment_private WordPress filter to control the behaviour. For example, to mark all attachments as private:
add_filter( 's3_uploads_is_attachment_private', '__return_true' );
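For finer-grained control you can decide per attachment. A sketch, assuming the filter receives the attachment ID as its second argument and using a hypothetical _confidential meta key of your own:

```php
<?php
// Sketch: mark only attachments flagged with a custom "_confidential"
// post meta key as private. The meta key name is an illustrative
// convention, not something S3 Uploads defines.
add_filter( 's3_uploads_is_attachment_private', function ( $is_private, $attachment_id ) {
	return (bool) get_post_meta( $attachment_id, '_confidential', true );
}, 10, 2 );
```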
Private uploads can be transitioned to public by calling S3_Uploads::set_attachment_files_acl( $id, 'public-read' ), or vice versa. For example:
S3_Uploads::get_instance()->set_attachment_files_acl( 15, 'public-read' );
The default expiry for all private file URLs is 6 hours. You can modify this using the s3_uploads_private_attachment_url_expiry WordPress filter. The value can be any string interpreted by strtotime. For example:
add_filter( 's3_uploads_private_attachment_url_expiry', function ( $expiry ) {
return '+1 hour';
} );
You can define the default HTTP Cache-Control header for uploaded media using the following constant:
define( 'S3_UPLOADS_HTTP_CACHE_CONTROL', 30 * 24 * 60 * 60 );
// will expire in 30 days time
You can also configure the Expires header using the S3_UPLOADS_HTTP_EXPIRES constant. For instance, if you wanted an asset to effectively never expire, you could set the Expires header far in the future. For example:
define( 'S3_UPLOADS_HTTP_EXPIRES', gmdate( 'D, d M Y H:i:s', time() + (10 * 365 * 24 * 60 * 60) ) .' GMT' );
// will expire in 10 years time
As S3 Uploads is a plug-and-play plugin, activating it will start rewriting image URLs to S3 and put new uploads on S3. Sometimes this isn't the desired behaviour, as a site owner may want to upload a large amount of media to S3 using the wp-cli commands before enabling S3 Uploads to direct all upload requests to S3. In this case, define S3_UPLOADS_AUTOENABLE as false. For example, place the following in your wp-config.php:
define( 'S3_UPLOADS_AUTOENABLE', false );
To then toggle S3 Uploads' URL rewriting, use the WP-CLI commands wp s3-uploads enable and wp s3-uploads disable.
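A typical migration workflow with auto-enable turned off might look like this (the paths are illustrative):

```shell
# With S3_UPLOADS_AUTOENABLE defined as false in wp-config.php:

# 1. Copy the existing media library to S3.
wp s3-uploads upload-directory /path/to/uploads/ uploads

# 2. Check that the files arrived.
wp s3-uploads ls uploads

# 3. Switch URL rewriting and new uploads over to S3.
wp s3-uploads enable
```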
By default, S3 Uploads will use the canonical S3 URIs for referencing the uploads, i.e. [bucket name].s3.amazonaws.com/uploads/[file path]. If you want to serve the images from another URL (for instance, if you wish to use S3 as an origin for CloudFlare), define S3_UPLOADS_BUCKET_URL in your wp-config.php:
// Define the base bucket URL (without trailing slash)
define( 'S3_UPLOADS_BUCKET_URL', 'https://your.origin.url.example/path' );
S3 Uploads' URL rewriting can be disabled if the website does not require it, for example when an Nginx proxy in front of S3 serves the files. In this case the plugin will only upload files to the S3 bucket.
// disable URL rewriting altogether
define( 'S3_UPLOADS_DISABLE_REPLACE_UPLOAD_URL', true );
The object permission of files uploaded to S3 by this plugin can be controlled by setting the S3_UPLOADS_OBJECT_ACL constant. The default setting, if not specified, is public-read, which allows objects to be read by anyone. If you don't want the uploads to be publicly readable, define S3_UPLOADS_OBJECT_ACL as either private or authenticated-read in your wp-config.php:
// Set the S3 object permission to private
define( 'S3_UPLOADS_OBJECT_ACL', 'private' );
For more information on S3 permissions please see the Amazon S3 permissions documentation.
Depending on your requirements, you may wish to use an alternative S3-compatible object storage system such as MinIO, Ceph, DigitalOcean Spaces, Scaleway, and others.
You can configure the endpoint by adding the following code to a file in the wp-content/mu-plugins/ directory, for example wp-content/mu-plugins/s3-endpoint.php:
<?php
// Filter S3 Uploads params.
add_filter( 's3_uploads_s3_client_params', function ( $params ) {
$params['endpoint'] = 'https://your.endpoint.com';
$params['use_path_style_endpoint'] = true;
$params['debug'] = false; // Set to true if uploads are failing.
return $params;
} );
If your S3 access is configured to require a temporary session token in addition to the access key and secret, you should configure the credentials using the following code:
// Filter S3 Uploads params.
add_filter( 's3_uploads_s3_client_params', function ( $params ) {
$params['credentials']['token'] = 'your session token here';
return $params;
} );
While it's possible to use S3 Uploads for local development (this is actually a nice way to avoid having to sync all uploads from production to development), if you want to develop offline you have a couple of options.
- Disable the S3 Uploads plugin in your development environment.
- Define the S3_UPLOADS_USE_LOCAL constant with the plugin active.
Option 2 will allow you to run the S3 Uploads plugin for production parity purposes; it essentially mocks Amazon S3 with a local stream wrapper and stores the uploads in an /s3/ directory inside your WP upload dir.
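A minimal sketch of option 2 in wp-config.php:

```php
// Development environment: keep S3 Uploads active, but mock S3 with a
// local stream wrapper, so uploads are stored on the local filesystem
// under the /s3/ directory in your WP upload dir.
define( 'S3_UPLOADS_USE_LOCAL', true );
```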
Created by Human Made for high volume and large-scale sites. We run S3 Uploads on sites with millions of monthly page views, and thousands of sites.
Written and maintained by Joe Hoyle. Thanks to all our contributors.
Interested in joining in on the fun? Join us, and become human!