uploadcare/uploadcare-php

copyToRemoteStorage issue


Question

We are having an issue and wondered if you had any information that could help us.

We are using PHP and the API: we upload an image, process it (resizing and creating webp versions), and then copy the image to our S3 bucket.

This all works as expected; however, while copyToRemoteStorage is running, it locks up our server and we are not able to reach any pages. Once the process completes, access is restored. The lockup affects every user on the server, and we are not seeing any high loads or abnormalities in the stats.

My function is below; I would appreciate any information you may have.

Thanks,
Steve

Here is the debug output:

The First process took 3.0101509094238 seconds.
The Second process took 9.1781148910522 seconds.
APP/Controller/MediaController.php (line 163)
'complete'

3 seconds to copy the original, 9 seconds to make the webp versions and copy them.

Here is my function:

// At the top of the controller:
// use Cake\Core\Configure;
// use Uploadcare\Api;
// use Uploadcare\Configuration;

public function copyToAws() {

        $uploadcareConfig = Configuration::create(
            Configure::read('Uploadcare.publicKey'),
            Configure::read('Uploadcare.secretKey')
        );

        $api = (new Api($uploadcareConfig))->file();

        /** These are the variable values needed to copy the file */
        $source_uuid = ''; // The uuid of the uploaded file
        $filename = '07182024.webp'; // The new filename for the customer file
        $webid = $this->request->getSession()->read('Global.webid');
        $target = '{bucketname}'; // The bucket registered with Uploadcare; maps to our S3 bucket
        $folder = $webid; // Our customer's folder (webid)
        /** End variables */
        
        $fileInfo = $api->fileInfo($source_uuid);

        $timestamp = time();

        $contentInfo = $fileInfo->getContentInfo();
        $imageInfo = $contentInfo->getImage();
        $imageHeight = $imageInfo->getHeight();
        $imageWidth = $imageInfo->getWidth();
        $filenameWithoutExtension = pathinfo($filename, PATHINFO_FILENAME);
        $extension = pathinfo($filename, PATHINFO_EXTENSION);
        $newFilenameWithTimestamp = $filenameWithoutExtension . '-' . $timestamp . '.' . $extension;

        $startTime = microtime(true);

        // Copy the original image to the S3 bucket
        $original_target_pattern = $folder . '/' . $newFilenameWithTimestamp;
        $api->copyToRemoteStorage($source_uuid, $target, true, $original_target_pattern);

        // Capture the end time and calculate the duration
        $endTime = microtime(true);
        $duration = $endTime - $startTime;

        // Output the duration
        echo "The First process took " . $duration . " seconds.<br>";


        $widths = [640, 768, 1024, 1280, 1536];
        $startTime = microtime(true);

        // Save different size images to S3
        foreach ($widths as $width) {
            if ($imageWidth > $width) {
                $resize_params = '/-/resize/' . $width . 'x/';
                $format_params = '';
                $newFilename = $filenameWithoutExtension . '-' . $timestamp . '-' . $width . 'w.' . $extension;

                if ($extension != 'svg') {
                    $format_params = '-/format/webp/';
                    $newFilename = $filenameWithoutExtension . '-' . $timestamp . '-' . $width . 'w.webp';
                }

                $sourceUrl = $source_uuid . $resize_params . $format_params;
                $target_pattern = $folder . '/' . $newFilename;

                try {
                    // Copy the resized image to the S3 bucket
                    $api->copyToRemoteStorage($sourceUrl, $target, true, $target_pattern);
                } catch (\Exception $e) {
                    dd($e->getMessage());
                }
            } else {
                // Widths are ascending, so no larger renditions are needed
                break;
            }
        }

        // Capture the end time and calculate the duration
        $endTime = microtime(true);
        $duration = $endTime - $startTime;

        // Output the duration
        echo "The Second process took " . $duration . " seconds.<br>";

        dd('complete');

    }

@rpstevefysh, this is understandable and comes from the synchronous nature of PHP. While the server is working on the request, the client is waiting. You don't have any performance issues, server overloads, or anything like that; everything really works as expected.

What you see is network communication: while the file is being copied from the Uploadcare storage to your S3 bucket, the request does not return a response, and the current PHP process can't do anything else.

My only advice here is to use a queue on your side to make this operation asynchronous. I know it requires some infrastructure changes (at a minimum, you need a queue manager and a worker command running on a server), but those changes are not that big and are pretty straightforward. For example, you can use Redis to back the queue and a simple systemd unit to keep the worker running, as sketched below.
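For illustration, here is a minimal sketch of such a queue using the phpredis extension directly; the queue key, host, and payload fields are assumptions, and a framework bus such as Symfony Messenger (used in the examples below) wraps the same idea:

<?php

// Producer side (in the controller): push a job description onto a Redis list.
// Assumes the phpredis extension and a local Redis instance; $sourceUrl,
// $target, and $target_pattern come from the controller code above.
$redis = new \Redis();
$redis->connect('127.0.0.1', 6379);
$redis->lPush('copy-jobs', json_encode([
    'sourceUrl' => $sourceUrl,
    'target' => $target,
    'targetPattern' => $target_pattern,
]));

// Worker side (a separate long-running process): block until a job arrives.
[, $payload] = $redis->brPop(['copy-jobs'], 0);
$job = json_decode($payload, true);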

Then, you can create messages with the required parameters and send them to the bus. Something like

<?php

final class CopyToRemoteMessage
{
    public function __construct(
        readonly private string $sourceUrl,
        readonly private string $target,
        readonly private string $targetPattern,
    ) {}

    public function getSourceUrl(): string { return $this->sourceUrl; }
    public function getTarget(): string { return $this->target; }
    public function getTargetPattern(): string { return $this->targetPattern; }
}

as the message class, and simply create those messages in your controller and dispatch them:

$original_target_pattern = $folder . '/' . $newFilenameWithTimestamp;
/**
 * Do not forget to inject the MessageBusInterface into your controller.
 * After this call, your code will continue immediately, and a message will be processed in a consumer (see below).
 */
$this->bus->dispatch(new CopyToRemoteMessage($source_uuid, $target, $original_target_pattern));

An example consumer could look like this:

<?php

use Uploadcare\Api;

final class CopyToRemoteConsumer
{
    public function __construct(
        readonly private Api $api,
    ) {}

    public function __invoke(CopyToRemoteMessage $message): void
    {
        $this->api->file()->copyToRemoteStorage(
            $message->getSourceUrl(),
            $message->getTarget(),
            true,
            $message->getTargetPattern()
        );
    }
}
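For completeness, here is a hedged sketch of the long-running worker that could drain the plain-Redis queue from the earlier example and feed this consumer; in a Symfony Messenger setup, the messenger:consume command plays this role, and the file name, environment variable names, and queue key here are assumptions:

<?php
// worker.php — a minimal long-running worker, assuming the plain-Redis
// queue sketched above; the message and consumer classes are autoloaded.

use Uploadcare\Api;
use Uploadcare\Configuration;

require __DIR__ . '/vendor/autoload.php';

$configuration = Configuration::create(
    getenv('UPLOADCARE_PUBLIC_KEY'),
    getenv('UPLOADCARE_SECRET_KEY')
);
$consumer = new CopyToRemoteConsumer(new Api($configuration));

$redis = new \Redis();
$redis->connect('127.0.0.1', 6379);

while (true) {
    // Block until a job arrives, then hand it to the consumer.
    [, $payload] = $redis->brPop(['copy-jobs'], 0);
    $job = json_decode($payload, true);
    $consumer(new CopyToRemoteMessage($job['sourceUrl'], $job['target'], $job['targetPattern']));
}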

In this way, you'll have a controller method that responds in milliseconds and a background process that does the actual uploading. Of course, it will not speed up the upload itself; you'll still have to wait a bit until the file is really in the S3 storage and check its state, but this gives you flexibility in terms of user experience.
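Checking that state could look like this sketch, assuming the AWS SDK for PHP (aws/aws-sdk-php); the region, bucket, and key are placeholders:

<?php

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

try {
    // headObject succeeds only once the copied file exists in the bucket.
    $s3->headObject(['Bucket' => 'bucketname', 'Key' => $folder . '/' . $newFilename]);
    // Safe to reference the file in rendered pages.
} catch (S3Exception $e) {
    // Not there yet (404): retry later or keep the job marked as pending.
}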

Please check the Uploadcare PHP repository for examples of dependency injection and configuration.

Thank you for your response. This makes sense and we will pursue the queue options.