imgly/background-removal-js

Errors while removing background on deployed server

Rolstenhouse opened this issue · 11 comments

When I attempt to remove the background from a file on my server, I get one of the following errors:

corrupted size vs. prev_size
OR
free(): invalid size
OR
munmap_chunk(): invalid pointer

I haven't been able to identify what triggers which, but it feels like it might be an issue with the ML model (I'm using the small one).

Docker host: 20.10.12 linux x86_64
Node version: Node.js v21.6.2
Package version: 1.4.4

Same issue

Would you be open to sharing a minimal example? I cannot reproduce it.

Sure - here's some more context

Snippet

         mediaUrl = "https://api.twilio.com/2010-04-01/Accounts/ACfbfe2e1e70ce74b02a4151bf91b23693/Messages/MM3fa6329883117973ec3cd7b180c6caca/Media/ME76f45b7483238aac2516ab5429c5018a"
          try {
            ort.env.debug = true;
            ort.env.logLevel = "warning";

            logger.info("Removing background for image", { mediaUrl });
            const localPath = `file://${process.cwd()}/public/imgly/`;
            logger.info("localPath", { localPath });
            const blob: Blob = await removeBackground(mediaUrl, {
              publicPath:
                process.env.NODE_ENV === "production"
                  ? "file:///myapp/public/imgly/"
                  : localPath,
              // publicPath: "https://stickerfy.xyz/imgly/",
              debug: true,
              model: "small",
              progress: (key, current, total) => {
                logger.warn(`Downloading ${key}: ${current} of ${total}`);
              },
            });
            buffer = Buffer.from(await blob.arrayBuffer());
          } catch (error) {
            logger.error("Error while removing background for image", {
              mediaUrl,
              error,
              errorMessage: error.message,
              errorStack: error.stack,
              errorName: error.name,
            });
          }
        }
        
        // Write the buffer to S3
        
        if (buffer) {
          // Upload to S3
          logger.info("Uploading image to S3", {
            info: {
              key: mediaSid!,
              contentType: "image/png",
              userId: user?.autoId || 0,
              buffer: buffer.length,
            },
          });
          backgroundRemovedImage = await uploadImageToS3({
            key: mediaSid!,
            buffer,
            contentType: "image/png",
            userId: user?.autoId || 0,
          });
        }

Here's a screenshot of the logs (and I've attached a CSV with the log output).

Also note: this snippet includes the local file path, but I also ran into this issue when referencing the hosted model.

The deployed server is running on fly.io, by the way (not sure whether that's a factor).
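In case it helps with reproducing, here's a sketch of a stripped-down standalone version of what I'm doing (it assumes the Node package @imgly/background-removal-node; the test image URL and out.png are placeholders):

// Standalone repro sketch — not the exact code from my app.
// Assumptions: @imgly/background-removal-node is the package in use;
// the image URL and "out.png" are placeholders.
import { writeFile } from "node:fs/promises";
import { removeBackground } from "@imgly/background-removal-node";

async function main() {
  const blob = await removeBackground("https://example.com/test-image.jpg", {
    publicPath: `file://${process.cwd()}/public/imgly/`,
    debug: true,
    model: "small",
    progress: (key, current, total) =>
      console.log(`Downloading ${key}: ${current} of ${total}`),
  });
  await writeFile("out.png", Buffer.from(await blob.arrayBuffer()));
  console.log("done");
}

main().catch((err) => {
  console.error("removeBackground failed:", err);
  process.exit(1);
});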

extract-2024-02-27T00_49_29.879Z.csv

My server is running on DigitalOcean and has the same issue.
Droplet info:
Ubuntu 23.10 x64
Node 20
No gpu

Hi,
I get the same error on WSL2 (Ubuntu).

Could this be related to "onnxruntime-node" or WASM, and the fact that TensorFlow and the model need a GPU, which isn't available on a server or in a remote environment?

I noticed that in the source, the function:

async function createOnnxSession(model, config) {
  if (config.debug) {
    ort.env.debug = true;
    ort.env.logLevel = "verbose";
    console.debug("ort.env.wasm:", ort.env.wasm);
  }
  // ... (rest of the function omitted)
}

actually prints an empty object to the console in my WSL2 environment:

fetch /models/medium 100%
ort.env.wasm: {}
free(): invalid size
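For comparison, the same flag can be printed directly from onnxruntime-node, outside the library (a minimal sketch, assuming onnxruntime-node can be imported on its own):

// Minimal check of the onnxruntime-node env flags, independent of background-removal.
import * as ort from "onnxruntime-node";

ort.env.debug = true;
ort.env.logLevel = "verbose";
console.log("ort.env.wasm:", ort.env.wasm); // same object the library logs above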

Thanks!

onnxruntime-node should work without a GPU.
Looking at ort.env.wasm is probably misleading: the Node version does not yet support the WASM backend, so it seems okay that it's empty.

I have no access to such a machine at the moment, so unfortunately I cannot reproduce the error.
Also, I have no idea what the cause is.
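If anyone affected wants to help narrow it down, one useful data point might be whether plain onnxruntime-node can create a session from the downloaded .onnx file on the same machine, without this library involved. A rough sketch (the model path is a placeholder for wherever the assets ended up locally):

// Sketch: try to create an ONNX session directly on CPU to see whether the
// native crash ("free(): invalid size", etc.) also happens outside this library.
// "/path/to/model.onnx" is a placeholder for the locally cached model file.
import * as ort from "onnxruntime-node";

async function main() {
  const session = await ort.InferenceSession.create("/path/to/model.onnx", {
    executionProviders: ["cpu"],
  });
  console.log("model loaded, inputs:", session.inputNames, "outputs:", session.outputNames);
}

main().catch((err) => {
  console.error("failed to create session:", err);
  process.exit(1);
});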

Thanks for looking into it. For other devs who might encounter this: I switched to a different approach, rembg on Replicate, and just paid the small out-of-pocket cost.
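For reference, the Replicate route boils down to something like this (a sketch using the official replicate Node client; the rembg model slug/version is a placeholder you'd look up on Replicate, and a REPLICATE_API_TOKEN is assumed):

// Sketch of the Replicate-based workaround (model identifier is a placeholder;
// look up the current rembg model and version hash on replicate.com).
import Replicate from "replicate";

const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN });

async function removeBg(imageUrl: string): Promise<string> {
  // For rembg-style models the output is typically a URL to the processed image.
  const output = await replicate.run(
    "cjwbw/rembg:<version-hash>", // placeholder identifier
    { input: { image: imageUrl } }
  );
  return output as string;
}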