ablab/spades

'Failed to limit memory to' error on Mac M3 Max, while SPAdes runs normally via a virtual Linux machine on the same computer.

mkazanov opened this issue · 6 comments

Description of bug

The same FASTQ files are processed normally in a virtual Linux machine (Ubuntu 22.04.02 ARM64) on a Mac M3 Max with macOS Sonoma, but fail with a 'failed to limit memory' error when executed purely in macOS.

spades.log

params.txt

SPAdes version

v4.0.0

Operating System

macOS Sonoma 14.1

Python Version

3.9.6

Method of SPAdes installation

binaries

No errors reported in spades.log

Yes

asl commented

The error in the log is completely unrelated to that warning.

First of all, starting from macOS Monterey it is not possible to actually limit memory usage in many circumstances unless you are running as root. So we demoted the error to a warning here:

  0:00:00.000     1M / 16M   WARN    General                 (memory_limit.cpp          :  52)   Failed to limit memory to 250 Gb, setrlimit(2) call failed, errno = 22 (Invalid argument). Watch your memory consumption!
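The behavior described above (try to cap memory, and on failure demote the error to a warning) can be sketched in Python. This is a minimal illustration, not SPAdes's actual C++ code from memory_limit.cpp; the function name try_limit_memory is hypothetical:

```python
import resource
import sys

def try_limit_memory(limit_bytes):
    """Try to cap the process address space via setrlimit(2).

    On failure (e.g. EINVAL on recent macOS, where the limit cannot be
    applied without root), demote the error to a warning and continue.
    """
    _soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    try:
        resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
        return True
    except (ValueError, OSError) as err:
        # Mirrors the log line above: warn, but keep running.
        print(f"WARN: Failed to limit memory to {limit_bytes} bytes "
              f"({err}). Watch your memory consumption!", file=sys.stderr)
        return False
```

Either outcome is non-fatal; the process simply runs without an enforced cap when the call is rejected.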

Going back to the error: it is an I/O error. SPAdes is unable to open one of its intermediate files, so I'd suggest you check the system log and the free disk space available.
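A quick way to check the free space on the filesystem holding the output directory is via Python's standard library; this is a throwaway sketch, and free_space_gib is a made-up helper name (pass whatever -o directory was given to SPAdes):

```python
import shutil

def free_space_gib(path):
    """Free disk space (in GiB) on the filesystem holding `path` --
    one of the things worth checking when an intermediate file
    fails to open."""
    return shutil.disk_usage(path).free / (1 << 30)
```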

The system has 4TB of free space. The error is reproducible.
There have been no disk errors so far.
What do you recommend?

asl commented

The system has 4TB of free space. The error is reproducible. What do you recommend?

Check the system limit on the number of open files, e.g. via ulimit -n? It is usually very low on macOS (compared to Linux).
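The same limits that ulimit -n reports can be read programmatically. A small Python sketch; the 4096 threshold and the name nofile_limits are illustrative choices, not a documented SPAdes requirement:

```python
import resource

def nofile_limits(min_needed=4096):
    """Return (soft, hard) RLIMIT_NOFILE -- the values `ulimit -n`
    shows -- plus whether the soft limit clears an illustrative
    threshold. macOS defaults (often 256) are far lower than
    typical Linux defaults.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    return soft, hard, soft >= min_needed
```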

I've increased it to 1,000,000 with ulimit -S -n 1000000, but the result is the same.

asl commented

I've increased it to 1,000,000 with ulimit -S -n 1000000, but the result is the same.

Did it really increase the limit? Have you checked via ulimit -a? Note that you cannot raise the soft limit beyond the hard one.
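The soft-vs-hard interaction can be demonstrated with a short Python sketch. Assumptions here: the helper name raise_nofile_soft is made up, and capping the request at the hard limit is taken as the desired behavior (a request above the hard limit would otherwise fail outright):

```python
import resource

def raise_nofile_soft(target):
    """Raise the soft RLIMIT_NOFILE toward `target`.

    A soft limit can only be raised up to the current hard limit,
    so cap the request at the hard limit before applying it.
    Returns the resulting soft limit.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if hard != resource.RLIM_INFINITY:
        target = min(target, hard)
    if target > soft:
        resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
    return resource.getrlimit(resource.RLIMIT_NOFILE)[0]
```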

Yes, I've checked via ulimit -S -n.
The hard limit (ulimit -H -n) is unlimited.