Unable to use on my data with Docker (lol)
BIOBRICK opened this issue · 5 comments
Hi there, I am using too-many-cells on a Mac to analyze my own data. Since I was unable to install it with stack, I use Docker instead.
My original data is 450 MB, but no matter how much memory I give to Docker, I can see all of it get used up, and then I get an error without any log info (lol).
But when I run Docker with the example files, I get correct results......
Are there any other possible reasons for this kind of failure?
How much memory do you have available? The only time we've seen no printed exception error is when docker decides to end the process due to lack of memory.
I gave it 14 GB of memory, yet it still failed.
The total memory I have is 16 GB, and I am using the latest macOS with Docker Desktop.
Though I have to admit the data was provided as a single .csv file and its quality was not that good. Should I try Cell Ranger-like data instead (I mean the three files in a directory; this would compress the data size)? See the sketch below for the kind of conversion I mean.
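For reference, this is roughly the conversion I had in mind (a quick Python sketch with pandas/scipy; the input file name, the genes-as-rows layout, and the two-column genes.tsv format are my assumptions about what the Cell Ranger-style input should look like):

```python
import os
import pandas as pd
import scipy.sparse as sp
import scipy.io as sio

# Hypothetical input file: genes as rows, cells as columns.
df = pd.read_csv("my_data.csv", index_col=0)
os.makedirs("input_dir", exist_ok=True)

# Store the counts sparsely in MatrixMarket format.
sio.mmwrite("input_dir/matrix.mtx", sp.csc_matrix(df.values))

# Cell Ranger's genes.tsv usually has two columns (gene id, gene name);
# here the gene names are reused as ids.
pd.DataFrame({"id": df.index, "name": df.index}).to_csv(
    "input_dir/genes.tsv", sep="\t", index=False, header=False)
pd.Series(df.columns).to_csv(
    "input_dir/barcodes.tsv", sep="\t", index=False, header=False)
```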
I also ran into almost the same problem when I used a virtual machine (Ubuntu 18.04 server) instead of Docker. I SUCCESSFULLY used too-many-cells to analyze part of my data (3 clusters picked out), but when I used the entire dataset it still said "unable to commit 10737418240". So I guess the memory problem may be unavoidable even on an actual machine, not just in Docker or a VM.
The matrix implementation is memory hungry, so if you find that it ends after using all of the available memory, then that is the reason for the crash. If you are on a 16GB machine then I would use feature selection / filter out unneeded cells beforehand.
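For example, something along these lines (a rough pandas sketch; the cutoffs are placeholders rather than recommendations, and it assumes a genes-by-cells CSV):

```python
import pandas as pd

# Assumed layout: genes as rows, cells as columns.
df = pd.read_csv("my_data.csv", index_col=0)

keep_cells = df.sum(axis=0) >= 500        # cells with at least 500 total counts
keep_genes = (df > 0).sum(axis=1) >= 3    # genes detected in at least 3 cells

df.loc[keep_genes, keep_cells].to_csv("filtered_data.csv")
```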
Yeah, I guess so. Maybe I should run some tests to decide how many cells to keep.
Thanks again for this tool and your patience. I used it and indeed got some nice results; I guess the tool will become even more attractive if an optimized version comes out later.
:)
You can also use dimensionality reduction to limit the size of the matrix.
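For instance (a rough scikit-learn sketch; the component count and the genes-by-cells CSV layout are assumptions). Keep in mind that the reduced features are SVD components rather than genes, so the gene-level parts of the output would no longer be meaningful, and you may need to adjust the normalization settings (check the --help output):

```python
import pandas as pd
from sklearn.decomposition import TruncatedSVD

# Assumed layout: genes as rows, cells as columns.
df = pd.read_csv("my_data.csv", index_col=0)

svd = TruncatedSVD(n_components=50, random_state=0)   # 50 components is a placeholder
reduced = svd.fit_transform(df.values.T)              # cells x components

pd.DataFrame(
    reduced.T,                                        # components x cells, same orientation as input
    index=[f"component_{i}" for i in range(reduced.shape[1])],
    columns=df.columns,
).to_csv("reduced_data.csv")
```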