bulik/ldsc

MemoryError when running LD Score analysis


I am getting a memory error when generating LD Scores:

./ldsc.py --bfile "/dataE/AWIGenGWAS/shared/imputed_data_plink/all_imputed_map_qc" --chunk-size 500000 --l2 --ld-wind-cm 1 --out Awi-gen-LDscores

Reading genotypes from /dataE/AWIGenGWAS/shared/imputed_data_plink/all_imputed
After filtering, 13976041 SNPs remain
Estimating LD Score.
Traceback (most recent call last):
  File "./ldsc.py", line 620, in <module>
    ldscore(args, log)
  File "./ldsc.py", line 316, in ldscore
    lN = geno_array.ldScoreVarBlocks(block_left, args.chunk_size, annot=annot_matrix)
  File "/home/chebii/Awi-gen/FG_raw/boltlmm/ldsc/ldscore/ldscore.py", line 125, in ldScoreVarBlocks
    return self.__corSumVarBlocks__(block_left, c, func, snp_getter, annot)
  File "/home/chebii/Awi-gen/FG_raw/boltlmm/ldsc/ldscore/ldscore.py", line 189, in __corSumVarBlocks__
    A = snp_getter(b)
  File "/home/chebii/Awi-gen/FG_raw/boltlmm/ldsc/ldscore/ldscore.py", line 397, in nextSNPs
    X = np.array(slice.decode(self._bedcode), dtype="float64").reshape((b, nru)).T
MemoryError

Analysis finished at Thu Mar 21 09:06:42 2024

Hello, how did you handle this problem in your research? I am running into the same issue now. Looking forward to your reply.

@chirrie @aikedan Run ldsc on a machine with more memory available.
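If you are running on a cluster, "more memory" usually means a bigger scheduler request, not just a bigger machine. A minimal sketch of a submission script, assuming a SLURM scheduler (the 64G figure is illustrative; raise it as needed, and note --chunk-size is left at its default as discussed below):

#!/bin/bash
#SBATCH --job-name=ldsc-ldscores
#SBATCH --mem=64G              # explicit memory request; raise if the job is still killed
#SBATCH --time=24:00:00

./ldsc.py \
    --bfile /dataE/AWIGenGWAS/shared/imputed_data_plink/all_imputed_map_qc \
    --l2 \
    --ld-wind-cm 1 \
    --out Awi-gen-LDscores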

@chirrie Confirm with your HPC administrator that you are correctly requesting and receiving the requested amount of memory.
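On SLURM, for example (other schedulers have equivalents), you can compare the memory you requested with what the job actually used; the job ID below is a placeholder:

# Requested vs. peak memory for a job (hypothetical job ID)
sacct -j 1234567 --format=JobID,ReqMem,MaxRSS,State
# Or, where the seff utility is installed:
seff 1234567

If MaxRSS approaches ReqMem before the job dies, the request itself is the limit.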

Also note that you should not change --chunk-size from the default value.
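For context: the traceback shows each read being decoded into a float64 matrix of shape (b, nru), where b is the number of SNPs read at once and nru is roughly the sample count, so memory per read scales as SNPs-read × samples × 8 bytes, and an oversized --chunk-size inflates every read. A back-of-the-envelope sketch (the 10,000-sample figure is hypothetical; the default --chunk-size is 50):

# bytes ≈ SNPs read × samples × 8 (float64); sample count below is made up
echo $(( 500000 * 10000 * 8 / 1000**3 ))   # --chunk-size 500000 -> 40 GB per read
echo $(( 50 * 10000 * 8 / 1000**2 ))       # default chunk size 50 -> 4 MB per read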