Nesvilab/FragPipe

Error when searching peptidomics data (not HLA)

Closed this issue · 3 comments

Log file:
In total 28088779 peptides.
Generated 124107358 modified peptides.
Number of peptides with more than 10000 modification patterns: 0
Selected fragment index width 0.13 Da.
9784757778 fragments to be searched in 1 slices (145.80 GB total)
Operating on slice 1 of 1:
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x0000021d7e000000, 139519328256, 0) failed; error='The paging file is too small for this operation to complete' (DOS error/errno=1455)

There is insufficient memory for the Java Runtime Environment to continue.

Native memory allocation (mmap) failed to map 139519328256 bytes for Failed to commit area from 0x0000021d7e000000 to 0x0000023dfa000000 of length 139519328256.

An error report file with more information is saved as:

D:\21079\MSF10kDa\split_peptide_index_tempdir\5\hs_err_pid45652.log

Traceback (most recent call last):
File "C:\Users\protolab\Downloads\FragPipe-jre-20.0\fragpipe\tools\msfragger_pep_split.py", line 630, in <module>
main()
File "C:\Users\protolab\Downloads\FragPipe-jre-20.0\fragpipe\tools\msfragger_pep_split.py", line 616, in main
run_msfragger(calibrate_mzBIN if calibrate_mass in [1, 2] else infiles_name)
File "C:\Users\protolab\Downloads\FragPipe-jre-20.0\fragpipe\tools\msfragger_pep_split.py", line 133, in run_msfragger
subprocess.run(list(map(os.fspath, cmd)), cwd=cwd, check=True)
File "C:\Users\protolab\Anaconda3\lib\subprocess.py", line 528, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['C:\Users\protolab\Downloads\FragPipe-jre-20.0\fragpipe\jre\bin\java.exe', '-jar', '-Dfile.encoding=UTF-8', '-Xmx585G', 'C:\Users\protolab\Downloads\MSFragger-3.8\MSFragger-3.8\MSFragger-3.8.jar', 'fragger.params', 'D:\21079\HF2_GA_21079_10_YP_5_FLVZ_1_LAP_04042024.raw', '--partial', '5']' returned non-zero exit status 1.
Process 'MSFragger' finished, exit code: 1
Process returned non-zero exit code, stopping

Cancelling 14 remaining tasks

I'm trying to process a Thermo .raw file of peptidomics data, but the search keeps crashing. I can share the raw file via email.
I don't understand what the error means. The data was searched locally on the PC.
Thanks,

Yishai



fcyu commented

Hi Yishai,

It is caused by an out-of-memory error: the fragment index is too large to fit in the memory the JVM can commit. Could you increase the number of database splits and try again?

Thanks,

Fengchao
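As a rough illustration of why increasing the splits helps: the log reports 9,784,757,778 fragments taking 145.80 GB in a single slice (about 16 bytes per fragment), and each split searches only one slice of the index at a time. The sketch below is not part of FragPipe; it just estimates, from the log's numbers, the smallest split count whose per-slice index would fit in a given amount of usable RAM (the `usable_ram_gb` figure is an assumption you would set for your own machine):

```python
import math

def min_splits(index_size_gb: float, usable_ram_gb: float) -> int:
    """Smallest number of database splits so that one slice of the
    fragment index fits in the usable RAM (back-of-the-envelope only)."""
    return max(1, math.ceil(index_size_gb / usable_ram_gb))

# From the log: 9,784,757,778 fragments reported as 145.80 GB total,
# i.e. roughly 16 bytes per fragment.
index_gb = 145.80

print(min_splits(index_gb, 64))   # with ~64 GB usable -> 3 splits
print(min_splits(index_gb, 160))  # with ~160 GB usable -> 1 split
```

In practice the search needs memory beyond the fragment index itself, so picking a few more splits than this lower bound gives headroom.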

fcyu commented

Hi Yishai,

Thanks for the update. I will close this issue now, but feel free to contact us if you have any questions in the future.

Best,

Fengchao