macs3-project/MACS

hmmratac adjust means and standard deviation

LinearParadox opened this issue · 5 comments

Hi all,

When running hmmratac for one of my samples, using --cutoff-analysis-only, I get this message:

"ValueError: adjust --means and --stddev options and re-run command."

I'm not sure if this is intended behavior or not. Is it possible to get more clarity on this error message, such as why this is happening, and hopefully some general guidelines?

Hi @LinearParadox, can you provide the full command you used?
Also, did you have a chance to read the help message shown by running macs3 hmmratac -h?

Thanks

The command I used was:
macs3 hmmratac -b $n --outdir $DIR -n $name --cutoff-analysis-only

I did read the help, and it is somewhat helpful. However, I'm not sure how to diagnose whether it is the means or the stddevs that are the problem, and I don't want to change the values at random.

Thanks @LinearParadox, can you please also send the full log/runtime message associated with this error? It looks like the step that estimates the means/stddevs of the fragment sizes failed, but the log might help explain why.

Here is all it gives me:

INFO @ 08 Sep 2023 07:23:41: [106 MB] #1 Read fragments from BAM file...
INFO @ 08 Sep 2023 07:23:44: [140 MB] 1000000 fragments parsed
INFO @ 08 Sep 2023 07:23:46: [145 MB] 2000000 fragments parsed
INFO @ 08 Sep 2023 07:23:49: [153 MB] 3000000 fragments parsed
INFO @ 08 Sep 2023 07:23:51: [160 MB] 4000000 fragments parsed
INFO @ 08 Sep 2023 07:23:54: [168 MB] 5000000 fragments parsed
INFO @ 08 Sep 2023 07:23:56: [176 MB] 6000000 fragments parsed
INFO @ 08 Sep 2023 07:23:59: [184 MB] 7000000 fragments parsed
INFO @ 08 Sep 2023 07:24:02: [192 MB] 8000000 fragments parsed
INFO @ 08 Sep 2023 07:24:04: [199 MB] 9000000 fragments parsed
INFO @ 08 Sep 2023 07:24:07: [207 MB] 10000000 fragments parsed
INFO @ 08 Sep 2023 07:24:09: [215 MB] 11000000 fragments parsed
INFO @ 08 Sep 2023 07:24:12: [223 MB] 12000000 fragments parsed
INFO @ 08 Sep 2023 07:24:15: [231 MB] 13000000 fragments parsed
INFO @ 08 Sep 2023 07:24:17: [238 MB] 14000000 fragments parsed
INFO @ 08 Sep 2023 07:24:20: [246 MB] 15000000 fragments parsed
INFO @ 08 Sep 2023 07:24:22: [254 MB] 16000000 fragments parsed
INFO @ 08 Sep 2023 07:24:25: [261 MB] 17000000 fragments parsed
INFO @ 08 Sep 2023 07:24:28: [269 MB] 18000000 fragments parsed
INFO @ 08 Sep 2023 07:24:30: [276 MB] 19000000 fragments parsed
INFO @ 08 Sep 2023 07:24:33: [284 MB] 20000000 fragments parsed
INFO @ 08 Sep 2023 07:24:36: [292 MB] 21000000 fragments parsed
INFO @ 08 Sep 2023 07:24:38: [299 MB] 22000000 fragments parsed
INFO @ 08 Sep 2023 07:24:41: [307 MB] 23000000 fragments parsed
INFO @ 08 Sep 2023 07:24:43: [315 MB] 24000000 fragments parsed
INFO @ 08 Sep 2023 07:24:46: [322 MB] 25000000 fragments parsed
INFO @ 08 Sep 2023 07:24:49: [331 MB] 26000000 fragments parsed
INFO @ 08 Sep 2023 07:24:51: [338 MB] 27000000 fragments parsed
INFO @ 08 Sep 2023 07:24:54: [346 MB] 28000000 fragments parsed
INFO @ 08 Sep 2023 07:24:56: [354 MB] 29000000 fragments parsed
INFO @ 08 Sep 2023 07:24:59: [362 MB] 30000000 fragments parsed
INFO @ 08 Sep 2023 07:25:01: [370 MB] 31000000 fragments parsed
INFO @ 08 Sep 2023 07:25:04: [377 MB] 32000000 fragments parsed
INFO @ 08 Sep 2023 07:25:07: [385 MB] 33000000 fragments parsed
INFO @ 08 Sep 2023 07:25:09: [392 MB] 34000000 fragments parsed
INFO @ 08 Sep 2023 07:25:12: [400 MB] 35000000 fragments parsed
INFO @ 08 Sep 2023 07:25:14: [408 MB] 36000000 fragments parsed
INFO @ 08 Sep 2023 07:25:17: [415 MB] 37000000 fragments parsed
INFO @ 08 Sep 2023 07:25:20: [422 MB] 38000000 fragments parsed
INFO @ 08 Sep 2023 07:25:22: [431 MB] 39000000 fragments parsed
INFO @ 08 Sep 2023 07:25:25: [439 MB] 40000000 fragments parsed
INFO @ 08 Sep 2023 07:25:27: [444 MB] 41000000 fragments parsed
INFO @ 08 Sep 2023 07:25:30: [454 MB] 42000000 fragments parsed
INFO @ 08 Sep 2023 07:25:33: [462 MB] 43000000 fragments parsed
INFO @ 08 Sep 2023 07:25:35: [469 MB] 44000000 fragments parsed
INFO @ 08 Sep 2023 07:25:38: [477 MB] 45000000 fragments parsed
INFO @ 08 Sep 2023 07:25:40: [481 MB] 46000000 fragments parsed
INFO @ 08 Sep 2023 07:25:43: [492 MB] 47000000 fragments parsed
INFO @ 08 Sep 2023 07:25:46: [501 MB] 48000000 fragments parsed
INFO @ 08 Sep 2023 07:25:48: [508 MB] 49000000 fragments parsed
INFO @ 08 Sep 2023 07:25:51: [517 MB] 50000000 fragments parsed
INFO @ 08 Sep 2023 07:25:53: [524 MB] 51000000 fragments parsed
INFO @ 08 Sep 2023 07:25:56: [532 MB] 52000000 fragments parsed
INFO @ 08 Sep 2023 07:25:59: [541 MB] 53000000 fragments parsed
INFO @ 08 Sep 2023 07:26:01: [548 MB] 54000000 fragments parsed
INFO @ 08 Sep 2023 07:26:04: [553 MB] 55000000 fragments parsed
INFO @ 08 Sep 2023 07:26:06: [563 MB] 56000000 fragments parsed
INFO @ 08 Sep 2023 07:26:09: [568 MB] 57000000 fragments parsed
INFO @ 08 Sep 2023 07:26:12: [580 MB] 58000000 fragments parsed
INFO @ 08 Sep 2023 07:26:14: [587 MB] 59000000 fragments parsed
INFO @ 08 Sep 2023 07:26:17: [595 MB] 60000000 fragments parsed
INFO @ 08 Sep 2023 07:26:19: [604 MB] 61000000 fragments parsed
INFO @ 08 Sep 2023 07:26:22: [680 MB] 61925929 fragments have been read.
INFO @ 08 Sep 2023 07:26:59: [680 MB] #2 Use EM algorithm to estimate means and stddevs of fragment lengths
INFO @ 08 Sep 2023 07:26:59: [680 MB] # for mono-, di-, and tri-nucleosomal signals...
INFO @ 08 Sep 2023 07:26:59: [680 MB] # A random seed 10151 has been used in the sampling function
INFO @ 08 Sep 2023 07:27:03: [795 MB] # Downsampled 2808149 fragments will be used for EM training...
ValueError: Adjust --means and --stddev options and re-run command

During library construction, did you apply any filtering for specific DNA fragment sizes? If so, what filters were used?

You could try setting different values for --means and --stddevs, for example --stddevs 50 50 50 50. The defaults are --means 50 200 400 600 and --stddevs 20 20 20 20.
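
Before changing these values blindly, it can help to look at the actual fragment-length distribution in your BAM. Below is a minimal sketch, assuming samtools is available; sample.bam is a placeholder for your own paired-end BAM file:

# Insert-size distribution from samtools stats ("IS" rows: insert size, pair count)
samtools stats sample.bam | grep ^IS | cut -f 2,3 > fragment_size_hist.txt

# Or tabulate TLEN (column 9) of properly paired reads directly, one value per pair
samtools view -f 0x2 sample.bam | awk '$9 > 0 {print $9}' | sort -n | uniq -c > tlen_counts.txt

If the distribution shows no clear peaks near the default means (for example, because of size selection during library prep), the EM training can fail; in that case, shift --means toward the peaks you actually observe and widen --stddevs accordingly.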

If you are still experiencing this error, try skipping the EM step with the --no-fragem option.
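
For example, a re-run could look like the following; the file names and the adjusted numbers are only illustrative placeholders and should be chosen from your own fragment-size distribution:

# Re-run with adjusted fragment-length means/stddevs (values are placeholders)
macs3 hmmratac -b sample.bam --outdir outdir -n sample \
    --means 80 200 400 600 --stddevs 40 40 40 40 --cutoff-analysis-only

# Or skip the fragment-length EM step entirely
macs3 hmmratac -b sample.bam --outdir outdir -n sample \
    --no-fragem --cutoff-analysis-only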