Clinical-Genomics/cg

Catch Errors and continue in clean FASTQ command

As a system-administrator,
I want the cleaning of FASTQ files to not stop if something goes wrong with a file,
So that one file does not block the cleaning of 100 000 others.

Work impact

Answer the following questions:

  • Is there currently a workaround for this issue? If so, what is it?
    • Yes, regularly check the logs of the service and unclog it every time it gets stuck
  • How much time would be saved by implementing this feature on a weekly basis?
    • ~10 h on a yearly basis
  • How many users are affected by this issue?
    • sys-adm and production
  • Are customers affected by this issue?
    • No

Acceptance Criteria

  • One file failing does not block other files from being deleted
    ...

Notes

  • Additional information.
  • Dependencies.
  • Related user stories.

Technical refinement

The cleaning of FASTQ files is called here. We should implement error handling so that cleaning continues when one case fails, and log the errors properly.
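A minimal sketch of the per-case error handling described above. The function and argument names (`clean_fastq_files`, `clean_case`) are hypothetical placeholders, not the actual names in cg; the point is the try/except around each case so one failure is logged and the loop moves on:

```python
import logging

logger = logging.getLogger(__name__)


def clean_fastq_files(cases, clean_case):
    """Clean FASTQ files case by case; one failure does not stop the rest.

    `clean_case` is the per-case cleaning callable (hypothetical signature).
    Returns the cases that failed so they can be followed up later.
    """
    failed = []
    for case in cases:
        try:
            clean_case(case)
        except Exception:
            # Log the full traceback instead of crashing the whole run
            logger.exception("Cleaning FASTQ files failed for case %s", case)
            failed.append(case)
    return failed
```

With this shape, a run over 100 000 files only skips the broken ones, and the returned list (plus the logged tracebacks) gives the sys-admin what they need without watching the service.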