[FR]: Delay before orphaned file cleanup
nodiaque opened this issue · 39 comments
Is your feature request related to a problem? Please elaborate.
When using a program like Unpackerr to unpack files with orphaned-file cleanup enabled, a timing issue can sometimes occur where Unpackerr is still extracting/moving files while qbit_manage cleans up the "orphaned" files. This makes Unpackerr fail, and the import in either Sonarr or Radarr fails.
Describe the solution you'd like
Implement a delay timer so that when an orphaned file is found, qbit_manage waits x minutes/hours/days before acting on it.
OR
Do not move the orphaned file immediately; instead, wait out the current timeout before deleting it (that would be good too).
OR
Integrate with Radarr/Sonarr, the way Unpackerr does, so qbit_manage knows whether an "orphaned" file actually belongs to a torrent that is currently being processed.
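To illustrate the first option, a sketch of what it might look like in the `orphaned` section of the config; the `min_age` key is hypothetical and does not exist in qbit_manage today:

```yaml
orphaned:
  empty_after_x_days: 7
  # Hypothetical option (invented for illustration): skip orphaned files
  # younger than this, so in-flight extractions/imports are left alone.
  min_age: 30m
```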
Does your solution involve any of the following?
- New config option
- New command option
Describe alternatives you've considered
Disabling orphaned-file cleanup, but then the files extracted by Unpackerr get left behind.
Who will this benefit?
Anyone using Unpackerr or a similar tool.
Additional Information
No response
Line 240 of qbit_manage/config/config.yml.sample (at ee871b4) should prevent this, as it excludes the unpackerred folder.
I don't understand the need, as this sounds like a config issue rather than a needed feature.
Or are you running Unpackerr against a random orphaned folder in your downloads directory?
If so, why is there a random folder in qBit's download dir? Having random downloads in the same folder as all qBit downloads sounds like user error.
Hello,
Unpackerr is configured to run against the Radarr and Sonarr download folders, nothing else. When it extracts, it creates an `_unpackerred` folder for the extraction. Once the extraction is done, it moves the files back to the original torrent folder (where the archive is). What happens is that during the move, I get an error that the file has moved.
Unpackerr log:
```
2023/06/24 21:49:12 Extraction Error: Your.Name.2016.MULTi.2160p.UHD.BluRay.x265-SHiNiGAMiUHD: reading path /downloads/movie-radarr/Your.Name.2016.MULTi.2160p.UHD.BluRay.x265-SHiNiGAMiUHD_unpackerred: open /downloads/movie-radarr/Your.Name.2016.MULTi.2160p.UHD.BluRay.x265-SHiNiGAMiUHD_unpackerred: no such file or directory
```
Here's my config file; it does have the unpackerred exclusion, so I'm unsure why this happens:
https://pastebin.com/xTTX2jSb
But another problem also happens when the files are moved into the torrent folder itself: sometimes qbit_manage does its orphaned cleanup and finds the freshly extracted files before Sonarr/Radarr has begun processing them (e.g. when downloading a full season), and moves the files before Sonarr or Radarr has finished copying all of them. Since I'm using Unraid with a dedicated disk for downloads, I don't use hardlinks (series, anime, and movies each live on a different array share, and none of them is on the download drive). I hit this problem early on: I had 6 seasons of Black Mirror, and only half were imported because qbit_manage's orphaned-file check moved them out of the torrent folder before the copy was done.
`**/*_unpackerred` should do it.
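For context, a sketch of where that pattern goes in the `orphaned` section (the other entries are illustrative placeholders, not taken from this user's config):

```yaml
orphaned:
  empty_after_x_days: 7
  exclude_patterns:
    - '**/.DS_Store'      # illustrative placeholder entries
    - '**/Thumbs.db'
    - '**/*_unpackerred'  # the pattern suggested above
```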
Ah, that's not what's in the docs.
That clears up the problem for Unpackerr, but not for Sonarr and Radarr (and other monitoring software like them). Since qbit_manage is tied to qBittorrent and not to any other software, it doesn't know that a file inside a torrent folder shouldn't be moved. Maybe the orphaned scan could exclude any folder that a torrent targets? I'm not sure how best to explain the idea, but if Torrent A is in category TV-Sonarr, its download location should be excluded from the scan.
Yes, the config file needed an update.
Hardlinks have no relevance here as far as QBm goes, and nothing to do with orphaned data... but it does mean that, due to your poor setup, every seeding torrent uses double the space, every packed torrent uses triple the space (temporarily), and all downloads are slow and thrash your hard drives so they die an early death.
How are you running or triggering QBm?
Sounds like you're running it with orphaned cleanup too frequently, so it constantly interferes with downloads... and combined with your poor setup, imports take forever as well.
You say poor setup; I think we're entering an argument that doesn't help anything and is beyond scope. I'm not sure you know how Unraid works, but my way of working has its perks.
I only have one HDD, used for all the download churn. This disk has nothing else on it, so if it dies there's no data loss (though it does have array protection from Unraid). All my other shares have their own drives in the array, as Unraid allows. It's not a shoddy setup, and disk wear is not an issue. Also, I don't get performance issues: the download HDD absorbs all the IO from seeding and downloading, which would otherwise cause bad performance when watching content.
Also, I use a cache array of SSDs. When Sonarr/Radarr move files to the right folder, they go through that cache array first, which is by design in Unraid. The download disk doesn't use the cache for read/write, since that would kill the SSDs. Then a process called "mover" takes what belongs on the drive array and moves it off the cache array.
If I were using a simple NAS like TrueNAS, hardlinks would yield good results. Here, hardlinks won't work, because everything would be kept on the download drive instead of on the assigned array drives (with Unraid's own rules about where things go).
The only other way, and even then it wouldn't work properly, would be to put the completed folder on the right array drive. But since the allocation logic in these arrays depends on folder depth, this would break it. For instance, I could have season 1 on drive A and season 2 on drive C; that's fine, and it prevents spinning up multiple disks for the same season. But if the drive is used as a download folder, the downloader knows nothing about where each season currently lives, so Unraid would just put files wherever its other rules dictate. Thus, when the hardlink is made, a season folder could end up spread across multiple drives, which is not the desired outcome.
As for every seeded torrent using double space: yes and no. Non-archive torrents use double space, once on the download drive and once in their final location. Archive torrents use 1x space plus one extracted copy at the final destination. There is temporary double usage after extraction, after the copy to the destination, and before orphan cleanup.
Everyone has double space usage on archive torrents, since they need to be unpacked. So the only space concern in your explanation is non-archive series, but those don't produce orphaned files and thus aren't a problem.
Another way would be to move everything under one share instead of multiple shares, with one big hierarchy under it. A drawback is that you must set "split any directory" instead of a specific level only, which means a season will have episodes on different drives. It also means all your downloads are seeded from the same drives that users are reading from. I had bad IO performance when I tried that in the past. I don't know your seeding and download volume, but with multiple users watching 4K content while 400+ torrents were reading/writing, there were a lot of problems. Since I moved the downloads out of the array, it's very fast. I'm even considering moving the drive completely out of the protected array, since I don't need parity on it; it's just a download drive. We all know reads and writes can kill drives. This way I protect the whole array from seeding wear, because only one drive constantly takes the heavy IO.
##################
Back to the problem.
Files get extracted by Unpackerr. They are then moved into the torrent folder path (which is the completed download folder for Sonarr, Radarr, etc.), so the Starr apps can pick them up and move them. Even in a hardlink scenario, if the files are moved from there to the orphaned-file folder, Sonarr and Radarr won't have time to pick them up. It's all about timing: Sonarr and Radarr have to scan the folder first to find the files, then issue the move/copy/create commands on each file one by one.
Your hardlinks response indicates you have a lot to learn: https://trash-guides.info/Hardlinks/How-to-setup-for/Unraid/ and https://trash-guides.info/Downloaders/qBittorrent/Tips/How-to-run-the-unRaid-mover-for-qBittorrent/
Please answer the other questions regarding how you're running QBm that were skipped.
> Even in a hardlink scenario, if the files are moved from there to the orphaned-file folder, Sonarr and Radarr won't have time to pick them up.
I don't have stats on the number of QBm users... but no one has ever reported the issue you describe. Based on that, it sounds like you're running remove-orphaned every minute or every few minutes, which doesn't make sense; there's no sane reason to run it that frequently. Either that, or you have consistently unlucky timing where QBm always runs in the exact minute or two when Unpackerr has just finished unpacking and the Starr apps are refreshing their queue and slowly copying all the data over.
I already posted my config file before all of this. What's the default setting for QBm? Because that's what I'm running, with only the paths, categories, and such defined. I might be missing something, but the only line I see is
empty_after_x_days: 7
which doesn't say how frequently it runs.
As for the Trash guides, yeah, I've read them many times. But explain this to me: how is having multiple drives constantly doing read/write IO from downloading/seeding, plus all the read IO from the media server, better than having one drive dedicated to the read/write of downloading/seeding and an array of drives dedicated to reads from the media server?
I understand hardlinks; I did read up on them, and as I said, per the Trash guide this requires everything under one share, with the share set to "split any directory" as required. That's not something I want, since I want every episode of a season, for any series, to be on the same drive as the next episode of that season.
The Trash guides make a lot of sense for RAID arrays, or if you just put anything anywhere in your array. But once you start using Unraid's directory splitting and similar features, the guides don't take those into consideration.
If we're talking drive life: in an all-array layout like Trash recommends, you have a higher chance of a drive failure in the array, since every drive is always working. In my scenario, only the non-critical download drive wears out faster. It's been running for 9 years without any problem and is only now starting to show signs of failure. I consider 9 years for a non-stop working drive very good, especially for $100.
Maybe you never spin down your drives; I do. My drives are mostly dormant except for the download drive; they spin up when someone wants something from that particular drive, which can be anywhere in the array.
Yes, the copy from this drive to the cache is slow. That's expected and not an issue, since nothing critical lives on this drive; it's just downloads, which are mostly automated. Even if the copy took an hour (it doesn't), I wouldn't mind. The copy to the SSD cache is very fast, though. The copy from the mover to the array is slower, since I use archive drives there; write speed isn't important to me at that level.
QBm simply runs from the Unraid docker container, connected to qBittorrent. Nothing triggers it externally; it's the built-in timer.
```
| Finished Run |
| Run Time: 0:00:02 |
| Current Time: 23:37 | 30 Minutes until the next run at 00:07
```
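For reference, the interval between runs in the docker setup comes from the schedule setting, not from `empty_after_x_days` (which only controls how long empty folders may linger). A minimal compose sketch, assuming the `QBT_SCHEDULE` environment variable (in minutes) and the image name from the project docs; verify both against the current README:

```yaml
services:
  qbit_manage:
    image: bobokun/qbit_manage   # image name as published on Docker Hub
    environment:
      - QBT_SCHEDULE=30   # minutes between runs; matches "30 Minutes until the next run"
```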
Try adding `**/*_unpackerred/**` to the exclusion list.
Hello, no, because that is not the problem. The problem isn't the unpackerred folder but the original folder; the unpackerred folders are already excluded. I'm not sure you follow what is happening and how Unpackerr works, so to make sure we understand each other, I'll explain it.
qBittorrent starts a download from Sonarr/Radarr. Sonarr and Radarr monitor the torrent and the download folders, which are /downloads/movie-radarr and /downloads/tv-sonarr. These are the completed folders.
My qBittorrent configuration creates one folder per torrent.
Incomplete downloads go to /downloads/incomplete.
When Sonarr and Radarr finish their import process, all torrent-related files are moved to /downloads/movies and /downloads/TV (the category folders).
Once a torrent completes, Unpackerr is notified and waits 1 minute before starting any extraction that is needed. The extraction is done in the <torrent_name>_unpackerred folder. Once the extraction is done, the contents of <torrent_name>_unpackerred are moved back into the original torrent folder.
Then Sonarr/Radarr, on their next scan, see the extracted files, import them via copy into the respective library folder, and then move the torrent contents to the category folder.
The problem is that during the extraction or the copy, especially with multiple files (like a series in a RAR), the cleanup scans the torrent folder in /downloads/movie-radarr or /downloads/tv-sonarr, sees files that don't belong to the torrent, and deletes them, even though the torrent hasn't been imported yet. Because of that, between the moment the extraction finishes and the import by Sonarr or Radarr, it can happen that some files go missing (say it imported 5 out of 10 episodes), or files get erased mid-extraction.
This is why an option to skip cleanup on folders containing an active torrent would be nice. I don't need cleanup of a folder where a torrent currently lives. For now, I've turned off cleanup (which was one of the reasons I picked this app in the first place), and once a month I go into my /downloads folder and delete all the leftover folders once I know every torrent has moved to its category folder.
Your paths look like they'll result in slow, IO-intensive copies rather than atomic instant moves... simply don't run orphaned-file cleanup, or don't run QBm so often that it's constantly cleaning up your validly orphaned unpacked files.
You also don't need QBm at all to clean up after Unpackerr once the Starr apps import; Unpackerr cleans those up.
QBm also runs for tagging and other tasks. Right now the cleanup ran every hour.
Unpackerr doesn't clean up the torrent folder once it's imported. If it does, please show me where to configure that, because all unpacked files are left behind until I clean them up. It does clean up the `_unpackerred` folder, but not the files that were moved into the torrent folder; at least not in the Unraid docker app. I see on the Unpackerr website that there's a cleanup feature, but I don't see any documentation on it.
https://unpackerr.zip/docs/install/unraid
My paths aren't IO intensive at all. Everything is under /downloads on the same drive, which means instant moves. The only copies are the extraction and the copy to the library, which is on another array. Everything else stays on the same drive.
Even with hardlinks, something mentioned earlier in the thread, this problem could happen with a very big extraction. Like I said, the cleanup currently has no failsafe, which leaves everything to chance instead of offering a simple option that checks against qBittorrent whether something really is an orphaned folder. In fact, it already does such a check, or it wouldn't know that a given file is orphaned. So why not implement an option to check at the folder level rather than the file level? Folder exists and is mapped to a torrent? Skip it. Just that would solve everything (and even make the check faster, since it wouldn't have to check every file inside the torrent folder one by one for orphans).
> Unpackerr doesn't clean up the torrent folder once it's imported. If it does, please show me where to configure that, because all unpacked files are left behind until I clean them up.
Incorrect, and clearly explained in the config file, which is mandatory reading and configuration for Unpackerr users... it's also explained in the docs that you only skimmed:
> When the item falls out of the starr app queue, the extracted files are deleted.
https://unpackerr.zip/docs/introduction#starr-logic
The Unraid install docs immediately link to the config docs: https://unpackerr.zip/docs/install/configuration
> Extracts are deleted this long after import, -1s to disable
So if Unpackerr is not cleaning up, you must have made the conscious choice to configure it not to clean up files.
> Folder exists and is mapped to a torrent? Skip it.
So any folder relating to a torrent could contain any random files, and they'd never be removed, nor would the user be notified? That seems very problematic when there is zero need for this functionality, as the request is caused 100% by user configuration error.
Visit the unpackerr discord for support
That variable, `UN_DELETE_DELAY`, is not valid; it does nothing and has no effect on the Starr apps. The correct format is `UN_{APPNAME}_0_DELETE_DELAY`:
https://github.com/Unpackerr/unpackerr/blob/main/examples/docker-compose.yml
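For reference, a minimal compose fragment in the shape of that linked example (the service layout and values here are placeholders; check the example file for the full set of options):

```yaml
services:
  unpackerr:
    image: golift/unpackerr
    environment:
      - UN_SONARR_0_URL=http://sonarr:8989    # placeholder address
      - UN_SONARR_0_API_KEY=<sonarr api key>  # placeholder
      - UN_SONARR_0_DELETE_DELAY=5m           # note the {APPNAME}_0 infix
```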
I can't say; I'm using the official Unraid template from Unpackerr. It might be an old template, since I've had it for a while. I'll update it with the one from the configuration page at
https://unpackerr.zip/docs/install/configuration
OK, so with the new parameter, Unpackerr does clear the files, but it leaves empty folders behind. Normally Sonarr/Radarr would clear those, but since their file move only moves the torrent's own files, the folder isn't empty. Unpackerr's cleanup runs after the import is completed. So I guess that's a request for Unpackerr, then.
Radarr & Sonarr never move torrents that are not marked as complete and done seeding (I.e. goal hit) that will always hardlink or copy
unpackerr always cleans up the unpacked files after Starr imports them. It will not cleanup the rars nor should it since it would cause Hit and Runs
Again, for Unpackerr support, visit Unpackerr's Discord.
I expressed myself badly. Sonarr changes the category in qBittorrent when the import finishes, which then triggers a move of the files (because the finished category isn't in the same folder as the download folder).
I know Unpackerr won't touch the RARs or other files, and that's fine; that wasn't the "problem".
So in the end, I'll simply not use orphaned-file cleanup, since nothing will be done about the "check whether the file is imported" idea, and I'll ask the Unpackerr folks whether their cleanup can also detect and delete empty folders.
@bobokun the use case is: Unpackerr is still unpacking, or a Starr app is still importing.
Based on some additional comments by the OP, suggested features:
- ignore a specific list of categories for orphaned files
- add some sort of delay
- communicate with the Starr apps and don't orphan torrents that haven't been imported (unlikely and out of scope)
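Neither option exists today; purely as a sketch of what the first two suggestions could look like in the `orphaned` config section:

```yaml
orphaned:
  # Both keys below are hypothetical, sketched from the suggestions above.
  exclude_categories:    # skip folders of torrents still in these categories
    - tv-sonarr
    - movie-radarr
  min_age: 1h            # don't touch orphaned files younger than this
```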
I would add the use case where the Starr apps are faulty, broken, shut down, or anything else that keeps imports from working. Files would then be unpacked but never imported before qbit_manage runs.
Besides a category setting, another option would be to list the folders currently used by torrents and not clean those. To make it better, don't clean a download folder that isn't in the imported category: if the torrent is still in the download category, consider it incomplete and wait for it to move to the completed one (I think it's called the library in qbit_manage's jargon; I need to recheck the config file).
The goal here is to make the program more robust. Right now we're simply hoping for the best, i.e. that everything works perfectly. But the machine could be under heavy load, qbit_manage could be restarted and run right away, a Starr app could fail, etc.
I think giving the program more failsafes isn't a bad thing.
To add my own perspective:
qbit_manage does a lot of things, and some of them are more time-sensitive than others. I have it configured to run very often so labels and categories land where I want them as soon as possible. However, this means other, perhaps less critical items (like orphan checks) also happen very frequently. Even if I accept running it less often, there is still a chance of a race condition if things line up just right.
As described earlier, the issue isn't the Unpackerr working folder directly (which can be excluded as noted), but the final unpacked file, which is put in the original download folder with no additional context: it's not put in a subfolder with the `_unpackerred` name, nor does it carry that suffix itself. Sonarr/Radarr need the file in order to import, but if qbit_manage comes along and removes it because it's not in the torrent's file list (and it's impossible to craft an exclude rule for it), then you get a corrupt import, or the import simply never happens because the file was cleaned before Sonarr/Radarr could get to it, which stalls everything. This can happen any time qbit_manage runs while an import is pending; running more frequently just makes it more likely.
I like the idea of delaying the cleanup of orphans from the original folder, but given that qbit_manage is largely stateless, I can see it being a challenge to implement easily.
I really want to be able to clean orphaned data. In the past I used ruTorrent, which was mostly good at cleaning up after itself but still missed things. I'm pretty sensitive to unwanted items taking up space, and the orphan cleanup would help me immensely. However, the risks with Unpackerr (or other similar utilities that need the active folders as working directories) make it unfeasible right now.
A very simple solution would be to skip cleanup for torrents in a specific category. If my torrent is still in the download category, exclude the torrent's path from the orphaned scan. Once the Starr apps move it to another category, the next scan of the download folder will find any leftovers. This is the safest approach; no need to communicate with any Starr apps.
I skimmed through this issue, and since a lot has been said, I hope I'm not making any wrong assumptions here.
I also hit this issue with files that are successfully unpacked by Unpackerr but haven't yet been imported by an *Arr app, and as such haven't been removed by Unpackerr.
At first sight this might seem to indicate that I'm running QBM too often, but it's set to the default 30 minutes.
The problem lies in the fact that my Sonarr instance will not import a file as long as the episode has a "TBA" episode title.
This means the file can stay orphaned until I or someone else assigns a title to the TVDB entry (plus caching time), or until I force the import myself, which might take a few days.
I see two possible solutions:
- Unpackerr could add a config property to leave a suffix on the unarchived file (e.g. `_unpackerr.`) that we could ignore in QBM, or even leave the unpacked file in the `/unpackerr/` subfolder.
- QBM could add a config property for a delay on orphaned files based on created/modified date, so we can minimize this situation.
That would also be useful for situations where you're copying/moving files manually and QBM could accidentally remove them.
The first option seems the most logical, but I'm not sure whether it could impact other flows.
The second option wouldn't hurt, though, and would be useful in other situations too.
Thanks for your effort and support!
@NGDM You can set Sonarr to import TBA episodes; it's a setting. But it does create files named "TBA" if you use the episode title in the file name. I have series that have been TBA for many years, so I had to enable it on my end.
QBM has an ignore list in which you can put the Unpackerr folders; how to do that is already discussed earlier in this thread. Unpackerr already appends `_unpackerred` to its extraction folders, and QBM can ignore those.
A simpler option would be for QBM to check whether the torrent is still in the download category in qBittorrent, since it already has that connection. If the torrent is still there, don't clean. Not just checking whether a file belongs to a torrent in that folder, but explicitly skipping the folder when it holds an active torrent in the download category. This doesn't require any connection to the Arr apps, and since QBM already queries each torrent's file list, it wouldn't be too hard.
> At first sight this might seem to indicate that I'm running QBM too often, but it's set to the default 30 minutes.
The default was changed from 30 to 1440 (minutes) because the checker kept deleting Unpackerr files. I'd suggest using the new default, not 30 minutes.
If you put the exclusion in place, the `_unpackerred` folder won't be deleted. But the problem is really that, after Unpackerr finishes, the files are copied into the torrent folder so the *Arrs can do their job; that's when the cleanup bites. Just skip the download category and everything will be fine, whatever your config is.
> A very simple solution would be to skip cleanup for torrents in a specific category.
You can do so already by adding the category's root folder to the list of exclusions.
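Using the paths from this thread, that would be something like:

```yaml
orphaned:
  exclude_patterns:
    - '/downloads/completed/unsorted/**'   # the pre-import (download) category root
```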
Doing that removes the cleanup entirely, which makes it useless, since that folder is exactly what we're trying to clean.
Are the pre- and post-import categories you use for the *Arrs under the same root folder? If you have pre-import downloads in /movies/pre-import and all processed movies stored in /movies/post-import, then you can exclude just the pre-import folder.
My structure is as follows:
- `downloads/completed/Unsorted/movie-radarr | tv-sonarr` => the qBittorrent completed folders for the Sonarr and Radarr imports, based on the category set by the *Arr download category, using automatic torrent management. It's also where Unpackerr runs.
- `downloads/completed/movie | tv` => the folders used once a torrent has been imported by the *Arr software, moved via the *Arr completed-import category.
- `downloads/incomplete/torrent` => currently downloading, incomplete torrents.
My qbit_manage config is:

```yaml
cross_seed:
root_dir: /downloads
remote_dir: /downloads/
orphaned_dir: /downloads/orphaned_data
# torrents_dir:
# recycle_bin:
torrents_dir:

cat:
  # Category & Path Parameters
  # <Category Name>: <save_path>  # Path of your save directory.
  Movies: /downloads/completed/movies
  TV Show: /downloads/completed/tv
  unsorted: /downloads/completed/unsorted/
```
Now, the place where cleanup is needed is under `downloads/completed/unsorted/*`. Files also get extracted by Unpackerr into that folder, so multiple problems can happen.
First: the cleanup might run while Unpackerr is unpacking and delete the Unpackerr folder. This is easily fixed with:
```yaml
exclude_patterns:
  - '**/_unpackerred'
  - '**/*_unpackerred'
  - '**/*_unpackerred*'
```
Second: after the extraction, the files are copied back into the torrent folder and the Unpackerr folder is deleted (by Unpackerr). The *Arr must then be notified to import, which isn't always immediate. Between the end of the extraction, the notification, and the import, the cleanup can run and delete the extracted files, causing a failed import. Even a 24-hour delay wouldn't solve this, since the 24-hour mark could land right at import time on a busy server (mine is constantly importing, downloading, extracting, etc.).
Third: during the import itself, the cleanup can start, deleting files that are still in the import queue and causing a failed import.
The cleanup currently walks the folders and checks whether each folder belongs to a torrent or category. If it belongs to neither (exceptions aside), it deletes it. If it's a torrent folder, it enters it, compares the torrent's files/folders against what exists on disk, and deletes everything that doesn't belong to the torrent. Adding an exclusion that skips any folder belonging to an active torrent in a given category would solve all of this: it would still clean folders and files that belong to nothing, while keeping hands off the things it shouldn't touch.
Excluding the root_dir won't solve it; that just makes the feature do nothing, like disabling it. And if I change the root_dir to the download folder, nothing changes, since that's already where the problem is.
The goal of the cleanup here is: after the *Arr app has imported the files and changed the category, and qBittorrent's automatic torrent management has moved the torrent's files to the new category path (which qbit_manage knows from the config), the remaining files/folders from the extraction (or anything else) need to be cleaned up.
I fail to see how excluding "/downloads/completed/Unsorted/" doesn't resolve this issue.
I have never heard of any user having unpacked RAR files sitting for over 24 hours due to load; perhaps you need to do less, or upgrade to a beefier system, rather than request workarounds for edge cases.
I guess a Xeon Gold W-2275 with 128 GB isn't beefy enough? I just imported a 1.5 TB torrent; do you think that took 5 minutes?
And sitting 24 hours? I think you missed the part where I said it runs 24/7, not that files wait 24 hours. When will that 24-hour tick land? At 5:15 am? At 8:26 pm? You don't know, because it depends on when the last run happened, and that shifts: a backup taking longer restarts the docker (and could trigger two runs in the same day), a docker update restarts it (another run the same day), and so can any other restart. And at that moment, is it safe to say no extraction is currently being imported? No. How big is the chance it happens? That depends on many parameters and varies from user to user and system to system.
And don't forget qbit_manage does far more than cleanup. That's why I installed it, and I've been running it without cleanup, for the other features, since I created this thread.
Relying on "timing" to call it "safe", when there's an easy fix that makes it totally safe; why not take it? There are already exclusion rules, category management, and a qBittorrent connection that provides torrent file information. It's not hard to add a setting so that a category marked as holding active torrents is never cleaned while a torrent lives in the folder.
But I guess making it more robust and less error-prone isn't a good thing; better to let the occasional error fall on the destructive side.
Oh, and I did ask Unpackerr whether they could delete the folders. They went as far as they could and stopped at "if we go the last mile, we might sometimes delete the torrent's files by mistake." That is a great reason not to go further: security and safety. Make the software safe in all scenarios, not just the ones we assume are the right ones.
> I fail to see how excluding "/downloads/completed/Unsorted/" doesn't resolve this issue.
I really don't get what you don't get. I'm sorry; I'm trying to understand what isn't clear.
The issue is that I need the folder /downloads/completed/Unsorted/ to be cleaned of orphaned files. That's where things get extracted and imported; orphaned files are created nowhere else, so that's exactly where cleanup is needed. Excluding that folder removes the only folder that needs cleanup from the cleanup. So what does excluding it accomplish? The same as setting cleanup: no; it disables the feature, since there will never be anything to clean anywhere else. And that makes it useless, because cleaning this folder is the very reason the cleanup was needed.
- It can be made specific to the Starr pre-import category.
- This is really an X-Y problem at this point. Why do you constantly have orphaned data in your pre-import folders? What is causing it?
The Starr pre-import category folder is the one that needs cleanup. The cleanup is needed because of Unpackerr files and folders that get left behind; Unpackerr sometimes leaves folders behind that cannot be erased. The cleanup needs to handle those, in the import folder, since they're left inside the torrent folder.
The pre-import category is the download folder; it's the unsorted folder.
Torrents go into the incomplete/torrent/ folder. Once complete, they're moved to completed/unsorted/* based on category (by qBittorrent). That's where they get extracted and imported, and where orphaned files/folders get left behind.
> Unpackerr sometimes leaves folders behind that cannot be erased.
That is an Unpackerr problem to solve with Unpackerr; it has nothing to do with QBM.
It does not make sense to add an entire set of logic to handle your specific, unique use case.
> The cleanup is needed because of Unpackerr files and folders that get left behind.
You should jump into my Discord so we can figure out what's causing things to be left behind: https://golift.io/discord