SpaceinvaderOne/Unraid_ZFS_Auto_Dataset_Update

Version2 - Skipping folder due to insufficient space

Opened this issue · 7 comments

I am using version 2, and when doing a dry run I am getting these errors.

Script location: /tmp/user.scripts/tmpScripts/ZFS_Auto_Dataset_Update/script
Note that closing this window will abort the execution of this script
Checking if anything needs converting
Source cache_appdata/appdata is a dataset and valid for processing ...
Folders found in cache_appdata/appdata that need converting...
Checking Docker containers...
Processing folder /mnt/cache_appdata/appdata/DiskSpeed...
Folder size: 1.0K
Skipping folder /mnt/cache_appdata/appdata/DiskSpeed due to insufficient space
Processing folder /mnt/cache_appdata/appdata/binhex-krusader...
Folder size: 71M
Skipping folder /mnt/cache_appdata/appdata/binhex-krusader due to insufficient space
Processing folder /mnt/cache_appdata/appdata/clamav...
Folder size: 1.0K
Skipping folder /mnt/cache_appdata/appdata/clamav due to insufficient space
Processing folder /mnt/cache_appdata/appdata/speedtest-tracker...
Folder size: 372M
Skipping folder /mnt/cache_appdata/appdata/speedtest-tracker due to insufficient space
The following folders were successfully converted to datasets:

However, if I try version 1, the issue is not there.

Same issue as above... reverted to v1 and it works fine...

Same problem, using v1 for now.

Looks like the issue is line 334, which contains:

zfs list -o name | grep -q "^${source_path}/"

The trailing slash in the grep expression is the issue: the output of `zfs list` gives names like "cache/appdata", not "cache/appdata/". I haven't dug into the code, so I'm not sure if Ed had something else in mind with that trailing slash.
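The behavior is easy to reproduce without ZFS. A minimal sketch of the problem (the dataset names below are stand-ins for real `zfs list -o name` output, and the `(/|$)` pattern is just one possible fix, not necessarily what the actual patch does):

```shell
#!/bin/sh
# Simulated output of `zfs list -o name`; a real zfs pool is not needed for this demo.
zfs_names="cache_appdata
cache_appdata/appdata"

source_path="cache_appdata/appdata"

# Original check (line 334): the trailing slash means the pattern can only
# match CHILD datasets, never the dataset itself, so the existing dataset
# "cache_appdata/appdata" is never found.
printf '%s\n' "$zfs_names" | grep -q "^${source_path}/" \
  && echo "original pattern: found" \
  || echo "original pattern: not found"   # prints "not found"

# One possible fix: match the dataset name exactly OR any child of it.
printf '%s\n' "$zfs_names" | grep -qE "^${source_path}(/|$)" \
  && echo "fixed pattern: found" \
  || echo "fixed pattern: not found"      # prints "found"
```

With the original pattern the dataset lookup always fails, which would explain why every folder falls into the "insufficient space" branch.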

Same issue, except v1 doesn't work for me either: no folder is converted, and there's no error message...

I had the same problem and solved it with this PR: #8

I had the same problem and solved it with this PR: #8

Awesome, thanks! Worked for me too.

I had the same problem and solved it with this PR: #8

Thanks! That fixed it for me too.