Rogue Element identified in new blocklist
shown19 opened this issue · 18 comments
Hi, noob question: may I know why I'm getting this error? What does "rogue element" mean?
Rogue element: '447216: local=/antonio' identified in new blocklist.
By the way, I'm using the hagezi blocklists. The rest of his blocklists are working, but if I include the threat intelligence feed dnsmasq link provided, I get that error. Is this a compatibility issue?
The message must come from another blocklist; my lists do not contain local= entries:
curl -sL https://gitlab.com/hagezi/mirror/-/raw/main/dns-blocklists/dnsmasq/tif.txt | sed -n '447216p'
address=/inov2elate.com/#
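For example, the following quick check (assuming curl on the router, as above) should print 0 for the TIF dnsmasq list, since the raw lists use address=/.../ entries rather than local=:
# Count local= lines in the raw list; expected to print 0.
curl -sL https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/tif.txt | grep -c '^local='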
@hagezi Wow, I get it now. I'm also using your Pro list, so maybe that's the reason, right? It all makes sense now. Thank you for your fast response, appreciate it. :)
The Pro list also does not contain a local= element.
Hey @hagezi, plenty of people from the OpenWrt community now use one or more of your excellent lists, and this little 'adblock-lean' service script for OpenWrt has proved fairly popular - see this post on the OpenWrt forum. Thank you for maintaining the lists!
I don't know if the address=/ elements are converted to local=/ during import and something goes "wrong".
@shown19 a good suggestion was put forward here: namely, it could be that the head -c call here is truncating your downloaded blocklist file part. Is the blocklist file part larger than 20 MB? If so, consider increasing the following value in your config file:
# Maximum size of any individual downloaded blocklist part
max_blocklist_file_part_size_KB=20000
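Roughly speaking, the downloaded part is capped at that size, so anything larger gets cut off mid-line, and the broken final entry then fails the rogue element check. A minimal sketch of the mechanism (not the actual adblock-lean code; names here are illustrative):
# Sketch only: a byte cap applied with head -c cuts the stream mid-line.
url=https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/tif.txt
max_blocklist_file_part_size_KB=20000
curl -sL "$url" | head -c "$((max_blocklist_file_part_size_KB * 1024))" > /tmp/blocklist.part
# The last entry may be cut off mid-domain (e.g. 'address=/anto'), which later
# shows up as a rogue local= element after conversion.
tail -c 40 /tmp/blocklist.part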
The Pro list also does not contain a local= element.
@hagezi Weird, but I only used your lists.
@lynxthecat I set max_blocklist_file_part_size_KB from 20000 to 50000 and then to 100000 just to test, but I still get the error unless I remove the threat intelligence feed, and then it works again.
These are the lists I am using:
blocklist_urls="https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/pro.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/doh-vpn-proxy-bypass.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/dyndns.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/hoster.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.amazon.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.apple.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.huawei.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.winoffice.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.tiktok.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.tiktok.extended.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/native.lgwebos.txt
https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/tif.txt"  <-- successful if this one is removed
Restarting adblock-lean output this; maybe there's a hint here?
Downloading new blocklist file part from: https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/tif.txt.
Download of new blocklist file part from: https://raw.githubusercontent.com/hagezi/dns-blocklists/main/dnsmasq/tif.txt succeeded (downloaded file size: 28162 KB).
Cleaning whitespace and formatting blocklist file part as local=/.../.
sed: write error
Successfully generated preprocessed blocklist file with 465097 line(s).
Processing and checking new blocklist file.
Preprocessed blocklist file size: 13080 KB.
Removing duplicates from blocklist file.
Duplicates removed.
Found local allowlist with 38 lines. Removing (sub)domain matches from blocklist.
Removal of allowlist (sub)domain matches from blocklist complete.
Checking for any rogue elements.
Rogue element: '447847: local=' identified in new blocklist.
New blocklist file check failed.
Yes, that will be the problem; the TIF list is ~30 MB.
But how come I still get the issue even though I already set max_blocklist_file_part_size_KB=100000? Or does it use disk space? The router still has a lot of RAM; it's the 512 MB variant.
Edit: sorry, I thought I was replying to @lynxthecat; I got confused here. So basically, I set it to 100000 but am still getting the issue.
This looks problematic:
sed: write error
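If you check the flagged line, it will very likely turn out to be a cut-off entry left behind by the failed write. For example (the path below is just a placeholder for wherever adblock-lean writes the generated blocklist on your router):
# Inspect the flagged line and the end of the file; a bare 'local=' suggests
# the write was cut short.
sed -n '447847p' /tmp/blocklist
tail -n 1 /tmp/blocklist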
What could be the possible cause of this sed: write error?
Are you sure you have sufficient available free memory (try with free -m)?
@lynxthecat I think so, here's the output:
root@Openwrt:~# free -m
              total        used        free      shared  buff/cache   available
Mem:         444464       52924      186108      180984      205432      173760
Swap:             0           0           0
@dave14305 any ideas here? Does busybox sed have a file size limit?
@shown19 could you perhaps try the full version of sed: opkg install sed?
/tmp is generally only half the size of free memory. Probably just too many lists.
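For example, /tmp on OpenWrt is a tmpfs whose size is fixed at boot (typically to about half of RAM), so its capacity is the limit here rather than the free memory reported by free:
# Show the size and usage of the /tmp tmpfs.
df -k /tmp
# If needed, a tmpfs can usually be grown in place; the size below is
# illustrative and this is untested on this particular device.
mount -t tmpfs -o remount,size=384M tmpfs /tmp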
You're right. After opkg install sed as suggested by @lynxthecat, I got this output, this time with more details:
sed: couldn't write 28 items to stdout: No space left on device
Could I still expand /tmp, though?
Anyway, if it's complicated then I won't force the TIF list; I think hagezi's other lists are already enough for my needs. Thank you very much, guys, for all the help @lynxthecat @dave14305 @hagezi
You're welcome. I think I might try to add a check for truncation based on the head -c call.
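Roughly, the idea would be something along these lines (a sketch only, with illustrative variable names, not the actual change):
# If the capped download comes out at exactly the cap size, it was almost
# certainly truncated, so fail early with a clear message.
max_bytes=$((max_blocklist_file_part_size_KB * 1024))
if [ "$(wc -c < "$part_file")" -ge "$max_bytes" ]; then
    echo "Downloaded blocklist file part appears truncated; consider increasing max_blocklist_file_part_size_KB."
    exit 1
fi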
I think I might try to add a check for truncation based on the head -c call.
Done: e7f1162.