ScriptTiger/Unified-Hosts-AutoUpdate

How to uninstall?

alainib opened this issue · 11 comments

Hello,
I tried this script on Windows 10, and after a restart it completely blocked my internet.

How do I uninstall it, please?

Just run the script and tell it that you do not want to update, and it will ask you if you want to remove it from your hosts file. After it runs, it will ask if you want to view your hosts file, at which point you can open it to verify it is back to normal.

Did you try compression level 9? Usually this solves the problem for most people experiencing extreme problems such as yours.

Also, as far as removing the task, same thing as above. When the script asks you if you want to run your current task, simply reject and it will then ask if you want to remove your current task. Everything can be done by running the script itself.

Off topic, but since this issue brought it to mind, I'm tempted to ask a related question.

Hosts_Update.cmd > tl;dr, but do you whitelist raw.githubusercontent.com in the script to at least rule out a "FP" on that domain that could prevent updating the hosts file?

For other readers, the reason I cited "FP" is that there is actually BadWare hosted on GitHub that could legitimately put that domain on malware domain lists.

Hosts_Update.cmd > tl;dr, but do you whitelist raw.githubusercontent.com in the script to at least rule out a "FP" on that domain that could prevent updating the hosts file?

@spirillen, that's actually a good point. Something like that would be possible in the future: if raw.githubusercontent.com were added to one of the hosts lists, or if someone manually added it to their custom list, then the script would essentially be blocking itself from future use. However, raw.githubusercontent.com is not currently being blocked by any of @StevenBlack's lists.

For other readers, the reason I cited "FP" is that there is actually BadWare hosted on GitHub that could legitimately put that domain on malware domain lists.

While this is possible, in most cases projects that contain malware link to it from their GitHub subdomains, like these:

www.financial-times-appp.github.io
financial-times-appp.github.io
www.superlogout.github.io
superlogout.github.io

Cross-site scripting or other means of linking to assets stored on raw.githubusercontent.com doesn't work out in many cases because GitHub already has some protections in place there. My script downloads assets stored there, but it does not embed them into a Web page, which involves entirely different processes that GitHub already interrupts in many cases. GitHub's GitHub Pages feature, which lets you host <user>.github.io, is designed for Web content delivery, while GitHub actively thwarts attempts to do the same using raw.githubusercontent.com.

I have actually seen:

github.com
github.io
raw.githubusercontent.com
gitlab.io
gitlab.com
bitbucket.io
bitbucket.org

on malware lists from https://www.malwarepatrol.net/

But nonetheless, I guess the suggestion would simply be to do a check for the source URL in the hosts file:

# In bash on Linux
if grep -qF "${source_domain}" /etc/hosts
then
  # Delete the blocking entry in place (GNU sed)
  sed -i "/${source_domain}/d" /etc/hosts
fi

⚠️ note: written quickly, might contain errors, as it is not tested or checked ⚠️

While I do love GitHub, I also don't aim to censor the list in any way, even if that means blocking GitHub. Since there is only the one domain used by my script (raw.githubusercontent.com), it would actually be quite easy to just do a quick check at the start of the script, as you suggest, @spirillen. If it really is blocked, the script can create a new temporary hosts file without that entry to use while it's running, and then replace it with the updated list as usual at the end, which may or may not block it again.
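To illustrate, here is a minimal sketch of that idea, written in the same bash style as the snippet above (the actual script is a Windows batch file, and the variable names here are hypothetical):

source_domain="raw.githubusercontent.com"
hosts_file="/etc/hosts"

# If the source domain is currently blocked, temporarily swap in a copy
# of the hosts file with that entry stripped out so the download can resolve.
if grep -qF "${source_domain}" "${hosts_file}"
then
  temp_hosts="$(mktemp)"
  grep -vF "${source_domain}" "${hosts_file}" > "${temp_hosts}"
  cp "${temp_hosts}" "${hosts_file}"
fi

# ... download the updated list here ...
# The freshly downloaded list then replaces the hosts file as usual at the
# end of the run, which may or may not block the source domain again.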

I know @ohadschn will probably make a bunch of comments about why that's completely unnecessary, since the script could just make a cURL or Wget call directly to the IP and set the host header to completely bypass DNS... But obviously we aren't going down that road again...
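For reference, that bypass would look something like the following using cURL's --resolve option, which pins a hostname to a fixed IP without consulting DNS or the hosts file (the IP shown is one of GitHub's published addresses for raw.githubusercontent.com and is not guaranteed to stay the same):

# Download @StevenBlack's unified hosts list while bypassing name resolution
# entirely; the pinned IP is illustrative and subject to change:
curl --resolve raw.githubusercontent.com:443:185.199.108.133 \
  -o hosts.txt \
  https://raw.githubusercontent.com/StevenBlack/hosts/master/hosts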

Also, the aforementioned ideas are just how I would implement it if it should become a problem. However, as I have also stated in the past, I am not looking to solve problems people don't actually have. So I'll wait for someone to actually report this as a real problem before implementing any solution for it.

@spirillen, @ohadschn has been helping to hone the script a bit with a lot of the core functionality that I'll be able to reuse again for your Unbound script, notably when it comes to handling downloads, error handling, debugging, and even logging. So hang in there and the future for your script is looking brighter each day with the continued improvement of this one.

And now that we have successfully and completely derailed this issue, @alainib, please let us know what's going on with you at your earliest convenience. As I said, just running the script itself offers you options to remove the scheduled task and also remove all of the blacklist entries from your hosts file. However, using compression level 9 usually helps solve most problems. So let us know what's going on, as we're eager to help resolve your issue.

Hello,
I re-ran the last version.
I set compression to 8. The internet works, but there is a 10-second "loading" delay when the internet first starts.

What does the compression do?
Maybe adding only a few entries to the hosts file, with the most known bad sites, will speed it up.

I have a Ryzen 2600 and an M.2 NVMe disk.
Anyway, thanks for your work.

The "compression" is adding x number of domains per line within the hosts file rather than only one domain per line.

ex:
Compress 0

127.0.0.1 bnc.lt
127.0.0.1 custom.bnc.lt
127.0.0.1 dev.bnc.lt
127.0.0.1 epsilon.thirdparty.bnc.lt
127.0.0.1 thirdparty.bnc.lt
127.0.0.1 www.bnc.lt
127.0.0.1 h5.analytics.126.net
127.0.0.1 data-ero-advertising.com
127.0.0.1 www.data-ero-advertising.com
127.0.0.1 ero-advertising.com

Compress 3

127.0.0.1 bnc.lt custom.bnc.lt dev.bnc.lt
127.0.0.1 epsilon.thirdparty.bnc.lt thirdparty.bnc.lt www.bnc.lt
127.0.0.1 h5.analytics.126.net data-ero-advertising.com www.data-ero-advertising.com
127.0.0.1 ero-advertising.com
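
For anyone curious how such a transformation works, it can be sketched in a few lines of awk (illustration only; this sketch assumes 127.0.0.1 entries, drops all other lines, and is not how the actual Windows batch script does it):

# Join up to 3 blocked domains per 127.0.0.1 line (compression level 3):
awk '$1 == "127.0.0.1" {
  domains[n++] = $2
  if (n == 3) {
    printf "127.0.0.1"
    for (i = 0; i < n; i++) printf " %s", domains[i]
    print ""
    n = 0
  }
}
END {
  if (n > 0) {
    printf "127.0.0.1"
    for (i = 0; i < n; i++) printf " %s", domains[i]
    print ""
  }
}' hosts > hosts.compressed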

What does the compression do?

For more information on compression, please reference the following issue:
#23

Maybe adding only a few entries to the hosts file, with the most known bad sites, will speed it up.

If you want to try further slimming down your hosts file, I can tell you now that the porn blacklist is by far the largest extension, so not blocking porn will free up a lot right there. When you are prompted "Would you also like to block other categories?", reject this and try without any categories to see if you notice any improvement. You can also experiment with different combinations that suit your use case and see what works best for you.

@alainib, I am closing this issue since it sounds like everything is working for you, although you still need some tweaking for performance. We can continue to discuss your case on this issue even though it's closed, but everything after this point is just out of scope of the original topic.