Glad to hear you're back on track. I'm still using hostsblock and it's still working flawlessly.
PS: Hope the baby's alright.
Offline
The baby is all right. He got Hand, Foot, and Mouth Disease, and then gave it to me (which is very uncommon!). We're both fully recovered.
Here's a bit of the changelog from the current pre-alpha up on github:
(07.25.2014)
PRE-ALPHA RELEASE
*new main config file: /etc/hostsblock/hostsblock.conf
*new command-line functions that override configuration file location and verbosity
*new verbosity levels (see the sketch after this list):
Level 0: Only fatal errors
Level 1: Level 0 + non-fatal errors
Level 2: Level 1 + updates to cache files
Level 3: Level 2 + narration of each major phase
Level 4: Level 3 + step-by-step details of all processes
Level 5: Level 4 + stdout/stderr from sub-processes like curl, zip, 7za, etc
*functions common to hostsblock and hostsblock-urlcheck split off into library /usr/lib/hostsblock-common.sh
*hostsblock-urlcheck now lists which blocklists affect a given url
*logic to detect dns caching daemons (e.g. dnsmasq) and offer configuration instructions
*post-download processing of blocklist files now handled in parallel for greater performance
*general performance improvements in compiling large hosts.block files (more grep, less sed)
*general performance improvements for hostsblock-urlcheck (more grep, less sed)
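To give a rough idea of how the verbosity levels are used internally, here is a minimal sketch of the kind of notify helper involved. The real function in /usr/lib/hostsblock-common.sh is more involved, and the "verbosity" variable name here is only illustrative:

_notify() {
    # $1 = message level, $2 = message text
    # Print only messages at or below the configured verbosity level.
    if [ "$1" -le "${verbosity:-1}" ]; then
        echo "$2" >&2
    fi
}

_notify 0 "fatal error, always shown"
_notify 3 "phase narration, shown at verbosity level 3 and up"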
TODO (looking for volunteers!):
*Write up man pages for hostsblock, hostsblock.conf, and hostsblock-urlcheck
*Write up systemd files (hostsblock.service and hostsblock.timer)
*Review code (looking especially for grep hackers...I use grep A LOT, and would like to simplify some of those instances for performance and code-readability reasons)
Check out hostsblock for system-wide ad- and malware-blocking.
Offline
thank you gaenserich for your work and others for helpful posts! works great.
i was wondering, would i be able to add these lists to my setup?
https://github.com/wiltteri/wiltteri
looks like they are in some adblock-specific format, but i don't know.
also - if i block a domain with hostsblock-urlcheck...what's the best way to apply the changes immediately? (dnsmasq & kwakd)
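right now, after blocking something with hostsblock-urlcheck, i just do something like this and the block seems to take effect (assuming dnsmasq runs as a systemd service; i don't think kwakd needs anything, since it only serves the blank pages):

sudo systemctl restart dnsmasq    # make dnsmasq re-read the hosts/addn-hosts files
# a SIGHUP also makes dnsmasq re-read them, without a full restart:
sudo pkill -HUP dnsmasq

but i don't know if that's the intended way.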
Offline
Lately, hostsblock hasn't been working for me. I just reinstalled my system for the second time in a month (don't ask), and on both fresh installs it fails to block advertisements on a large number of websites where it used to block everything.
I have followed the instructions on the GitHub page, but here is a short recap of what I have done:
1. Copy /etc/hosts to /etc/hostsblock/hosts.head.
2. Edit the configuration file to reflect not using a dns caching daemon per the instructions.
3. Run hostsblock as root.
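In command form, this is roughly what that looks like; the config file name and the exact option comments depend on the hostsblock version, so treat the paths and the comment as illustrative:

sudo cp /etc/hosts /etc/hostsblock/hosts.head
sudoedit /etc/hostsblock/hostsblock.conf    # followed the "not using a dns caching daemon" comments
sudo hostsblock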
What can I do to troubleshoot this?
If you can't sit by a cozy fire with your code in hand enjoying its simplicity and clarity, it needs more work. --Carlos Torres
Offline
Hi gaenserich, I just posted on the AUR page so forgive me if it's bad form to post here on the same topic.
I have a cron job that runs hostsblock once a week. I noticed that my hosts.block file has not changed since the end of July, so after investigating I figured out that the hostsblock script bails out if the curl command can't reach one of the blocklist sites (in my case it was the ismeh.com domain). I solved my problem by commenting that domain out in rc.conf, but when a domain isn't reachable, shouldn't the loop continue processing the rest of the domains in the list and then finally update the hosts.block file, instead of bailing out?
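For reference, the workaround is just commenting the dead entry out of the blocklists array in rc.conf, roughly like this; the other URLs are only examples taken from my list, and the commented line is a placeholder for the actual ismeh.com URL:

blocklists=(
    'http://winhelp2002.mvps.org/hosts.zip'
    'http://someonewhocares.org/hosts/hosts'
#   'http://URL-OF-THE-ISMEH-COM-LIST'    # unreachable at the moment, so skipped
)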
Thanks, this is great work that you've done.
Offline
^ That indeed was my issue! I commented out the ismeh.com domain, ran hostsblock and away my ads went.
If you can't sit by a cozy fire with your code in hand enjoying its simplicity and clarity, it needs more work. --Carlos Torres
Offline
Hi Unia, that's great to hear. In your case, since you reinstalled recently, you would have had no hosts.block file at all, thus all the ads etc.
Offline
Thanks setone!
I had been trying to get it to work and that was my problem too!
You can like linux without becoming a fanatic!
Offline
setone's suggestions helped: the files are being downloaded to /var/cache/hostsblock and the log is /var/log/hostsblock.log.
Does the hosts file get used from that location? /etc/hosts doesn't show any changes, except when sites are added individually with the 'hostsblock-urlcheck' command.
Offline
hello gänserich and contributors,
i have a small request: could you add some more exit codes to the script? i'd like to use it inside another script, and that script needs to know whether hostsblock completed (more or less) successfully or failed.
i'd suggest 0 = all good, 1 = all failed, and a third code if not all blocklists could be downloaded/updated.
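just to illustrate, my calling script would then do something like this (the 0/1/2 codes are only my suggestion, not anything hostsblock returns today):

hostsblock
case $? in
    0) echo "hosts file updated fine" ;;
    2) echo "updated, but some blocklists could not be downloaded" ;;
    *) echo "hostsblock failed, keeping the old hosts file" ;;
esac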
anyhow, thanks for a great script.
been using it for maybe a year now and it hasn't failed me yet. not needing adblock is great.
whenever i thought hostsblock was blocking some serious surfing action, i could always whitelist the entry in question. firefox doesn't even have to be restarted for that to take effect. the few times that didn't help, the culprit was never /etc/hosts.
Last edited by ondoho (2015-01-12 07:07:04)
Offline
I noticed these extra entries in hosts.block on 25-01-2015:
127.0.0.1 0a5671150.bestgarciniaarab.eu
127.0.0.1 <feff>bmgqde.remedialdrugsdeal.ru
127.0.0.1 <feff>cpwokx.bestrxwarehouse724.ru
127.0.0.1 <feff>ns1.copotu.ru
127.0.0.1 <feff>ns1.saletabletipad.com
127.0.0.1 <feff>ns2.wkimed.in
127.0.0.1 www.googleadservices.com
BTW -- apologies if this has already been answered, I just couldn't find it in this thread -- say I want to support a website: is it possible to add the domains it contains to the white list, i.e. allow *site ŋ* to display ads? It is my understanding that `hostsblock-urlcheck` isn't the right tool for that. Quoted from hostsblock's description:
Hostsblock also includes hostsblock-urlcheck, a command-line utility that allows you to block and unblock certain websites and any other domains contained in that website, in the event that the included blocklists don't block enough or block too much on a specific site.
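What I picture is something like the following in /etc/hostsblock/white.list; I don't know the exact matching rules, so both the comment line and the entries are only a guess:

# domains used by *site ŋ* that I want left unblocked
ads.example.com
adserver.example.net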
EDIT
hostsblock does not populate /etc/hosts on a netbook where I used the simple config (with kwakd and no dnsmasq). I wonder what I forgot:
/etc/hostsblock/rc.conf
# /usr/bin/hostsblock
# ls -l /etc/hosts*
-rw-r--r-- 1 root root 431 27 janv. 15:56 /etc/hosts
-rw-r--r-- 1 root root 653 17 févr. 2013 /etc/hosts.allow
-rw-r--r-- 1 root root 281 1 mars 2012 /etc/hosts.deny
/etc/hostsblock:
total 20K
-rw-r--r-- 1 root root 18 11 avril 2013 black.list
-rw-r--r-- 1 root root 431 27 janv. 15:58 hosts.head
-rw-r--r-- 1 root root 7,2K 27 janv. 16:32 rc.conf
-rw-r--r-- 1 root root 235 11 avril 2013 white.list
PS: glad both the baby and his dad recovered from a pain-in-the-*ss disease.
Last edited by kozaki (2015-01-27 15:57:13)
Seeded last month: Arch 50 gig, derivatives 1 gig
Desktop @3.3GHz 8 gig RAM, linux-ck
laptop #1 Atom 2 gig RAM, Arch linux stock i686 (6H w/ 6yrs old battery ) #2: ARM Tegra K1, 4 gig RAM, ChrOS
Atom Z520 2 gig RAM, OMV (Debian 7) kernel 3.16 bpo on SDHC | PGP Key: 0xFF0157D9
Offline
Links to AUR pages are broken (Error 404 - file not found ) and searching in packages for 'hostblock' returns 0 results. Has hostblock been removed from Arch repositories? Or renamed?
bing different
Offline
hostsblock...
Offline
Ahh.. Thanks.
(But the AUR link in the first post is broken nonetheless.)
bing different
Offline
Fixed.
Offline
Fixed.
Not for me. I just tried the link toward the end of the OP and still get the 404 error. But when I search on the AUR site using hostsblock, the package is found.
But I didn't land here because of that. I ran across this project because I just finished making my own crude script to download and format a hosts file for ad blocking. I'm sure it's not nearly as effective as this one. But looking over this thread it seems to me like this project is going moribund, true? Just wondering because it's the type of utility that needs to be maintained--witness a couple of users who recently discovered that one of the host-file sites is down, which actually resulted in the program not running. True, it's an easy fix to simply comment out that URL. But it is an indication that maintenance on this project is lagging.
So, what's the status? Is this a short-term disruption of the project, or is this on its way to being abandoned? If the latter, hopefully someone will fork it or take over development.
Offline
jasonwryan wrote:Fixed.
Not for me. I just tried the link toward the end of the OP and still get the 404 error. But when I search on the AUR site using hostsblock, the package is found.
Correction. The link that gives me the 404 error is not the one for the hostsblock package on AUR. Rather, it's the link associated with the comment "Package this up for inclusion in the AUR, or even merge with hosts_update (https://aur.archlinux.org/packages.php?ID=44930)" in that post.
Offline
It's not moribund, although its creator sometimes feels like it IRL. Other than a little fix-up on the legacy version I pushed out about a month ago, I haven't had much time to put into this. However, I have plans to pick it back up around the end of March.
Check out hostsblock for system-wide ad- and malware-blocking.
Offline
Greetings.
I'm using the .deb for hostsblock on Linux Mint 17.1 XFCE; I've been using it for many years now. It's usually the first thing I install after a fresh build!
I have a few suggestions for improvements. I don't know if these are new issues, as I'm not a Git user, and not a programmer. If these have already been covered, please forgive me.
1. The current script fails whenever *any* download fails, instead of processing the successful downloads. The patch I use to "fix" this on my boxes:
--- hostsblock.old 2015-03-02 11:01:20.113283924 -0500
+++ hostsblock 2015-03-02 11:08:49.681278147 -0500
@@ -83,8 +83,9 @@
printf "no changes"
fi
else
- printf "FAILED\nScript exiting @ `date +'%x %T'`"
- exit 1
+ # printf "FAILED\nScript exiting @ `date +'%x %T'`"
+ printf "FAILED"
+ # exit 1
fi
done
2. The .deb package has p7zip as a dependency; on at least Ubuntu/Mint, p7zip includes 7zr only. p7zip-full provides 7za.
if which 7za &>/dev/null; then
    zip7="1"
else
    echo "Dearchiver 7za not found. URLs which use this format will be skipped."
    zip7="0"
fi
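If 7zr is considered an acceptable fallback, a check along these lines might work. This is only a sketch (the zip7cmd variable is mine; the script itself just sets a zip7 flag), and I haven't tested it against the rest of the script:

if which 7za &>/dev/null; then
    zip7cmd="7za"
elif which 7zr &>/dev/null; then
    zip7cmd="7zr"
else
    echo "No 7z dearchiver found. URLs which use this format will be skipped."
    zip7cmd=""
fi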
3. I also usually move the link from /etc/cron.weekly to /etc/cron.daily, but I appreciate why this isn't the default.
Cheers
Darren
Offline
@gaenserich nice to hear.
Bump on a previous question.
Say one wants to support a website: is it possible to add the domains it contains to the white list, i.e. allow *site ŋ* to display ads? I believe '/etc/hostsblock/white.list' would keep those sites unblocked on every connection.
Seeded last month: Arch 50 gig, derivatives 1 gig
Desktop @3.3GHz 8 gig RAM, linux-ck
laptop #1 Atom 2 gig RAM, Arch linux stock i686 (6H w/ 6yrs old battery ) #2: ARM Tegra K1, 4 gig RAM, ChrOS
Atom Z520 2 gig RAM, OMV (Debian 7) kernel 3.16 bpo on SDHC | PGP Key: 0xFF0157D9
Offline
the new version does not update my hosts file.
it gives warnings like
[WARN] FAILED to unzip hostsfile.org.Downloads.BadHosts.unx.zip.
[/var/cache/hostsblock/hostsfile.mine.nu.Hosts.zip]
for ALL files in /var/cache/hostsblock.
removing the whole /var/cache/hostsblock directory helps; i can then successfully run hostsblock once.
upon rerunning it fails with the same errors, leaving /etc/hosts unchanged (and also creating /etc/hosts.old that is actually the same as /etc/hostsblock/hostsblock.head).
if i manually remove /etc/hosts, but don't empty the cache, the new hosts file contains only /etc/hostsblock/hostsblock.head's contents.
workaround: always delete /var/cache/hostsblock after running hostsblock.
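i.e. roughly this, until there's a proper fix:

sudo hostsblock && sudo rm -rf /var/cache/hostsblock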
here's the complete output of a failed run:
$ sudo hostsblock
[INFO] Auto-detecting dnscaching settings...
grep: /etc/dnsmasq.conf: No such file or directory
[INFO] DNSMASQ does NOT have the correct 'addn-hosts' entry.
grep: /etc/dnsmasq.conf: No such file or directory
[INFO] DNSMASQ does NOT have the correct 'listen-address' entry.
[INFO] DNSMASQ incorrectly configured.
[INFO] Checking blocklists for updates...
[INFO] Refreshed blocklist http://winhelp2002.mvps.org/hosts.zip.
[NOTE] CHANGES FOUND for blocklist http://winhelp2002.mvps.org/hosts.zip.
[INFO] Refreshed blocklist http://pgl.yoyo.org/as/serverlist.php?hostformat=hosts&mimetype=plaintext.
[NOTE] CHANGES FOUND for blocklist http://pgl.yoyo.org/as/serverlist.php?hostformat=hosts&mimetype=plaintext.
[INFO] Refreshed blocklist http://hosts-file.net/download/hosts.zip.
[NOTE] CHANGES FOUND for blocklist http://hosts-file.net/download/hosts.zip.
[INFO] Refreshed blocklist http://www.malwaredomainlist.com/hostslist/hosts.txt.
[NOTE] CHANGES FOUND for blocklist http://www.malwaredomainlist.com/hostslist/hosts.txt.
[INFO] Refreshed blocklist http://hosts-file.net/ad_servers.asp.
[NOTE] CHANGES FOUND for blocklist http://hosts-file.net/ad_servers.asp.
[INFO] Refreshed blocklist http://hosts-file.net/hphosts-partial.asp.
[NOTE] CHANGES FOUND for blocklist http://hosts-file.net/hphosts-partial.asp.
[INFO] Refreshed blocklist http://hostsfile.org/Downloads/BadHosts.unx.zip.
[NOTE] CHANGES FOUND for blocklist http://hostsfile.org/Downloads/BadHosts.unx.zip.
[INFO] Refreshed blocklist http://hostsfile.mine.nu/Hosts.zip.
[NOTE] CHANGES FOUND for blocklist http://hostsfile.mine.nu/Hosts.zip.
[INFO] Refreshed blocklist http://someonewhocares.org/hosts/hosts.
[NOTE] CHANGES FOUND for blocklist http://someonewhocares.org/hosts/hosts.
[INFO] Refreshed blocklist http://sysctl.org/cameleon/hosts.
[NOTE] CHANGES FOUND for blocklist http://sysctl.org/cameleon/hosts.
[INFO] Changes found among blocklists. Extracting and preparing cached files to working directory...
[/var/cache/hostsblock/hosts-file.net.download.hosts.zip]
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of /var/cache/hostsblock/hosts-file.net.download.hosts.zip or
/var/cache/hostsblock/hosts-file.net.download.hosts.zip.zip, and cannot find /var/cache/hostsblock/hosts-file.net.download.hosts.zip.ZIP, period.
[WARN] FAILED to unzip hosts-file.net.download.hosts.zip.
[/var/cache/hostsblock/hostsfile.mine.nu.Hosts.zip]
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of /var/cache/hostsblock/hostsfile.mine.nu.Hosts.zip or
/var/cache/hostsblock/hostsfile.mine.nu.Hosts.zip.zip, and cannot find /var/cache/hostsblock/hostsfile.mine.nu.Hosts.zip.ZIP, period.
[WARN] FAILED to unzip hostsfile.mine.nu.Hosts.zip.
[/var/cache/hostsblock/hostsfile.org.Downloads.BadHosts.unx.zip]
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of /var/cache/hostsblock/hostsfile.org.Downloads.BadHosts.unx.zip or
/var/cache/hostsblock/hostsfile.org.Downloads.BadHosts.unx.zip.zip, and cannot find /var/cache/hostsblock/hostsfile.org.Downloads.BadHosts.unx.zip.ZIP, period.
[WARN] FAILED to unzip hostsfile.org.Downloads.BadHosts.unx.zip.
[/var/cache/hostsblock/winhelp2002.mvps.org.hosts.zip]
End-of-central-directory signature not found. Either this file is not
a zipfile, or it constitutes one disk of a multi-part archive. In the
latter case the central directory and zipfile comment will be found on
the last disk(s) of this archive.
unzip: cannot find zipfile directory in one of /var/cache/hostsblock/winhelp2002.mvps.org.hosts.zip or
/var/cache/hostsblock/winhelp2002.mvps.org.hosts.zip.zip, and cannot find /var/cache/hostsblock/winhelp2002.mvps.org.hosts.zip.ZIP, period.
[WARN] FAILED to unzip winhelp2002.mvps.org.hosts.zip.
[INFO] Recycling old /etc/hosts into new version...
[INFO] Recycled old /etc/hosts into new version.
/usr/lib/hostsblock-common.sh: line 66: 0: command not found
[WARN] FAILED to compress /etc/hosts with 0.
[INFO] Replaced existing /etc/hosts with .
[INFO] Compiling block entries into /etc/hosts...
[INFO] Compiled block entries into /etc/hosts.
[INFO] Appending blacklisted entries to /etc/hosts...
[INFO] Appended blacklisted entries to /etc/hosts.
[INFO] /etc/hosts: 818890 urls redirected to 0.0.0.0.
[INFO] /etc/hosts: 1 urls redirected to 127.0.0.1 arch.
[INFO] /etc/hosts: 1 urls redirected to 127.0.0.1 localhost.localdomain localhost.
[INFO] Executing postprocessing...
[INFO] Postprocessing completed.
[INFO] Restarting auto...
[WARN] FAILED to restart auto.
[NOTE] Cleaned up /dev/shm/hostsblock.
[INFO] DONE.
edit: unzip, p7zip and gzip are installed.
ps: it would be really useful to get proper exit codes or at least a clear statement in hostsblock output, like "Done. Success", or "Done with errors", or "failed"... right now, there's no way of knowing 100% if my hosts file has been updated properly, esp. when running it from a script.
Last edited by ondoho (2015-04-26 11:51:12)
Offline
When I run urlcheck, I am seeing this error:
grep: /etc/dnsmasq.conf: No such file or directory
grep: /etc/dnsmasq.conf: No such file or directory
[FATAL] Checking to see if url is blocked or unblocked...
I am not using dnsmasq and I have the following setup in /etc/hostsblock/hostsblock.conf:
hostsfile="/etc/hosts" # DEFAULT. If not using a dns caching daemon
postprocess(){ #
/bin/true # DEFAULT. If not using a dns caching daemon
} #
hostshead="/etc/hostsblock/hosts.head" # If not using dns caching.
Offline
Hello gaenserich
cURL 7.42 has changed behavior and is clobbering the cache in /var/cache/hostsblock with zero-size files.
I have fixed it with this code:
# DOWNLOAD BLOCKLISTS
_changed=0
_notify 3 "Checking blocklists for updates..."
for _url in ${blocklists[*]}; do
    _outfile=$(echo $_url | sed "s|http:\/\/||g" | tr '/%&+?=' '.')
    if [ -f "$cachedir"/"$_outfile".url ]; then
        _notify 4 "Url file for $cachedir/$_outfile present."
    else
        _notify 4 "Url file for $cachedir/$_outfile not present. Creating it..."
        echo "$_url" > "$cachedir"/"$_outfile".url
    fi
    if [ -f "$cachedir"/"$_outfile" ]; then
        _notify 4 "Cache file $cachedir/$_outfile for blocklist $_url exists. Noting its modification time."
        _old_ls=$(ls -l "$cachedir"/"$_outfile")
    else
        _notify 4 "Cache file $cachedir/$_outfile for blocklist $_url not found. It will be downloaded."
    fi
    _notify 4 "Checking and, if needed, downloading blocklist $_url to $cachedir/$_outfile"
    if curl $_v_curl --compressed --connect-timeout $connect_timeout --retry $retry -z "$cachedir"/"$_outfile" "$_url" -o "$cachedir"/"$_outfile".new; then
        # Only treat the download as a refresh if the new file is non-empty
        # (curl 7.42.0's broken -z handling can leave a zero-length file).
        if [ "`ls -l $cachedir/$_outfile.new | cut -d' ' -f 5`" != "0" ]; then
            _notify 3 "Refreshed blocklist $_url."
            _new_ls=$(ls -l "$cachedir"/"$_outfile".new)
            if [ "$_old_ls" != "$_new_ls" ]; then
                _changed=1
                _notify 2 "CHANGES FOUND for blocklist $_url."
                mv "$cachedir"/"$_outfile".new "$cachedir"/"$_outfile"
            else
                _notify 4 "No changes for blocklist $_url."
            fi
        else
            _notify 4 "No changes for blocklist $_url."
            rm "$cachedir"/"$_outfile".new
        fi
    else
        _notify 1 "FAILED to refresh/download blocklist $_url."
    fi
done
Hasta la vista...
Offline
BUG FIXED
"Extra zero-length file created after 7.42.0 appears to have broken -z option" #237: https://github.com/bagder/curl/issues/237
Fixed in 7.42.1 (April 29 2015): http://curl.haxx.se/changes.html#7_42_1
Offline
I'm trying to use hostsblock on a system with a static IP address. It works fine with no DNS cache, but when I try to make it work with dnsmasq it complains that no DHCP client is found. The logic in the script seems to assume that a DHCP client must be running if we're using a DNS cacher. Is there a reason for this assumption, or have I overlooked something in the configuration?
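For reference, this is roughly how I have things set up by hand, which is why I expected the DHCP-client check not to matter; the addn-hosts path is just whatever hostsfile= points to in my hostsblock config, so yours may differ:

# /etc/dnsmasq.conf
listen-address=127.0.0.1
addn-hosts=/etc/hosts.block

# /etc/resolv.conf (static, maintained by hand)
nameserver 127.0.0.1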
Offline