Using the /etc/hosts file to block malicious sites at the operating-system level is an effective way of ensuring that none of your applications will ever access those sites, and it has the advantage of removing the need for a separate blocking plugin for every browser you might possibly use. But maintaining the /etc/hosts file yourself involves work, and this is where Steven Black's hosts repository comes in handy.
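For context, this kind of blocking works by mapping unwanted hostnames to a non-routable address, so any lookup for them resolves to nowhere. The domains below are placeholders for illustration, not entries from any real blocklist:

```
# Example /etc/hosts entries (example domains only)
0.0.0.0 ads.example.com
0.0.0.0 tracker.example.net
```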
This repository consolidates several reputable hosts files and merges them into a unified hosts file with duplicates removed. It also provides a number of variants, tailored to the categories of sites you need to block.
Using it is simple. Clone the repository, update the myhosts file with any custom host records you may have, and add any domains you don't want blocked to the whitelist file. Then build your hosts file with the updateHostsFile.py script.
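As a rough sketch of those steps (assuming git and Python are installed; the -a and -r switches are the ones used in the update script later in this post):

```shell
# Clone Steven Black's consolidated hosts repository
git clone https://github.com/StevenBlack/hosts.git
cd hosts

# Add any custom records to myhosts, and any exemptions to whitelist, then:
# -a runs without prompting, -r replaces your existing hosts file (needs root)
python updateHostsFile.py -a -r
```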
There are a number of switches you can use (all of which are documented in the readme file) which allow you to control which types of sites to block and whether you want to automatically replace your existing /etc/hosts file.
This all works very nicely indeed, but I’m lazy. So I knocked together a short script to grab any updates from the repository and rebuild my hosts file:
#!/bin/bash
# Automatically update hosts file

# Change to the correct folder and do a git pull
cd /home/paul/Stuff/hosts
git pull origin master

# And update the hosts file
python updateHostsFile.py -a -r
And put it in /usr/local/bin/hosts, marked as executable.
This means I can use a systemd service and timer to execute this every Saturday afternoon.
[Unit]
Description=Auto-update hosts file

[Service]
Type=oneshot
Environment=DISPLAY=:0
ExecStart=/usr/local/bin/hosts
StandardOutput=journal

[Install]
WantedBy=basic.target
[Unit]
Description=Auto Update Hosts File

[Timer]
OnCalendar=Sat 14:00:00
Persistent=true
Unit=hosts.service

[Install]
WantedBy=basic.target
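One step not shown above is activating the timer. Assuming the two unit files are saved as hosts.service and hosts.timer under /etc/systemd/system/, that would look something like:

```shell
# Reload systemd so it picks up the new unit files
sudo systemctl daemon-reload

# Enable and start the timer (the service is triggered by the timer, so it
# does not need to be enabled itself)
sudo systemctl enable --now hosts.timer

# Check when the timer will next fire
systemctl list-timers hosts.timer
```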
And, so far, it all appears to be working very nicely indeed.