Squid site restrictions
I needed a way to block some websites permanently and others outside of certain hours. After looking at some inline solutions, I realised that I could easily do what was needed with squid alone.
I created the following ACLs in squid's config file:
acl blockedsites url_regex -i "/etc/squid/blocked.txt"
acl bannedsites url_regex -i "/etc/squid/banned.txt"
# MTWHF = Monday to Friday (squid uses H for Thursday and A for Saturday)
acl lunchtime time MTWHF 12:15-13:45
Then I can apply these ACLs near the end of my squid ACL rules:
http_access allow managers
http_access deny blockedsites !lunchtime
http_access deny bannedsites
http_access allow domainusers
http_access deny all
I use squid authentication here: the managers ACL refers to special users who have no restrictions. Placing this rule before the restrictive ones means it is matched first, so managers are allowed through immediately. The domainusers ACL refers to any authenticated user; anyone who fails to authenticate falls through to the final deny all.
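The authentication setup itself isn't shown above; a minimal sketch, assuming basic NCSA-style authentication (the helper path and user-list filenames are illustrative, not from my actual config), might look like this:

auth_param basic program /usr/lib/squid/basic_ncsa_auth /etc/squid/passwd
auth_param basic realm proxy

# managers: only the named users listed in the file, one per line
acl managers proxy_auth "/etc/squid/managers.txt"
# domainusers: any user who authenticates successfully
acl domainusers proxy_auth REQUIRED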
So access is denied for both lists, but the blockedsites rule carries the exception !lunchtime. This means "deny access while it is not lunchtime": ACLs on the same http_access line are logically ANDed.
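To make the AND behaviour concrete (the alternative lines here are illustrative, not part of my config):

# Denies only when BOTH conditions hold:
# the URL is in blocked.txt AND the time is outside 12:15-13:45
http_access deny blockedsites !lunchtime

# Split across two lines, the conditions would apply independently:
# http_access deny blockedsites      <- blocked at all times
# http_access deny !lunchtime        <- everything blocked outside lunchtime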
The entries in the /etc/squid/blocked.txt and /etc/squid/banned.txt files are simple:
ebay
planetfootball.com
bigbrother.channel4.com
These are url_regex patterns, and because I keep them simple like this, an occurrence of, for example, ebay anywhere in the URL will match and therefore be denied.
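A quick way to see how broadly a pattern matches is to test it with grep, since these are ordinary case-insensitive regexes (the URLs below are just examples):

$ echo "http://www.ebay.co.uk/itm/12345" | grep -i -c "ebay"
1
$ echo "http://example.com/homebay/page" | grep -i -c "ebay"
1

The second match shows the trade-off of keeping the patterns simple: any URL containing the string is caught, intended or not.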
When a new entry is added to either file, it's a simple matter of "/etc/init.d/squid reload" to make squid pick up the changes.
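For example (the new entry is just a placeholder; "squid -k reconfigure" does the same job if you aren't using the init script):

$ echo "example-timewaster.com" >> /etc/squid/blocked.txt
$ /etc/init.d/squid reload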