

Feature: Networking

Speed up your Internet access using Squid's refresh patterns

By Solomon Asare on November 20, 2008 (9:00:00 AM)


Bandwidth limitation is still a problem for many people who connect to the Internet. You can improve your effective bandwidth by installing the Squid caching proxy server on your network and tuning its configuration parameters to increase your byte hit rate, which can give you roughly 30-60% more bandwidth.

Squid can be fine-tuned to satisfy a host of needs. The stable version has at least 249 configurable parameters, and the heavily commented configuration file, usually found at /etc/squid.conf, is more than 4,600 lines long. This can intimidate even experienced administrators. All of the settings discussed below are made in this file.

You need a cache big enough that it will not fill up in less than a week, and preferably one that takes more than a month to fill. The actual size depends on the volume of traffic on your network. The bigger your storage, the greater the probability that an object someone requests will already be in your cache.
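Assuming a dedicated cache directory at /var/spool/squid (the path and sizes here are illustrative, not required), a 100GB cache could be declared like this:

cache_dir ufs /var/spool/squid 102400 16 256

The first number is the cache size in megabytes; the 16 and 256 are the number of first- and second-level subdirectories Squid creates to spread cached objects across the filesystem.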

In addition to the memory required for your operating system and Squid to run, you will need about 1% of your cache size in RAM to hold the index of your cache in memory. That is, for a cache of 100GB of disk space, you will need about 1GB of RAM, in addition to about 100MB for the OS and Squid.

The default maximum size of objects that may be cached by Squid is 4MB. Nowadays, this is too low for the media-rich Internet. If your clients download a lot of video and software packages, you can increase this to a figure more representative of the maximum size of files that your clients normally download -- say 100MB.
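To raise the 4MB default to the 100MB suggested here, the relevant directive is maximum_object_size; for instance:

maximum_object_size 100 MB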

Refresh patterns determine what is saved and served from the cache. Ideally, you would want your Squid to follow the directions of the Web servers serving the content to determine what is cacheable and for how long. These directions are set as HTTP headers that are processed and understood by Squid. Unfortunately, the directions given by most servers are the Web servers' defaults, and do not produce significant bandwidth savings.

Refresh patterns are of the format:

refresh_pattern [-i] regex min percent max [options]

where min and max are time values in minutes and percent is a percentage figure. The options are:

  • override-expire -- ignores the expire header from the Web server.
  • override-lastmod -- ignores the last modified date header from the Web server.
  • reload-into-ims -- a reload request from a client is converted into an If-Modified-Since request.
  • ignore-reload -- a client's no-cache or "reload from origin server" directive is ignored. The request can therefore be satisfied from the cache if available.
  • ignore-no-cache -- a no-cache directive from the Web server which makes an object non-cacheable is ignored.
  • ignore-no-store -- a no-store directive from the Web server which makes an object non-cacheable is ignored.
  • ignore-private -- a private directive from the Web server which makes an object non-cacheable is ignored.
  • ignore-auth -- by default, objects requiring authorisation are non-cacheable; this option overrides that limitation.
  • refresh-ims -- a refresh request from a client is converted into an If-Modified-Since request.

Consult your configuration file to see which of these options are available in your version of Squid.

Refresh patterns take effect only if there is no expire header from the origin server, or if your refresh pattern has an override-expire option. Example:

refresh_pattern -i \.gif$ 1440 20% 10080

This says:

  • If there is no expire header for an object whose name ends in .gif or .GIF (that is, an image file), then:
  • if the age (that is, how long the object has been on your cache server) is less than 1,440 minutes, consider it fresh, serve it, and stop;
  • else if the age is greater than 10,080 minutes, consider it stale, fetch a fresh copy from the origin server, and stop;
  • else the age is between the min and max values, so use the lm-factor to determine freshness. The lm-factor is the age on your cache server divided by the time since the object was created or modified on the origin server, expressed as a percentage. So if the object was created 10,000 minutes ago on the origin server and has been on your cache server for 1,800 minutes (the age), the lm-factor is 1,800/10,000 = 18%.
  • If the lm-factor is less than the percent in the refresh pattern (20%), the object is considered fresh; serve it and stop;
  • else the object is stale; fetch a fresh copy from the origin server.
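The freshness decision described above can be sketched in a few lines of Python. This is a simplified illustration only; real Squid also honours Expires and Cache-Control headers, and the function and parameter names here are hypothetical, not part of Squid.

```python
def is_fresh(age, lm_age, min_m, percent, max_m):
    """Simplified refresh_pattern freshness check.

    age:     minutes the object has been on the cache server
    lm_age:  minutes since the object was created/modified on the origin
    min_m, max_m: refresh_pattern min and max, in minutes
    percent: refresh_pattern percent (0-100)
    """
    if age < min_m:
        return True          # younger than min: always fresh
    if age > max_m:
        return False         # older than max: always stale
    # in between: compare lm-factor (cache age as a percentage of
    # the object's age on the origin server) against percent
    lm_factor = 100.0 * age / lm_age
    return lm_factor < percent

# The article's example: object modified 10,000 minutes ago on the
# origin, cached for 1,800 minutes, pattern "1440 20% 10080":
print(is_fresh(1800, 10000, 1440, 20, 10080))  # lm-factor 18% < 20% -> True
```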

For objects that scarcely change under the same file name, such as video, images, sound, executables, and archives, you can modify the refresh pattern to consider them fresh on your Squid for a longer time, increasing the probability of having hits. For example, you could modify our refresh pattern above to:

refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i \.(gif|png|jpg|jpeg|ico)$ 10080 90% 43200 override-expire ignore-no-cache ignore-no-store ignore-private
refresh_pattern -i \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|x-flv)$ 43200 90% 432000 override-expire ignore-no-cache ignore-no-store ignore-private
refresh_pattern -i \.(deb|rpm|exe|zip|tar|tgz|ram|rar|bin|ppt|doc|tiff)$ 10080 90% 43200 override-expire ignore-no-cache ignore-no-store ignore-private
refresh_pattern -i \.index.(html|htm)$ 0 40% 10080
refresh_pattern -i \.(html|htm|css|js)$ 1440 40% 40320
refresh_pattern . 0 40% 40320

Sometimes, for no good reason, at least from our perspective, origin servers, such as youtube.com, do everything to make it difficult or impossible for you to cache content. The options above should help you to overcome some of these limitations.

Refresh patterns are matched against all requests in order from the top until there is a matching rule. The last rule is a catch-all and will match any request that is not satisfied by any of the rules above it. There are normally separate catch-all default rules for other protocols like FTP and gopher at the very top of the list so as to exempt those protocols from the patterns below them.

By default, Squid will not cache dynamic content, which it identifies by matching either "cgi-bin" or "?" in the URL. In older versions of Squid, this behaviour was implemented via the "hierarchy_stoplist" and "cache deny" settings. In recent versions, starting with 3.1, it is implemented via a refresh pattern such as refresh_pattern (/cgi-bin/|\?) 0 0% 0. This lets you add bypass rules for specific sites whose dynamic content can safely be cached. For example, you could set up a refresh pattern such as:

refresh_pattern -i movies.com/.* 10080 90% 43200
refresh_pattern (/cgi-bin/|\?) 0 0% 0

Then, even if content from movies.com is served with "?" in their URL, the content will still be cached if all other conditions are met.

For the older versions of Squid, you will have to define an access control list (ACL) for the content providers you wish to make exceptions for, and use cache allow to exempt them before the cache deny rule. The following example is from the Squid wiki:

# Let the client's favourite video site through
acl youtube dstdomain .youtube.com
cache allow youtube
# Now stop other dynamic stuff being cached
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

Squid makes a lot of DNS requests: one DNS lookup for each HTTP request. Install a caching DNS server on your server and have Squid use it, so as to cut down on your DNS traffic. This how-to may be helpful.
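For instance, with a caching resolver such as dnsmasq listening on the loopback address of the Squid host (an assumption; adjust the address to your setup), you can point Squid at it with the dns_nameservers directive:

dns_nameservers 127.0.0.1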

Sites like Microsoft's windowsupdate.com, from which virtually all Windows PCs update their OS, are among the most bandwidth-intensive sites on some networks. Unfortunately, they are not cacheable, because they serve partial responses (HTTP return code 206), which Squid presently does not cache. Where you have control over the client machines, you can install Microsoft's Update Server to handle caching for windowsupdate. If you cannot use the Update Server, you can use Squid's delay pools, a bandwidth management technique, to limit the portion of bandwidth that windowsupdate consumes during your peak periods. The clients will then have to be online during off-peak periods to complete their updates.

Below, we configure one global delay pool at 64Kbps (8KB/s). Traffic whose destination domain is windowsupdate.com during the peak period of 10:00-16:00 will be limited to 64Kbps.

acl winupdate dstdomain .windowsupdate.com
acl peakperiod time 10:00-16:00
delay_pools 1
delay_class 1 1
# 64 Kbit/s
delay_parameters 1 8000/8000
delay_access 1 allow winupdate peakperiod

After making changes like the ones above, my Squid's byte hit rate increased from about 8% to between 26% and 37%. If you are achieving 33%, a third of all traffic is coming from your cache rather than from slower links across the Internet. For monitoring and log analysis to gauge your Squid's performance, you can use squid3-client and calamaris.
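For example, assuming the squidclient tool is installed on the Squid host and the cache manager interface is enabled, the current hit ratios are reported on the cache manager's info page:

squidclient mgr:info

Look for the "Byte Hit Ratios" lines in the output.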

Solomon Asare is the developer of DelXy, an HTTP compression service.


Comments on Speed up your Internet access using Squid's refresh patterns

Note: Comments are owned by the poster. We are not responsible for their content.

Speed up your Internet access using Squid's refresh patterns

Posted by: Anonymous [ip: 192.168.218.100] on November 20, 2008 10:07 AM
Don't forget about adzapper.sf.net which can help save even more bandwidth by not downloading those pesky adverts.


Speed up your Internet access using Squid's refresh patterns

Posted by: Norberto Bensa on November 21, 2008 02:01 AM
Thanks, but doesn't work with ubuntu 8.04 server. ignore-expire and ignore-no-store are unrecognized options with the squid-2.6.18 provided in hardy :-(


Re: Speed up your Internet access using Squid's refresh patterns

Posted by: Anonymous [ip: 192.168.218.100] on November 22, 2008 07:14 PM
@Norberto:
apt-get remove squid && apt-get install squid3


Doesn't work on CENTOS 5

Posted by: Anonymous [ip: 12.109.32.226] on November 21, 2008 02:26 PM
Reloading Squid I get
refreshAddToList: Unknown option
for each of the lines in your example.
ben


Speed up your Internet access using Squid's refresh patterns

Posted by: Anonymous [ip: 194.225.241.122] on November 22, 2008 06:47 AM
There is no such an option as "ignore-expire", that's "override-expire" which squid.conf suggests not to set. quote:" override-expire enforces min age even if the server sent an explicit expiry time (e.g., with the Expires: header or Cache-Control: max-age). Doing this VIOLATES the HTTP standard. Enabling this feature could make you liable for problems which it causes."
Edit this article.


Re: Speed up your Internet access using Squid's refresh patterns

Posted by: Solomon Asare on November 22, 2008 01:16 PM
Hi,
Regarding "ignore-expire" instead of "override-expire": sorry for the error, and thanks for the correction.
[Modified by: Solomon Asare on November 22, 2008 01:45 PM]


Speed up your Internet access using Squid's refresh patterns

Posted by: Anonymous [ip: 70.120.89.6] on November 22, 2008 08:33 PM
I have been using squid for about 5 years now on my local home net. I think the above is a bit extreme and would recommend this:

refresh_pattern ^ftp: 144000 20% 1008000
refresh_pattern -i \.(gif|png|jpg|jpeg|ico|bmp)$ 260000 90% 260009 override-expire
refresh_pattern -i \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|x-flv|mpg|wma|ogg|wmv|asx|asf)$ 260000 90% 260009 override-expire
refresh_pattern -i \.(deb|rpm|exe|zip|tar|tgz|ram|rar|bin|ppt|doc|tiff|pdf|uxx)$ 260000 90% 260009 override-expire
refresh_pattern -i \.index.(html|htm)$ 1440 90% 40320
refresh_pattern -i \.(html|htm|css|js)$ 1440 90% 40320
refresh_pattern (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4

"ignore-no-cache ignore-no-store ignore-private" I think is a bit too much. If someone took the time to set something to no-cache then I would listen to the website. What I posted will keep media files fresh in the cache for 6 months before refreshing. The only problem that I have run into with my setup is that weather radar sites will end up with radar images in the cache which won't refresh for 6 months... So therefore you have to make some cache deny ACLs and put common weather websites in them, like weather.com, noaa, etc.

So the default conf on any squid below version 2.7 does not cache any dynamic content (urls with a "?" in it). So if you want to cache sites like youtube and google maps remove the deny cache tags in squid.conf
http://wiki.squid-cache.org/ConfigExamples/DynamicContent
http://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube

All I did in squid 2.7 was remove
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY

and add
refresh_pattern (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4

to the bottom and that got it working


This story has been archived. Comments can no longer be posted.



 