[squid-users] Squid questions

From: Kishore Venkat <kishore.k.venkat@gmail.com>
Date: Sun, 21 Dec 2008 15:36:49 -0800

Hello everyone,

I have set up Squid 3.0 STABLE 9 for testing purposes and I have the
following questions:

1. We are trying to take steps to prevent DoS attacks and are
considering using Squid to cache pages in order to reduce the load on
the origin servers. The particular scenario we are targeting is folks
posting URLs containing member-specific information in the query
string (such as an email address, member ID, or coupon code) on
social networks to take advantage of promotional offerings (such as
coupons), which sends a sudden burst of traffic to our site - these
would be either .jsp or .asp URLs. I have tried using the following
line in our squid.conf:

refresh_pattern -i \.asp$ 10080 90% 999999 ignore-no-cache override-expire ignore-private

and from my testing it appears to cache them only if there is no "?"
in the URL (even if you do NOT pass any parameters, having the "?" in
the URL alone is enough to prevent caching - even if the .asp
contains only HTML). From my understanding, there is no way to cache
.asp/.jsp pages with query-string parameters - could you please
confirm this?
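
If I understand correctly, this may come from the default
configuration shipped with Squid 3.0, which refuses to cache anything
whose URL contains a query string (I am quoting these lines from
memory, so please correct me if they are not exact):

acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY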

I was wondering if there is a way to cache these dynamic .asp pages.
We do not want all the .asp pages to go through the Squid cache, as a
lot of them depend on data in the database, and if the values in the
db change, the content served must change as well. So we could place
the pages that need to go through Squid's cache in a folder called
"squid" and modify the above squid.conf line so that only the .asp
pages under the "squid" folder go through the cache - a sketch of
what I have in mind follows.
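
Something along these lines is what I mean (untested, and the ACL
name "squid_folder" is just a placeholder of mine):

# cache only the .asp pages that live under /squid/
acl squid_folder urlpath_regex -i ^/squid/.+\.asp
cache allow squid_folder
cache deny QUERY
refresh_pattern -i ^/squid/ 10080 90% 999999 ignore-no-cache override-expire ignore-private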

If there are other ways of preventing DOS attacks for the above
mentioned scenario, please let me know.

2. The one concern that I have is the Squid server itself being prone
to denial of service due to sudden bursts in traffic. Can someone
share their experience with this based on the implementation on their
own web site?

3. When using squidclient for testing purposes, if I have a very long
URL (something that is 205 characters long, for example), it appears
that the request to the origin servers does NOT contain the entire
URL (with all the parameters). The squidclient command (including the
-h and -p options and the 205-character URL) is 261 characters long.
I saw a bug to do with the length of the hostname, but I believe that
was in an earlier version of Squid and NOT in Squid 3.0 STABLE 9. Is
there a way to test really long URLs?
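
For reference, the command I am running looks roughly like this (the
host, port, and URL below are made-up placeholders, not the real
ones):

squidclient -h 127.0.0.1 -p 3128 "http://www.example.com/squid/offer.asp?memberid=12345&coupon=SPRING08&email=someone@example.com"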

4. If the disk space that we allocate is completely used, what
algorithm does Squid use to decide which objects to evict to make
room for new pages - LRU, for example? And is this configurable?
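
From skimming the documentation, I believe there is a
cache_replacement_policy directive (I have not tried it myself, and I
understand the heap policies need Squid built with
--enable-removal-policies=heap), e.g.:

# default is lru; heap GDSF, heap LFUDA and heap LRU are alternatives
cache_replacement_policy heap LFUDA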

Thanks much for all your help.

Regards,
Kishore