Re: [squid-users] what are the Pros and cons filtering urls using squid.conf?

From: Eliezer Croitoru <eliezer_at_ngtech.co.il>
Date: Sun, 09 Jun 2013 16:06:54 +0300

On 6/9/2013 3:52 PM, Marcus Kool wrote:
> I do not understand the performance figure. Can you give more details ?
>
> Best regards,
> Marcus
Yes indeed.
The performance of an ICAP service is on another level compared to a helper,
because it has concurrency built in: one request does not block another, and
many requests are handled at the same time.
My small ICAP service proved able to HANDLE about 8k requests per second
(filtering and blocking are another story) on an Intel Atom CPU, which is
supposed to take a much lower load.
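
As a rough sketch, attaching such an ICAP service to Squid only takes a few
squid.conf lines. The service name, host, port and path below are just
examples, not my actual setup:

  # hypothetical example: a local REQMOD service on the default ICAP port
  icap_enable on
  icap_service filter_req reqmod_precache bypass=off icap://127.0.0.1:1344/request
  # bypass=off means fail closed: if the service is down, requests are denied
  adaptation_access filter_req allow all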

The above proved that the Intel Atom is a capable system, and also that Squid
can handle a lot of requests per second on a very slow CPU. It follows that on
a faster CPU with SMP capabilities Squid can take a much higher load than
people usually state.
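
To give an idea, on a multi-core box the SMP mode is a small squid.conf change
(Squid 3.2 or later; the numbers here are only an example and should match the
actual machine):

  # run one worker per core, e.g. on a 4-core CPU
  workers 4
  # optionally pin each worker to its own core
  cpu_affinity_map process_numbers=1,2,3,4 cores=1,2,3,4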

ufdbGuard is a nice product, but while it reloads its database the filter is
not real-time. That cannot work for a strict system that must be secured
against malicious software, although it can be OK for more failure-tolerant
users.
The basic rule for a mental institution, for example, or for the army or any
other highly secured organization, must be "first block all" and only then
allow, even while the DB is being reloaded.
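
In squid.conf terms, a sketch of that "first block all" policy is a whitelist
with a final deny. The network range and the domain file are only
placeholders; the point is that the last http_access rule denies:

  # placeholders: adjust the network and the domain file to your site
  acl localnet src 192.168.0.0/16
  acl allowed_sites dstdomain "/etc/squid/allowed_domains.txt"
  # allow only the whitelisted domains, deny everything else by default
  http_access allow localnet allowed_sites
  http_access deny all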

When we are talking about kids, you don't want them to see pictures of naked
girls even by accident.

When you can handle concurrent requests, you don't need to think about the
load per second but about the concurrent load on the service.
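
For comparison, a classic redirector helper can also be given concurrency
channels in squid.conf, but each helper process is still a separate child that
you have to size by hand. The helper path and the numbers below are
illustrative only:

  # hypothetical helper; concurrency=N lets Squid pipeline N requests per child
  url_rewrite_program /usr/local/bin/urlfilter
  url_rewrite_children 10 startup=2 idle=1 concurrency=50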

I will write more later.
Ask me anything.

Eliezer