Re: [SQU] Interesting Question

From: Robert Collins <robert.collins@dont-contact.us>
Date: Sat, 3 Mar 2001 21:30:37 +1100

No. The point was missed.

Squid today _cannot tell what makes up a page_.

If you want granular control (page foo from site bar, but not all of
bar), then you _cannot use squid ACLs_.

Joe's pointer (squid cache retention times) & mine (wget from a full
access account to make mirrors) will work.
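
For what it's worth, a rough sketch of the wget idea (the proxy address,
paths and URL list below are only placeholders, not from a real setup):

    # run from the full-access account, through the unrestricted proxy
    export http_proxy=http://teacher-proxy.example:3128/

    # fetch each approved page plus the images/css it needs, rewriting
    # links so the copy can be browsed locally
    wget --mirror --page-requisites --convert-links --no-parent \
         --directory-prefix=/var/www/mirror \
         --input-file=/etc/squid/approved-urls.txt

The students' machines then browse the local mirror rather than the live
sites.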

Squid ACLs, Squid redirectors, WILL NOT.

Rob

----- Original Message -----
From: "Devin Teske" <devinteske@hotmail.com>
To: <squid-users@ircache.net>
Sent: Saturday, March 03, 2001 8:44 AM
Subject: Re: [SQU] Interesting Question

> Hmmmmmm, I like that. Simple, elegant, and intriguing. Does squid cache
> whole pages including images, flash, movies, shockwave, cascading style
> sheets, cgi's, and other stuff in the page?
>
> Thanks,
> Devin Teske
>
> Ps. OOPS, forgot to cc to list, dang hotmail.
>
> >Maybe a simple solution is to use 2 squid caches: one which the
> >teacher uses and which has full access to the web, and a second which
> >the students' PCs are directed to. The student cache can only get
> >pages from the teacher's cache (its parent); it cannot resolve misses
> >itself.
> >
> >No database necessary.
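
In squid.conf terms, one reading of that could look roughly like this
(hostnames, addresses and ports below are placeholders, not from
anyone's real config):

    # on the student cache: send everything to the teacher cache,
    # never fetch from the web directly
    cache_peer teacher-proxy.example parent 3128 3130 no-query default
    never_direct allow all

    # on the teacher cache: let the student cache read what is already
    # cached, but not trigger new fetches
    acl studentcache src 192.168.1.2/32
    http_access allow studentcache
    miss_access deny studentcache
    miss_access allow all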
> >
> >jm
> >
> >
> >
> >
> >
> >>Ok, I believe I have a solution for you....
> >>
> >>We have a similar situation here at our school,
> >>although we do not use databases....
> >>
> >>To give a brief description, one of our proxy servers
> >>is dedicated to the lower school, and we have set up
> >>squid so that the users can only go to webpages that
> >>are listed in an ACL "gooddomains"; if they try to
> >>go to another website, they are disallowed....ok??
> >>
> >>I was given the task by our IT technician of finding a
> >>way for teachers to add a webpage link (URL) to the
> >>ACL "gooddomains" so that the lower school could
> >>access that site. (The teachers' side was done by our
> >>IT technician writing an ASP script.) So, I wrote a
> >>bash script, "loop4mail", which loops over incoming
> >>mail, and if a message contains an expression such as:
> >>
> >>add gooddomains
> >>@@@@www.yahoo.com
> >>
> >>that URL would be appended to the bottom of the ACL
> >>"gooddomains", squid would reconfigure itself, and the
> >>site named above could then be viewed.
> >>
> >>Similarly, if you wanted that URL removed so that it
> >>could no longer be accessed, you would put:
> >>
> >>remove gooddomains
> >>%%%%www.yahoo.com
> >>
> >>The reason why @@@@ and %%%% are used for add and
> >>remove respectively is that I realised I was picking
> >>up "www" from the proxy server's own hostname in the
> >>mail, so I had to come up with a prefix; otherwise the
> >>hostname of the proxy server kept being added as an
> >>ACL entry, and obviously the computer could not
> >>resolve it!!
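
A very rough sketch of that kind of add/remove parsing (this is not the
real loop4mail script; the file locations are made up):

    #!/bin/sh
    # ACLFILE: the domain list squid reads; MAILFILE: where the
    # teachers' mails end up -- both paths are placeholders
    ACLFILE=/var/squid/gooddomains
    MAILFILE=/var/mail/squidadmin

    # "@@@@host" lines are additions, "%%%%host" lines are removals;
    # the odd prefixes stop stray hostnames in the mail from matching
    grep '^@@@@' "$MAILFILE" | sed 's/^@@@@//' >> "$ACLFILE"

    for host in $(grep '^%%%%' "$MAILFILE" | sed 's/^%%%%//'); do
        grep -v "^$host\$" "$ACLFILE" > "$ACLFILE.tmp" && \
            mv "$ACLFILE.tmp" "$ACLFILE"
    done

    # make squid re-read its configuration (and the ACL file)
    squid -k reconfigure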
> >>
> >>If you would like a copy of my bash script "loop4mail"
> >>let me know and I would be only too happy.
> >>
> >>Ok, so in squid.conf, you will need to do the
> >>following (I think,....this is only off the top of my
> >>head!!)
> >>
> >>acl gooddomains dstdomain "/var/squid/gooddomains"
> >>
> >>and then:
> >>
> >>http_access allow gooddomains
> >>http_access deny all
> >>
> >>(You might need to ask someone about this... I have
> >>forgotten!!)
> >>
> >>Furthermore, if you have not already done so, you will
> >>have to make sure that sendmail is configured
> >>correctly to get this script to work (that's if you
> >>decide to use it :))
> >>
> >>I believe that solves your problem...???
> >>
> >>Thomas Adam
> >>
> >>
> >>--- Henrik Nordstrom <hno@hem.passagen.se> wrote:
> >>> So go back to think about what the problem really
> >>> is. Almost everything
> >>> can be solved in this world (networking, web,
> >>> proxies, ...), it is only
> >>> a question about finding the correct approach.
> >>>
> >>> --
> >>> Henrik Nordstrom
> >>> Squid hacker
> >>>
> >>>
> >>> Devin Teske wrote:
> >>> >
> >>> > I'm about to start cursing. I've come way too far to stop now.
> >>> > There HAS to be a solution to this. You do know what the end
> >>> > product is, right? Teachers go online, and somehow they add a
> >>> > link to a database. Students go online, and students can only go
> >>> > to those pages and nowhere else. This is the final goal. I will
> >>> > never give up on this.
> >>> >
> >>> > Thanks,
> >>> > Devin Teske
> >>> >
> >>> > >Devin Teske wrote:
> >>> > > >
> >>> > > > I was studying proxy servers and the protocols and something
> >>> > > > came to mind. When the client requests a page from the
> >>> > > > server, it will request a keep-alive connection. After all
> >>> > > > the contents of the page have been loaded it will close the
> >>> > > > connection.
> >>> > >
> >>> > >Many pages can be served using one connection, and all known
> >>> > >browsers utilize more than one connection to download the
> >>> > >objects that make up one page...
> >>> > >
> >>> > >
> >>> > >--
> >>> > >Henrik Nordstrom
> >>> > >Squid hacker
> >>> >
> >>>
> >>>
> >>
> >>
> >>=====
> >>Thomas Adam
> >>Linux Co-ordinator for The Purbeck School
>
>

--
To unsubscribe, see http://www.squid-cache.org/mailing-lists.html
Received on Sat Mar 03 2001 - 03:32:09 MST

This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:58:28 MST