[squid-users] Caching content on private servers

From: WebHead <webhead74@dont-contact.us>
Date: Tue, 9 Jan 2007 14:40:00 -0500

Hi -

I'm trying to figure out whether I'm even going down the right path
here - whether Squid is the right tool for the job.

I'm writing a PHP web app that queries data from one of our databases.
One piece of information returned by the query is the URL of an image
file, such as http://192.168.99.2/images/foo.jpg . The images are
stored on any of three servers, at three separate physical locations.
Each location is connected back to my office via a T1. The web
server, which sits on the DMZ port of my firewall, is at my office.

I was thinking of putting a Squid server in the DMZ - either on its
own box, or on the box the PHP app runs on - to pull those images,
cache them, and serve them to clients accessing the site. The problem
is that none of the servers hosting the images are internet
accessible, nor will they ever be. I also want to minimize traffic
across the T1s.
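
From what I've been able to piece together so far, I gather this would
be what Squid calls an "accelerator" (reverse proxy) setup. Below is my
rough guess at the relevant squid.conf lines. Only 192.168.99.2 is a
real address - the other two IPs and the img1/img2/img3.example.com
hostnames are placeholders I made up - so please tell me if I'm way
off base:

# Listen for client requests as an accelerator, not a forward proxy
http_port 80 accel defaultsite=img1.example.com vhost

# One entry per internal image server, flagged as an origin server
cache_peer 192.168.99.2  parent 80 0 no-query originserver name=site1
cache_peer 192.168.100.2 parent 80 0 no-query originserver name=site2
cache_peer 192.168.101.2 parent 80 0 no-query originserver name=site3

# Pick the origin server based on the hostname the client asked for
acl host1 dstdomain img1.example.com
acl host2 dstdomain img2.example.com
acl host3 dstdomain img3.example.com
cache_peer_access site1 allow host1
cache_peer_access site2 allow host2
cache_peer_access site3 allow host3

# Only answer requests for those hostnames
http_access allow host1
http_access allow host2
http_access allow host3
http_access deny all

If that's roughly right, I assume the PHP app would then have to
rewrite the URLs the database hands back so they point at the Squid
box rather than at the internal IPs.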

Is Squid capable of doing this? If so, can any of you provide some
pointers on how to get started? I've never dealt with Squid before,
and most of the documentation I've found so far has been about
configuring a forward proxy for people browsing the web rather than
this sort of setup.

If there's a better way to do this, I'd be open to hearing about it too.

Many thanks!
Mike