Re: [squid-users] elapsed time and keepalives (reverse proxy)

From: Eric D. Hendrickson <edh@dont-contact.us>
Date: 23 Apr 2002 10:44:27 -0500

Right, yes, I understand that. I am looking for a way to measure the
elapsed time to load a given object (URL), *including* any sub-objects
(other URLs) within that object.

I recognize that, at the protocol level, they are separate, individual
requests, but I'd like to be able to group them into a "page" and
measure the total elapsed time for that "page" and all of its
"sub-objects" to load, not just for each request individually. Of
course, the spans are likely to overlap quite a bit, since most
browsers issue multiple simultaneous requests...

e.g.

page.html is requested at 1019575895.719 seconds and completes 13ms
later. There are two images within that page (separate HTTP requests),
each of which begins at 1019575895.720 (1ms after the first request
began). One takes 17ms to load and the other takes 18ms, so the slower
image finishes 1ms + 18ms = 19ms after the first request began (after
page.html itself has already completed). Therefore the total elapsed
time for this "page" is 19ms.

Is there anything within squid or another way to do this kind of
measurement?

Thanks, Eric

> A URL identifies a single object, not a whole page.
>
> HTTP does not have a notion of "page". HTTP does not care at all how
> content is linked together; to HTTP it is the same thing whether a
> link is something you click on, an inlined image, or a stylesheet.
> All are simply other URLs linked from the first by its content. HTTP
> does not know, while processing the first request, that it links to
> other URLs; only when the requests for those other URLs are seen is
> the linkage revealed to HTTP, by the Referer header.