RE: [squid-users] squid: Memory utilization higher than expected since moving from 3.3 to 3.4 and Vary: working

From: Martin Sperl <Martin.Sperl_at_amdocs.com>
Date: Thu, 17 Jul 2014 11:34:16 +0000

> So my guess would be that for a specific message there would be 2 instances
> of the same body in memory.
> And only one would be referenced.
>
> And in the case above this would obviously mean 20K of memory "wasted".
>
> I might play with that idea and create a core-dumps of squid after loading a
> very specific html page...

OK - I have to admit this idea was wrong!
I have confirmed that the HTML payload data can only be found _once_ in the core dump file.

What I did find, though, is that the URL shows up 4 times in the coredump.
But this may be expected behavior...

Note that the coredump was taken about an hour after the request was made,
so any temporary buffers should have been reused by then and those strings overwritten...
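
For reference, roughly this kind of scan can be used to check how often a given byte pattern appears in a core dump - a minimal sketch only; the core file name and the search string are placeholders, not the actual values I used:

#!/usr/bin/env python3
# Sketch: count how often a byte pattern (e.g. the HTML payload or the URL)
# shows up in a core dump, streaming the file so large cores still fit.
import sys

def count_occurrences(core_path: str, needle: bytes, chunk_size: int = 1 << 20) -> int:
    """Count non-overlapping matches of 'needle' in the core file."""
    count = 0
    overlap = len(needle) - 1
    tail = b""
    with open(core_path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            buf = tail + chunk
            count += buf.count(needle)
            # keep the last len(needle)-1 bytes so matches spanning a
            # chunk boundary are not missed
            tail = buf[-overlap:] if overlap else b""
    return count

if __name__ == "__main__":
    core, needle = sys.argv[1], sys.argv[2].encode()
    print(count_occurrences(core, needle))
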

BTW: Stats show the following histogram of "^http://" strings and how often they occur in memory:
Occurrences_of_string count_of_distinct_strings
1 861
2 771297
3 86
4 25626
5 31
6 2378
7 5
8 1205
9 1
10 591
11 1
12 306
13 3
14 217
15 1
16 100
17 2
18 84
20 74
21 3
22 74
24 45
25 4
26 29
28 19
30 5
32 1
34 6
36 6
38 7
40 2
41 1
42 1
44 1
48 1
50 1
54 1
56 1
71 1
114 1
15555 1

That is a total of 803080 distinct "http://..." strings.
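
For reference, a histogram like the one above can be produced along these lines - just a sketch under simplifying assumptions (the URL regex and reading the whole core into memory are assumptions, not the exact method I used):

#!/usr/bin/env python3
# Sketch: extract printable "http://..." strings from a core dump, count how
# often each distinct URL occurs, then tally how many distinct URLs share
# each occurrence count.
import re
import sys
from collections import Counter

URL_RE = re.compile(rb"http://[\x21-\x7e]+")  # printable, non-space bytes

def url_histogram(core_path: str) -> Counter:
    data = open(core_path, "rb").read()      # big cores may need a streaming scan
    per_url = Counter(URL_RE.findall(data))  # occurrences per distinct URL
    # map: occurrence count -> number of distinct URLs with that count
    return Counter(per_url.values())

if __name__ == "__main__":
    hist = url_histogram(sys.argv[1])
    for occurrences in sorted(hist):
        print(occurrences, hist[occurrences])
    print("total distinct http:// strings:", sum(hist.values()))
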

The strange thing is the pattern where strings with even occurrence counts are much more common than those with odd ones.
I assume it is related to the Vary header handling (as explained), but then I would expect that the vary information would only be used once.

Martin

