> On Oct 9, 2017, at 4:57 PM, Jim Mellander <jmellan...@lbl.gov> wrote:
> 
> Well, I found pathological behavior with BaseList::remove
> 
> Instrumenting it with a printf of num_entries and i after the for loop, 
> running it against a test pcap, and then summarizing with awk gives:
> 
> Count, num_entries, i
> 
> 1 3583 3536
> 1 3584 3537
> 1 3623 3542
> 1 3624 3543
> 1 3628 3620
> ...
> 5 5379 5376
> 5 5380 5377
> 7 5499 5496
> 7 5500 5497
> 10 5497 5494
> 10 5498 5495
> 26 3 2
> 5861 3 1
> 13714 4 4
> 14130 4 1
> 34914 4 0
> 74299 3 3
> 1518194 2 1
> 2648755 2 2
> 8166358 3 0
> 13019625 2 0
> 62953139 0 0
> 71512938 1 1
> 104294506 1 0
> 
> There are 286 instances where the list has over 3,000 entries and the desired 
> entry is near the end...  That linear search has got to be killing 
> performance, even though it's uncommon  :-(
> 
> The case of num_entries being 0 can be optimized a bit, but is relatively 
> minor.
> 
> Now, I'll see if I can identify the offending List
> 
> Jim
> 
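
For reference, the remove described above amounts to a linear scan with the
instrumentation printed once the loop finishes.  A minimal sketch in that
spirit (the member names entry and num_entries are assumptions for
illustration, not the actual BaseList source):

    #include <cstdio>
    #include <cstring>

    // Sketch of a pointer list with a linear-search remove(), plus the
    // "printf of num_entries and i after the for loop" described above.
    struct SketchList {
        void** entry;      // flat array of untyped pointers
        int num_entries;   // number of elements currently in the list

        void* remove(void* a)
        {
            int i;
            // Linear scan: O(num_entries) when 'a' sits near the end or is absent.
            for ( i = 0; i < num_entries && entry[i] != a; ++i )
                ;

            // Instrumentation: list size and the index where the scan stopped.
            printf("%d %d\n", num_entries, i);

            if ( i == num_entries )
                return nullptr;  // not in the list

            void* old = entry[i];
            // Close the gap by shifting the tail down one slot.
            memmove(&entry[i], &entry[i + 1],
                    (num_entries - i - 1) * sizeof(void*));
            --num_entries;
            return old;
        }
    };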

A for loop over an empty table/set causes the "0 0" entries.  It's something 
related to the "cookies" it uses for iteration.
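
If those "0 0" calls come from iterating empty tables/sets, the short-circuit
Jim mentions would make them essentially free.  Continuing the illustrative
SketchList above (an assumption about the shape of the code, not a patch):

    // Early-out sketch for empty lists: the "0 0" calls would then skip the
    // scan, the instrumentation, and the shift entirely.
    void* remove_guarded(SketchList* l, void* a)
    {
        if ( l->num_entries == 0 )
            return nullptr;   // nothing to find, nothing to shift

        return l->remove(a);  // otherwise fall back to the linear-scan version
    }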

Not sure what causes the "1 1" cases.

What I also find interesting are the "1 0" entries.  I wonder how many of those 
cases are followed up by the list itself being destroyed. 

Something is allocating and then destroying 104,294,506 lists that only ever 
have a single item in them.  That's a lot of work for nothing.
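
Purely to make that scale concrete, the suspected pattern is roughly the
following, repeated 104,294,506 times; the one-element scan is trivial, the
allocation and teardown traffic is not.  This illustrates the hypothesis,
using the SketchList sketch from above rather than the real code:

    // Illustrative only: allocate a list, insert one item, remove it (the
    // "1 0" line), then destroy the list.
    void churn_example()
    {
        for ( long n = 0; n < 104294506; ++n )
        {
            SketchList* l = new SketchList{ new void*[1], 0 };
            l->entry[l->num_entries++] = &n;  // the list's only element
            l->remove(&n);                    // logs "1 0"
            delete[] l->entry;
            delete l;
        }
    }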

— 
Justin Azoff

