In this case there is no leak, and yet alloc fail is still 135.
# mdb -p 4846
Loading modules: [ ld.so.1 libumem.so.1 libc.so.1 libuutil.so.1 ]
> ::umastat
cache                        buf    buf    buf    memory     alloc alloc
name                        size in use  total    in use   succeed  fail
------------------------- ------ ------ ------ --------- --------- -----
umem_magazine_1               16     45    202      8192       248     0
umem_magazine_3               32     95    126      8192       173     0
umem_magazine_7               64    150    168     16384       282     0
umem_magazine_15             128     34     42      8192        60     0
umem_magazine_31             256      0     24      8192        10     0
umem_magazine_47             384      0      9      4096         8     0
umem_magazine_63             512      0      7      4096         8     0
umem_magazine_95             768      0      4      4096         3     0
umem_magazine_143           1152     18     27     36864        36     0
umem_slab_cache               56    366    450     36864       568     0
umem_bufctl_cache             24      0      0         0         0     0
umem_bufctl_audit_cache      192   7088   7110   1617920      7860     0
umem_alloc_8                   8      0      0         0         0     0
umem_alloc_16                 16    294    425     20480      4653     0
umem_alloc_32                 32    641    768     49152    471695     0
umem_alloc_48                 48    978   1122     90112    556139     0
umem_alloc_64                 64   1624   2944    376832   4695289     0
umem_alloc_80                 80    727    756     86016     29302     0
umem_alloc_96                 96    290    320     40960     21999     0
umem_alloc_112               112     38     84     12288     15671     0
umem_alloc_128               128     21     63     12288     20729     0
umem_alloc_160               160      8     42      8192     14648     0
umem_alloc_192               192      3    128     32768      6024     0
umem_alloc_224               224      7     32      8192      3500     0
umem_alloc_256               256      2     24      8192      4297     0
umem_alloc_320               320      3     20      8192      8889     0
umem_alloc_384               384     11     27     12288     10014     0
umem_alloc_448               448      0     16      8192     13490     0
umem_alloc_512               512      0     14      8192       680     0
umem_alloc_640               640      8     22     16384      4789     0
umem_alloc_768               768      9     27     24576      2930     0
umem_alloc_896               896      1     20     20480      3735     0
umem_alloc_1152             1152      6     20     24576       145     0
umem_alloc_1344             1344      1      8     12288        10     0
umem_alloc_1600             1600      0      7     12288        31     0
umem_alloc_2048             2048      1      9     20480       128     0
umem_alloc_2688             2688      9     35    102400      4671     0
umem_alloc_4096             4096      1      7     57344      1147     0
umem_alloc_8192             8192     87    111   1363968     56215     0
umem_alloc_12288           12288     30     34    557056       666     0
umem_alloc_16384           16384      2      3     61440         4     0
------------------------- ------ ------ ------ --------- --------- -----
Total [umem_internal]                          1753088      9256     0
Total [umem_default]                           3055616   5951490     0
------------------------- ------ ------ ------ --------- --------- -----
vmem                         memory     memory    memory     alloc alloc
name                         in use      total    import   succeed  fail
------------------------- --------- ---------- --------- --------- -----
sbrk_top                  216223744  216535040         0      2089   135
sbrk_heap                 216223744  216223744 216223744      2089     0
vmem_internal               1372160    1372160   1372160       173     0
vmem_seg                    1310720    1310720   1310720       160     0
vmem_hash                     22528      24576     24576         6     0
vmem_vmem                     46200      55344     36864        15     0
umem_internal               1871936    1875968   1875968       452     0
umem_cache                    42968      57344     57344        41     0
umem_hash                     58880      61440     61440        35     0
umem_log                     131776     135168    135168         3     0
umem_firewall_va                  0          0         0         0     0
umem_firewall                     0          0         0         0     0
umem_oversize             209684987  209784832 209784832       916     0
umem_memalign                     0          0         0         0     0
umem_default                3055616    3055616   3055616       546     0
------------------------- --------- ---------- --------- --------- -----
> ::help umastat
NAME
umastat - umem allocator stats
SYNOPSIS
::umastat
ATTRIBUTES
Target: proc
Module: libumem.so.1
Interface Stability: Unstable
> ::findleaks -dv
findleaks: maximum buffers => 6898
findleaks: actual buffers => 4981
findleaks:
findleaks: potential pointers => 201355793
findleaks: dismissals => 174141000 (86.4%)
findleaks: misses => 26203919 (13.0%)
findleaks: dups => 1005902 ( 0.4%)
findleaks: follows => 4972 ( 0.0%)
findleaks:
findleaks: elapsed wall time => 7 seconds
findleaks:
BYTES LEAKED VMEM_SEG CALLER
12288 9 fffffd7f8cdf6000 MMAP
16384 1 fffffd7fff20d000 MMAP
4096 1 fffffd7fff1e2000 MMAP
16384 1 fffffd7ffcffb000 MMAP
4096 1 fffffd7ffc7f9000 MMAP
8192 1 fffffd7f8d3ec000 MMAP
8192 1 fffffd7f8d1de000 MMAP
4096 1 fffffd7f8d1ce000 MMAP
8192 1 fffffd7f8cdff000 MMAP
------------------------------------------------------------------------
Total 9 oversized leaks, 81920 bytes
CACHE LEAKED BUFCTL CALLER
----------------------------------------------------------------------
Total 0 buffers, 0 bytes
mmap(2) leak: [fffffd7f8cdf6000, fffffd7f8cdf9000), 12288 bytes
mmap(2) leak: [fffffd7fff20d000, fffffd7fff211000), 16384 bytes
mmap(2) leak: [fffffd7fff1e2000, fffffd7fff1e3000), 4096 bytes
mmap(2) leak: [fffffd7ffcffb000, fffffd7ffcfff000), 16384 bytes
mmap(2) leak: [fffffd7ffc7f9000, fffffd7ffc7fa000), 4096 bytes
mmap(2) leak: [fffffd7f8d3ec000, fffffd7f8d3ee000), 8192 bytes
mmap(2) leak: [fffffd7f8d1de000, fffffd7f8d1e0000), 8192 bytes
mmap(2) leak: [fffffd7f8d1ce000, fffffd7f8d1cf000), 4096 bytes
mmap(2) leak: [fffffd7f8cdff000, fffffd7f8ce01000), 8192 bytes
> ::quit
#
-----Original Message-----
From: ext David Lutz [mailto:[email protected]]
Sent: Friday, January 16, 2009 6:38 PM
To: Pavesi, Valdemar (NSN - US/Boca Raton)
Cc: venkat; [email protected]
Subject: Re: RE: [dtrace-discuss] C++ Applications with Dtrace
If I understand it correctly, the alloc fail count for sbrk_top is
just an indication that the heap had to be grown, which is different
from other failures, which would indicate that we ran out of
memory.
Have a look at:
http://src.opensolaris.org/source/xref/onnv/onnv-gate/usr/src/lib/libumem/common/vmem_sbrk.c
David
----- Original Message -----
From: "Pavesi, Valdemar (NSN - US/Boca Raton)" <[email protected]>
Date: Friday, January 16, 2009 3:24 pm
> Hello,
>
> I have an example of a memory leak.
>
> What does alloc fail = 335 mean?
>
>
> # mdb -p 1408
> Loading modules: [ ld.so.1 libumem.so.1 libc.so.1 libuutil.so.1 ]
> > ::findleaks -dv
> findleaks: maximum buffers => 14920
> findleaks: actual buffers => 14497
> findleaks:
> findleaks: potential pointers => 316574898
> findleaks: dismissals => 309520985 (97.7%)
> findleaks: misses => 6929221 ( 2.1%)
> findleaks: dups => 110601 ( 0.0%)
> findleaks: follows => 14091 ( 0.0%)
> findleaks:
> findleaks: elapsed wall time => 54 seconds
> findleaks:
> BYTES LEAKED VMEM_SEG CALLER
> 4096 4 fffffd7ffc539000 MMAP
> 16384 1 fffffd7ffe83d000 MMAP
> 4096 1 fffffd7ffe812000 MMAP
> 8192 1 fffffd7ffd7bc000 MMAP
> 24016 397 124a2a0 libstdc++.so.6.0.8`_Znwm+0x1e
> ------------------------------------------------------------------------
> Total 401 oversized leaks, 9567120 bytes
>
> CACHE LEAKED BUFCTL CALLER
> 00000000004cf468 1 000000000050ed20 libstdc++.so.6.0.8`_Znwm+0x1e
> 00000000004cf468 1 000000000050c000 libstdc++.so.6.0.8`_Znwm+0x1e
> 00000000004cf468 1 000000000050ea80 libstdc++.so.6.0.8`_Znwm+0x1e
> 00000000004cf468 1 000000000050c0e0 libstdc++.so.6.0.8`_Znwm+0x1e
> 00000000004cf468 1 000000000050ee00 libstdc++.so.6.0.8`_Znwm+0x1e
> ----------------------------------------------------------------------
> Total 5 buffers, 80 bytes
>
> mmap(2) leak: [fffffd7ffc539000, fffffd7ffc53a000), 4096 bytes
> mmap(2) leak: [fffffd7ffe83d000, fffffd7ffe841000), 16384 bytes
> mmap(2) leak: [fffffd7ffe812000, fffffd7ffe813000), 4096 bytes
> mmap(2) leak: [fffffd7ffd7bc000, fffffd7ffd7be000), 8192 bytes
> umem_oversize leak: 397 vmem_segs, 24016 bytes each, 9534352 bytes total
> ADDR TYPE START END SIZE THREAD TIMESTAMP
> 124a2a0 ALLC 1252000 1257dd0 24016 1 56bd6f2a6fe1
>                  libumem.so.1`vmem_hash_insert+0x90
>                  libumem.so.1`vmem_seg_alloc+0x1c4
>                  libumem.so.1`vmem_xalloc+0x50b
>                  libumem.so.1`vmem_alloc+0x15a
>                  libumem.so.1`umem_alloc+0x60
>                  libumem.so.1`malloc+0x2e
>                  libstdc++.so.6.0.8`_Znwm+0x1e
>                  libstdc++.so.6.0.8`_Znam+9
>
>
>
> > ::umastat
> cache                        buf    buf    buf    memory     alloc alloc
> name                        size in use  total    in use   succeed  fail
> ------------------------- ------ ------ ------ --------- --------- -----
> umem_magazine_1               16      5    101      4096         6     0
> umem_magazine_3               32    356    378     24576       356     0
> umem_magazine_7               64     20     84      8192        92     0
> umem_magazine_15             128     11     21      4096        11     0
> umem_magazine_31             256      0      0         0         0     0
> umem_magazine_47             384      0      0         0         0     0
> umem_magazine_63             512      0      0         0         0     0
> umem_magazine_95             768      0      0         0         0     0
> umem_magazine_143           1152      0      0         0         0     0
> umem_slab_cache               56    638    650     53248       638     0
> umem_bufctl_cache             24      0      0         0         0     0
> umem_bufctl_audit_cache      192  15328  15336   3489792     15328     0
> umem_alloc_8                   8      0      0         0         0     0
> umem_alloc_16                 16     79    170      8192   2098631     0
> umem_alloc_32                 32    267    320     20480       306     0
> umem_alloc_48                 48   4653   4692    376832      6028     0
> umem_alloc_64                 64   5554   5568    712704     12642     0
> umem_alloc_80                 80   2492   2520    286720      5185     0
> umem_alloc_96                 96    492    512     65536       654     0
> umem_alloc_112               112     95    112     16384       103     0
> umem_alloc_128               128     38     42      8192        42     0
> umem_alloc_160               160     12     21      4096        86     0
> umem_alloc_192               192      2     16      4096         2     0
> umem_alloc_224               224      5     16      4096       848     0
> umem_alloc_256               256      1     12      4096         1     0
> umem_alloc_320               320      7   1010    413696    560719     0
> umem_alloc_384               384     34     36     16384        41     0
> umem_alloc_448               448      5      8      4096        10     0
> umem_alloc_512               512      1      7      4096         2     0
> umem_alloc_640               640     11     22     16384        16     0
> umem_alloc_768               768      2      9      8192       424     0
> umem_alloc_896               896      1      4      4096         2     0
> umem_alloc_1152             1152     11     20     24576       127     0
> umem_alloc_1344             1344      4     40     61440     17179     0
> umem_alloc_1600             1600      3      7     12288         5     0
> umem_alloc_2048             2048      2      9     20480         6     0
> umem_alloc_2688             2688      5      7     20480        10     0
> umem_alloc_4096             4096      6      7     57344       335     0
> umem_alloc_8192             8192    118    119   1462272       565     0
> umem_alloc_12288           12288     20     21    344064       485     0
> umem_alloc_16384           16384      1      1     20480         1     0
> ------------------------- ------ ------ ------ --------- --------- -----
> Total [umem_internal]                          3584000     16431     0
> Total [umem_default]                           4001792   2704455     0
> ------------------------- ------ ------ ------ --------- --------- -----
>
> vmem                         memory     memory    memory     alloc alloc
> name                         in use      total    import   succeed  fail
> ------------------------- --------- ---------- --------- --------- -----
> sbrk_top                   25309184   25399296         0      3192   335
> sbrk_heap                  25309184   25309184  25309184      3192     0
> vmem_internal               2965504    2965504   2965504       366     0
> vmem_seg                    2875392    2875392   2875392       351     0
> vmem_hash                     51200      53248     53248         7     0
> vmem_vmem                     46200      55344     36864        15     0
> umem_internal               3788864    3792896   3792896       900     0
> umem_cache                    42968      57344     57344        41     0
> umem_hash                    142336     147456    147456        36     0
> umem_log                     131776     135168    135168         3     0
> umem_firewall_va                  0          0         0         0     0
> umem_firewall                     0          0         0         0     0
> umem_oversize              14130869   14413824  14413824      1286     0
> umem_memalign                     0          0         0         0     0
> umem_default                4001792    4001792   4001792       638     0
> ------------------------- --------- ---------- --------- --------- -----
> >
>
>
> -----Original Message-----
> From: [email protected]
> [mailto:[email protected]] On Behalf Of ext David
> Lutz
> Sent: Friday, January 16, 2009 6:07 PM
> To: venkat
> Cc: [email protected]
> Subject: Re: [dtrace-discuss] C++ Applications with Dtrace
>
> Hi Venkat,
>
> I believe "alloc succeed" is a count of memory requests that
> were successful. That memory may have been freed later,
> so it doesn't necessarily point to the reason for a growing
> memory footprint. The column to be concerned with is
> "memory in use".
>
> David
>
> ----- Original Message -----
> From: venkat <[email protected]>
> Date: Friday, January 16, 2009 2:44 pm
>
> > Hi David,
> >
> > What is the "alloc succeed" column in the umastat dcmd output?
> > That value keeps increasing. Is that memory occupied by the
> > process? Is that why my process's memory usage also keeps
> > increasing?
> >
> > Can you clarify, please?
> >
> > Thanks,
> > Venkat
> > --
> > This message posted from opensolaris.org
> > _______________________________________________
> > dtrace-discuss mailing list
> > [email protected]