> Did you change any #defines in the source code?  Is this a clean 3.6.1 build?


#define VG_N_SEGMENTS 50000

Plus a failed attempt at extending Valgrind to use more than 32 GB:

#  if VG_WORDSIZE == 8
///gczajkow     aspacem_maxAddr = (Addr)0x800000000 - 1; // 32G
///http://thread.gmane.org/gmane.comp.debugging.valgrind/7584/focus=7602
     aspacem_maxAddr = (Addr)(0x800000000ULL << 2) - 1; // 128GB
#  define N_PRIMARY_BITS  21
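For context, the two constants above are linked: Memcheck's shadow-memory primary map has 2^N_PRIMARY_BITS entries, each covering a 64 KiB secondary map, so the address-space ceiling (aspacem_maxAddr) and the primary map must grow together. A quick sketch of the arithmetic (the constant names mirror the Valgrind sources, but the script itself is only illustrative, assuming the stock 64-bit value is N_PRIMARY_BITS = 19):

```python
# Illustrative sketch, not Valgrind source: check that the primary map
# covers the requested address space. Each primary-map entry covers a
# 64 KiB secondary map, so coverage = 2**N_PRIMARY_BITS * 64 KiB.

KIB = 1024
SECONDARY_COVERAGE = 64 * KIB  # bytes covered per primary-map entry

def primary_map_coverage(n_primary_bits: int) -> int:
    """Bytes of address space covered by the primary map."""
    return (1 << n_primary_bits) * SECONDARY_COVERAGE

# Stock 64-bit build (assumed N_PRIMARY_BITS = 19) -> 32 GiB,
# matching aspacem_maxAddr = (Addr)0x800000000 - 1.
assert primary_map_coverage(19) == 32 * KIB**3

# Patched build (N_PRIMARY_BITS = 21) -> 128 GiB,
# matching aspacem_maxAddr = (Addr)(0x800000000ULL << 2) - 1.
assert primary_map_coverage(21) == 128 * KIB**3
assert primary_map_coverage(21) == (0x800000000 << 2)
```

This is why bumping aspacem_maxAddr by a factor of 4 goes together with raising N_PRIMARY_BITS by 2.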

Otherwise, Valgrind errors out with:

==7560==     Valgrind's memory management: out of memory:
==7560==        newSuperblock's request for 4194304 bytes failed.
==7560==        33731608576 bytes have already been allocated.
==7560==     Valgrind cannot continue.  Sorry.

Our processes under Valgrind consume more than 32 GB of memory; how can its
address space be expanded to 128 GB?

Thanks
Greg


_______________________________________________
Valgrind-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/valgrind-users
