Investigate large heap size when writing many objects to an Avro data file (Python)
-----------------------------------------------------------------------------------

                 Key: AVRO-429
                 URL: https://issues.apache.org/jira/browse/AVRO-429
             Project: Avro
          Issue Type: Bug
          Components: python
    Affects Versions: 1.3.0
            Reporter: R. Tyler Ballance


When logging ~13k entries via the Python client, I'm seeing abnormally large memory usage:
{code}
>>> from guppy import hpy; hp=hpy(); hp.heap()
Partition of a set of 127501 objects. Total size = 18102576 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0  58443  46  5227392  29   5227392  29 str
     1  31964  25  2723312  15   7950704  44 tuple
     2    422   0  1118096   6   9068800  50 dict of module
     3   1209   1  1007064   6  10075864  56 dict (no owner)
     4   1111   1  1002608   6  11078472  61 type
     5   8253   6   990360   5  12068832  67 function
     6   8225   6   987000   5  13055832  72 types.CodeType
     7   1111   1   797992   4  13853824  77 dict of type
     8    377   0   358232   2  14212056  79 dict of class
     9    103   0   336040   2  14548096  80 dict of django.db.models.fields.CharField
<448 more rows. Type e.g. '_.more' to view.>
>>>
{code}
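For comparison, a rough way to quantify retained memory per record without guppy is the stdlib's tracemalloc module. This is only a sketch: the record shape below is hypothetical (the real records come from the application writing to the Avro data file), and it measures plain Python dicts retained in a list, not the Avro writer itself.

```python
import tracemalloc

def measure_append(n):
    """Append n record-like dicts and return the heap delta tracemalloc sees."""
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    records = []
    for i in range(n):
        # Hypothetical record shape; stands in for real log entries.
        records.append({"id": i, "message": "entry %d" % i})
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return after - before

if __name__ == "__main__":
    delta = measure_append(13000)
    print("approx bytes retained for 13k records: %d" % delta)
```

If the Avro writer's footprint grows far beyond what a comparable measurement like this predicts, that would point at references being retained inside the writer (or the schema/validation machinery) rather than at the records themselves.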


-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
