Valentin Kuznetsov <vkuz...@gmail.com> added the comment:

Hi,
I'm sorry for the delay, I was busy. Here is a test data file:
http://www.lns.cornell.edu/~vk/files/mangled.json

Its size is 150 MB, 50 MB less than the original, due to the value 
scrambling I had to do.

With the stock json module in Python 2.6.2, the test peaks at about 2 GB of RAM:

import json

source = open('mangled.json', 'r')
data = json.load(source)
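
For reference, here is a minimal sketch of how such a peak-memory number can 
be obtained; this is just an illustration (assuming Linux, where 
resource.getrusage reports ru_maxrss in kilobytes), not the exact harness I used:

import json
import resource

source = open('mangled.json', 'r')
data = json.load(source)
source.close()

# Peak resident set size of this process; ru_maxrss is in KB on Linux.
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print 'peak RSS: %.1f MB' % (peak_kb / 1024.0)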

Using simplejson 2.0.9 from PyPI I saw the same behaviour; please note that 
the _speedups.so C extension module was compiled.

Using the cjson module, I observed 180 MB of RAM utilization:

import cjson

source = open('mangled.json', 'r')
data = cjson.decode(source.read())

cjson is about 10 times faster!

I refactored the code that deals with the XML version of the same data and 
was able to process it with cElementTree using only 20 MB (!) of RAM.
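
Roughly, the idea is to stream the document with iterparse and clear elements 
once they are handled, instead of building the whole tree; a minimal sketch 
(the file name and the 'record' tag are placeholders, not the real schema):

import xml.etree.cElementTree as ET

# Stream the XML instead of loading it all into memory at once.
count = 0
for event, elem in ET.iterparse('mangled.xml', events=('end',)):
    if elem.tag == 'record':   # placeholder tag name
        count += 1             # real processing of the record goes here
        elem.clear()           # release the element's children to keep memory flat
print 'processed %d records' % count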

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue6594>
_______________________________________