[issue26415] Out of memory, trying to parse a 35MB dict

2016-03-08 Thread STINNER Victor
STINNER Victor added the comment:

> So, apparently, it's not the nodes themselves taking up a disproportionate amount of memory -- it's the heap getting so badly fragmented that 89% of its memory allocation is wasted.

Yeah, the Python parser+compiler uses the memory allocator badly. See my …


2016-03-08 Thread A. Skrobov
A. Skrobov added the comment: OK, I've now looked into it with a fresh build of 3.6 trunk on Linux x64. Peak memory usage is about 3KB per node:

$ /usr/bin/time -v ./python -c 'import ast; ast.parse("0,"*1000000, mode="eval")'
Command being timed: "./python -c import ast; ast.parse("0,"*1000000, mode="eval")" …
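As a rough cross-check of that figure, `tracemalloc` can report the Python-level allocations around the parse. This is a sketch, not the measurement from the message above: it only counts traced allocations (not heap fragmentation, so it understates real RSS), and the smaller node count of 100,000 is an arbitrary choice to keep it quick.

```python
# Sketch: estimate the per-item allocation cost of parsing a large tuple
# display. tracemalloc sees traced Python allocations only, not heap
# fragmentation, so this is a lower bound on real memory use.
import ast
import tracemalloc

N = 100_000  # smaller than the 1,000,000 used above, to keep this quick
src = "0," * N

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()
tree = ast.parse(src, mode="eval")
after, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

per_item = (after - before) / N
print(f"~{per_item:.0f} bytes per tuple item (traced allocations only)")
```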


2016-02-27 Thread Serhiy Storchaka
Serhiy Storchaka added the comment: I thought this might be a tokenizer issue, but it is just large memory consumption for the AST tree. Simpler example:

./python -c 'import ast; ast.parse("0,"*1000000, mode="eval")'

It takes over 450MB of memory on a 32-bit system, over 450 bytes per tuple item. In …
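Part of that per-item cost can be itemized from pure Python by summing the shallow sizes of the AST node objects themselves. This sketch is a lower bound only, since it ignores the parser's C-side arena allocations (the bulk of the problem discussed in this thread), and the node count is scaled down arbitrarily.

```python
# Sketch: sum the shallow sizes of every AST node object in the parsed tree.
# This counts only the Python-level node objects, not parser internals.
import ast
import sys

N = 10_000  # scaled down from the 1,000,000-item example above
tree = ast.parse("0," * N, mode="eval")

# ast.walk yields the Expression root, the Tuple, its N constant
# elements, and the Load context node.
total = sum(sys.getsizeof(node) for node in ast.walk(tree))
print(f"~{total / N:.0f} bytes of AST node objects per tuple item")
```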


2016-02-25 Thread A. Skrobov
A. Skrobov added the comment: Yes, I understand that this is a matter of memory consumption, which is why I submitted this ticket as "resource usage". What I don't understand is what could possibly require gigabytes of memory for this task.


2016-02-25 Thread Eryk Sun
Eryk Sun added the comment:

> My Python is 64-bit, but my computer only has 2GB physical RAM.

That explains why it takes half an hour to crash. It's thrashing on page faults. Adding another paging file or increasing the size of your current paging file should allow this to finish parsing... ev…


2016-02-25 Thread A. Skrobov
A. Skrobov added the comment: My Python is 64-bit, but my computer only has 2GB physical RAM.


2016-02-25 Thread Eryk Sun
Eryk Sun added the comment: I don't think this is Windows-related. Are you using 32-bit Python? On Linux, if I limit the process address space to 2 gigs, it crashes almost immediately:

$ ulimit -v 2000000
$ python-dbg -c 'import crash'
Segmentation fault

It runs out of memory while …
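A similar cap can be applied from inside Python with the `resource` module (Unix only). This is a hedged sketch, not the exact reproduction above: the 256 MB cap and the 2,000,000-item input are arbitrary choices, and whether the child fails with a MemoryError or a hard crash depends on where the allocation fails.

```python
# Sketch: reproduce the out-of-memory behaviour under a capped address space
# by running the parse in a child process with RLIMIT_AS lowered (Unix only).
import subprocess
import sys

child_code = """
import resource, ast
# Cap the child's address space at 256 MB (soft and hard limits).
resource.setrlimit(resource.RLIMIT_AS, (256 * 1024 * 1024,) * 2)
try:
    ast.parse("0," * 2_000_000, mode="eval")
    print("parsed")
except MemoryError:
    print("MemoryError")
"""

proc = subprocess.run([sys.executable, "-c", child_code],
                      capture_output=True, text=True)
out = proc.stdout.strip()
print(out if out else f"child exited with code {proc.returncode}")
```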


2016-02-25 Thread A. Skrobov
A. Skrobov added the comment: Mine is on Windows. I've now installed both 2.7.10 and 3.4.3 to reconfirm, and it's still the same on both of them, except that on 3.4.3 it crashes with a MemoryError much faster (within a couple of minutes).

-- components: +Windows nosy: +paul.moore, steve. …


2016-02-25 Thread Christian Heimes
Christian Heimes added the comment:

$ python2.7 --version
Python 2.7.10
$ python3.4 --version
Python 3.4.3
$ uname -r -o
4.3.5-300.fc23.x86_64 GNU/Linux

Intel(R) Core(TM) i7-4900MQ CPU @ 2.80GHz


2016-02-25 Thread Christian Heimes
Christian Heimes added the comment: It takes about 12 seconds to byte-compile crash.py on my computer and less than half a second to import about 28 MB of byte code:

$ rm -rf crash.pyc __pycache__/
$ time python2.7 -c 'import crash'
real    0m11.930s
user    0m9.859s
sys     0m2.085s
$ time p…
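The split between the one-time compile cost and the subsequent import cost can be sketched with `py_compile` on a generated module. The module name `bigdict` and its 10,000 entries are made up for illustration; the real crash.py is far larger.

```python
# Sketch: byte-compile a generated module once, then time an import that can
# reuse the cached .pyc, so the parse/compile cost is paid only up front.
import py_compile
import sys
import time

# Generate a module with a dict of tuple keys (a miniature crash.py).
entries = ", ".join(f"({i}, 0): {i}" for i in range(10_000))
with open("bigdict.py", "w") as f:
    f.write(f"holo_table = {{{entries}}}\n")

t0 = time.perf_counter()
py_compile.compile("bigdict.py", doraise=True)  # writes __pycache__/...pyc
t1 = time.perf_counter()

sys.path.insert(0, ".")
t2 = time.perf_counter()
import bigdict  # finds the fresh .pyc, so no recompilation is needed
t3 = time.perf_counter()

print(f"compile: {t1 - t0:.3f}s, import: {t3 - t2:.3f}s")
```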


2016-02-24 Thread Raymond Hettinger
Changes by Raymond Hettinger: -- nosy: +rhettinger


2016-02-24 Thread A. Skrobov
A. Skrobov added the comment: A practical note: if, instead of importing crash.py, I do a json.loads with a few extra transformations:

with open("crash.py") as f:
    holo_table = {tuple(int(z) for z in k.split(', ')): v
                  for k, v in json.loads(f.readlines()[0][13:].replace('(', '"').replace(')', '"')) …
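Since the quoted snippet is cut off, here is a hedged reconstruction of that workaround on a tiny synthetic file; the trailing `.items()` and the exact file layout are assumptions.

```python
# Sketch of the json.loads workaround quoted above, on a miniature stand-in
# for crash.py. The final .items() is an assumption, since the original
# message is truncated.
import json

# Miniature stand-in with the same shape: "holo_table = {(...): n, ...}"
with open("mini_crash.py", "w") as f:
    f.write('holo_table = {(0, 0, 1): 9, (2, 3, 4): 4}\n')

with open("mini_crash.py") as f:
    line = f.readlines()[0]
    # Strip the 13-character "holo_table = " prefix, turn the tuple keys into
    # JSON strings, then rebuild real tuple keys from the parsed dict.
    raw = json.loads(line[13:].replace('(', '"').replace(')', '"'))
    holo_table = {tuple(int(z) for z in k.split(', ')): v
                  for k, v in raw.items()}

print(holo_table)
```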


2016-02-24 Thread Serhiy Storchaka
Changes by Serhiy Storchaka: -- assignee: -> serhiy.storchaka nosy: +serhiy.storchaka


2016-02-22 Thread A. Skrobov
New submission from A. Skrobov: I have a one-line module that assigns a tuple->int dictionary:

holo_table = {(0, 0, 0, 0, 0, 0, 1, 41, 61, 66, 89): 9, (0, 0, 0, 70, 88, 98, 103, 131, 147, 119, 93): 4, [35MB skipped], (932, 643, 499, 286, 326, 338, 279, 200, 280, 262, 115): 5}

When I try to im…
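For anyone wanting to experiment without the original 35MB file, a scaled-down module of the same shape can be generated like this; the row count and value ranges are arbitrary stand-ins, and only the 11-element integer keys match the examples above.

```python
# Sketch: generate a small module of the same shape as the reported crash.py
# (tuple-of-ints keys mapping to small ints), to experiment with parse cost.
import random
import sys

random.seed(0)  # deterministic stand-in data
rows = []
for _ in range(1_000):  # the real file is ~35 MB; this is a tiny stand-in
    key = tuple(random.randrange(1000) for _ in range(11))
    rows.append(f"{key}: {random.randrange(10)}")

with open("mini_holo.py", "w") as f:
    f.write("holo_table = {" + ", ".join(rows) + "}\n")

sys.path.insert(0, ".")
import mini_holo  # the parse/compile cost is paid here

print(len(mini_holo.holo_table))
```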