[issue25299] TypeError: __init__() takes at least 4 arguments (4 given)

2019-08-17 Thread A. Skrobov
A. Skrobov added the comment: Joannah, I see that under #25314, the docs were updated to match the implementation: https://github.com/python/cpython/commit/b4912b8ed367e540ee060fe912f841cc764fd293 On the other hand, the discussion here (from 2015) and on #25314 (from 2016) includes

[issue36256] parser module fails on legal input

2019-04-05 Thread A. Skrobov
A. Skrobov added the comment: Is it intentional that the fix is not backported to 3.6 as well?

[issue36440] more helpful diagnostics for parser module

2019-03-26 Thread A. Skrobov
A. Skrobov added the comment:
> Nothing was really "decided", just that meanwhile is better not to ship a
> broken parser module.
Totally true, but the issue is closed and resolved, meaning that no one will ever look at it again.

[issue36440] more helpful diagnostics for parser module

2019-03-26 Thread A. Skrobov
New submission from A. Skrobov: Seeing that the implicit resolution at #36256 was to keep the parser module in place, may I suggest that the diagnostics it produces be improved, so that instead of "Expected node type 305, got 11", it would raise "Expected namedexpr_test, got COLON"?
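
A hedged sketch of how such a symbolic diagnostic could be produced (node_name is a hypothetical helper for illustration, not the code from the actual pull request): the standard-library symbol and token modules already map grammar and token numbers to names, so the message only needs a lookup.

    # Minimal sketch, assuming a Python version that still ships the symbol
    # and parser modules (<= 3.9).
    import symbol
    import token

    def node_name(num):
        # Non-terminal grammar symbols start at 256; smaller numbers are tokens.
        if num >= 256:
            return symbol.sym_name.get(num, str(num))
        return token.tok_name.get(num, str(num))

    # On 3.8: node_name(305) -> 'namedexpr_test', node_name(11) -> 'COLON'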

[issue36440] more helpful diagnostics for parser module

2019-03-26 Thread A. Skrobov
Change by A. Skrobov: -- keywords: +patch pull_requests: +12510 stage: -> patch review

[issue36256] parser module fails on legal input

2019-03-11 Thread A. Skrobov
A. Skrobov added the comment:
> The major problem with the parser module is that is unsynchronized with the
> actual parser
The parser module is "sort of" synchronised with the actual parser, in that it uses the same _PyParser_Grammar; this doesn't mean they always agree.

[issue36256] parser module fails on legal input

2019-03-10 Thread A. Skrobov
New submission from A. Skrobov: Under #26526, I had optimised the validation code in the parser module to use the actual parser DFAs, but my code considers only the token types in the input, and doesn't distinguish between different NAMEs (e.g. different keywords). The result is this: Python
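
For context, a rough illustration of the distinction that check glosses over (labels_match is a hypothetical helper, not code from #26526): in the generated label table a keyword is effectively a (NAME, "keyword") pair, while a generic identifier is (NAME, None), so matching on the token type alone can send validation down the wrong DFA arc and reject trees the real parser accepts.

    # Minimal sketch, assuming labels are (type, string) pairs as in the
    # pgen-generated label table; purely illustrative.
    from token import NAME

    def labels_match(label, tok_type, tok_str):
        lbl_type, lbl_str = label
        if lbl_type != tok_type:
            return False
        if lbl_type == NAME and lbl_str is not None:
            # Keyword label: the token text must be that exact keyword.
            return tok_str == lbl_str
        return True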

[issue26415] Excessive peak memory consumption by the Python parser

2018-12-14 Thread A. Skrobov
A. Skrobov added the comment: I've run pyperformance (0.7.0) with my updated patch, and posted results at the PR page. They look encouraging enough.

[issue26415] Excessive peak memory consumption by the Python parser

2018-12-06 Thread A. Skrobov
A. Skrobov added the comment: @Serhiy: incredibly, this patch from 2.5 years ago required very minor changes to apply to the latest master. Shows how ossified the parser is :-) Now posted as https://github.com/python/cpython/pull/10995

[issue26415] Excessive peak memory consumption by the Python parser

2017-07-24 Thread A. Skrobov
A. Skrobov added the comment: To bump this year-old issue, I have delivered a talk at EuroPython 2017, explaining what my patch does, how it does what it does, and why it's a good thing to do. You can watch my talk at https://www.youtube.com/watch?v=dyRDmcsTwhE&t=1h52m38s

[issue26526] In parsermodule.c, replace over 2KLOC of hand-crafted validation code, with a DFA

2017-01-25 Thread A. Skrobov
A. Skrobov added the comment: Oh btw, the comment in the beginning of Grammar/Grammar
> # Note: Changing the grammar specified in this file will most likely
> # require corresponding changes in the parser module
> # (../Modules/parsermodule.c).
is no longer true.

[issue26415] Excessive peak memory consumption by the Python parser

2016-09-05 Thread A. Skrobov
A. Skrobov added the comment: Updated the comment for Init_ValidationGrammar() -- Added file: http://bugs.python.org/file44370/patch

[issue26415] Excessive peak memory consumption by the Python parser

2016-08-15 Thread A. Skrobov
A. Skrobov added the comment: Xavier, the big picture description of my patch is in http://bugs.python.org/file43665/devguide_patch The heap fragmentation was observed by Victor, not by myself. Victor, could you please create a new ticket for your python_memleak.py reproducer?

[issue26526] In parsermodule.c, replace over 2KLOC of hand-crafted validation code, with a DFA

2016-08-15 Thread A. Skrobov
A. Skrobov added the comment: Thanks Xavier! Yes, this is the same DFA that's used by the main Python parser. For some reason, parsermodule didn't previously reuse it, but instead did its own thing. Any volunteers to review the other patch, for the Python parser, at http://bugs.python.org

[issue26415] Excessive peak memory consumption by the Python parser

2016-07-08 Thread A. Skrobov
A. Skrobov added the comment: Fixing whitespace in the patch, and including an update for the docs -- Added file: http://bugs.python.org/file43664/patch

[issue26415] Excessive peak memory consumption by the Python parser

2016-07-08 Thread A. Skrobov
Changes by A. Skrobov <tyomi...@gmail.com>: Added file: http://bugs.python.org/file43665/devguide_patch

[issue26415] Excessive peak memory consumption by the Python parser

2016-06-07 Thread A. Skrobov
A. Skrobov added the comment: (Updating the issue title, to avoid confusion between two orthogonal concerns) -- title: Fragmentation of the heap memory in the Python parser -> Excessive peak memory consumption by the Python parser

[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-07 Thread A. Skrobov
A. Skrobov added the comment: My current patch avoids the memory peak *and* doesn't add any memory fragmentation on top of whatever is already there. In other words, it makes the parser better in this one aspect, and it doesn't make it worse in any aspect.

[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-07 Thread A. Skrobov
A. Skrobov added the comment: An arena might help reclaim the memory once the parsing is complete, but it wouldn't reduce the peak memory consumption by the parser, and so it wouldn't prevent a MemoryError when parsing a 35MB source on a PC with 2GB of RAM.

[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-06 Thread A. Skrobov
A. Skrobov added the comment: Now that #26526 landed (thanks to everybody involved!), I'm requesting a review on an updated version of my patch, which addresses the excessive memory consumption by the parser. The description of my original patch still applies: > The attached pa

[issue26526] In parsermodule.c, replace over 2KLOC of hand-crafted validation code, with a DFA

2016-05-31 Thread A. Skrobov
Changes by A. Skrobov <tyomi...@gmail.com>: -- keywords: +patch Added file: http://bugs.python.org/file43069/issue26526_16704_63395.diff

[issue26526] In parsermodule.c, replace over 2KLOC of hand-crafted validation code, with a DFA

2016-05-31 Thread A. Skrobov
A. Skrobov added the comment: Thank you Fred for your review! I don't have commit access myself; can anybody please commit it for me?

[issue26415] Fragmentation of the heap memory in the Python parser

2016-05-12 Thread A. Skrobov
A. Skrobov added the comment: Ping? This patch is two months old now.

[issue26526] In parsermodule.c, replace over 2KLOC of hand-crafted validation code, with a DFA

2016-05-12 Thread A. Skrobov
A. Skrobov added the comment: Ping? This patch is two months old now.

[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-10 Thread A. Skrobov
A. Skrobov added the comment: I've now tried it with "perf.py -r -m", and the memory savings are as follows:
### 2to3 ###
Mem max: 45976.000 -> 47440.000: 1.0318x larger
### chameleon_v2 ###
Mem max: 436968.000 -> 401088.000: 1.0895x smaller
### django_v3 ###
Mem max: 23808.

[issue26526] In parsermodule.c, replace over 2KLOC of hand-crafted validation code, with a DFA

2016-03-10 Thread A. Skrobov
New submission from A. Skrobov: Updating Modules/parsermodule.c for every change in grammar and/or parser is a maintenance burden (listed as such at https://docs.python.org/devguide/grammar.html). The attached patch lets the validation code use the auto-generated DFA structures, thus ensuring that the validation stays in sync with the grammar.
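
A rough conceptual sketch of what "validation driven by the DFAs" means (hypothetical Python pseudocode with made-up names such as dfas/arcs/accepting; the actual patch is C code against the structures generated into graminit.c):

    def validate_node(node, dfas):
        dfa = dfas[node.type]                 # DFA for this non-terminal
        state = dfa.initial
        for child in node.children:
            if child.type not in dfa.arcs[state]:
                raise SyntaxError("Illegal node type %d in %d"
                                  % (child.type, node.type))
            state = dfa.arcs[state][child.type]
            if child.type >= 256:             # non-terminal: validate recursively
                validate_node(child, dfas)
        if state not in dfa.accepting:
            raise SyntaxError("Incomplete production for node type %d" % node.type)

The point is that this table-driven walk replaces per-production hand-written checks, so a grammar change only requires regenerating the tables.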

[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-09 Thread A. Skrobov
A. Skrobov added the comment: The attached patch for the parser reduces "Maximum resident set size (kbytes)" threefold, for the degenerate example of 'import ast; ast.parse("0,"*100, mode="eval")', by eliminating many CST nodes that have a single child. Acco
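
For anyone curious what "CST nodes that have a single child" look like: the parser module exposes the same concrete syntax tree, so even a trivial expression shows the long one-child chains (this assumes a Python version that still ships the parser module, <= 3.9; the exact numbers vary with the grammar version):

    import parser
    import pprint

    # Every nesting level below eval_input (258) wraps exactly one child until
    # the NUMBER token (2, '0') is reached -- these are the chains the patch
    # collapses.
    pprint.pprint(parser.expr("0").totuple())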

[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-08 Thread A. Skrobov
A. Skrobov added the comment: @Serhiy: if your build is 32-bit, then every node is half the size, as it mostly consists of pointers. The amount of heap fragmentation can also depend on gcc/glibc version. -- ___ Python tracker <

[issue26415] Out of memory, trying to parse a 35MB dict

2016-03-08 Thread A. Skrobov
A. Skrobov added the comment: OK, I've now looked into it with a fresh build of 3.6 trunk on Linux x64. Peak memory usage is about 3KB per node:
$ /usr/bin/time -v ./python -c 'import ast; ast.parse("0,"*100, mode="eval")'
Command being timed: "./python -c

[issue26415] Out of memory, trying to parse a 35MB dict

2016-02-25 Thread A. Skrobov
A. Skrobov added the comment: Yes, I understand that this is a matter of memory consumption, which is why I submitted this ticket as "resource usage". What I don't understand is what could possibly require gigabytes of memory for parsing a 35MB file.

[issue26415] Out of memory, trying to parse a 35MB dict

2016-02-25 Thread A. Skrobov
A. Skrobov added the comment: My Python is 64-bit, but my computer only has 2GB physical RAM.

[issue26415] Out of memory, trying to parse a 35MB dict

2016-02-25 Thread A. Skrobov
A. Skrobov added the comment: Mine is on Windows. I've now installed both 2.7.10 and 3.4.3 to reconfirm, and it's still the same on both of them, except that on 3.4.3 it crashes with a MemoryError much faster (within a couple of minutes). -- components: +Windows nosy: +paul.moore

[issue26415] Out of memory, trying to parse a 35MB dict

2016-02-24 Thread A. Skrobov
A. Skrobov added the comment: A practical note: if, instead of importing crash.py, I do a json.loads, with a few extra transformations: with open("crash.py") as f: holo_table={tuple(int(z) for z in k.split(', ')):v for k,v in json.loads(f.readlines()[0][13:].replace('(','&
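
The snippet above is cut off in the archive; a plausible completion, assuming the "few extra transformations" simply turn the (...) tuple keys into JSON string keys (the exact replacements from the original message are not recoverable here):

    import json

    # Hypothetical reconstruction: quote the tuple keys so the literal becomes
    # a JSON object, then rebuild integer tuples after json.loads().
    with open("crash.py") as f:
        text = f.readlines()[0][13:]                 # drop "holo_table = "
    text = text.replace("(", '"').replace(")", '"')  # (1, 2): 3 -> "1, 2": 3
    holo_table = {tuple(int(z) for z in k.split(", ")): v
                  for k, v in json.loads(text).items()}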

[issue26415] Out of memory, trying to parse a 35MB dict

2016-02-22 Thread A. Skrobov
New submission from A. Skrobov: I have a one-line module that assigns a tuple->int dictionary: holo_table = {(0, 0, 0, 0, 0, 0, 1, 41, 61, 66, 89): 9, (0, 0, 0, 70, 88, 98, 103, 131, 147, 119, 93): 4, [35MB skipped], (932, 643, 499, 286, 326, 338, 279, 200, 280, 262, 115): 5} When I try to import this module, Python runs out of memory.
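
For readers without the original crash.py, a hypothetical generator for a file of the same shape (11-element integer-tuple keys mapping to small ints); the entry count and value ranges are guesses chosen only to reach a few tens of megabytes, not figures from the report:

    import random

    with open("crash.py", "w") as f:
        f.write("holo_table = {")
        for _ in range(700000):
            key = ", ".join(str(random.randrange(1000)) for _ in range(11))
            f.write("(%s): %d, " % (key, random.randrange(10)))
        f.write("}")   # the trailing comma is legal in a Python dict literal

Importing the resulting module then exercises the same parser code path described in this report.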

[issue25299] TypeError: __init__() takes at least 4 arguments (4 given)

2015-10-02 Thread A. Skrobov
New submission from A. Skrobov:
Python 2.7.3 (default, Dec 18 2014, 19:10:20) [GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> from argparse import ArgumentParser
>>> parser = ArgumentPa
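
For readers puzzled by the "(4 given)" wording: on Python 2 both self and the supplied keyword arguments are counted as "given", so a missing *required* argument can be reported with an apparently sufficient count. A hypothetical class shaped like an argparse action (not the actual argparse internals, and not necessarily the exact call from this report, which is cut off above):

    # Python 2 behaviour: `const` is required, yet the error says "4 given"
    # because self plus the three keyword arguments are all counted.
    class Action(object):
        def __init__(self, option_strings, dest, const, default=None):
            self.const = const

    Action(option_strings=['--foo'], dest='foo', default=None)
    # TypeError: __init__() takes at least 4 arguments (4 given)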

[issue25299] TypeError: __init__() takes at least 4 arguments (4 given)

2015-10-02 Thread A. Skrobov
A. Skrobov added the comment: Thank you for confirming that the mismatch between the documentation and the behaviour is preserved in Python 3! Adding it to the list of affected versions. -- versions: +Python 3.5

[issue23592] SIGSEGV on interpreter shutdown, with daemon threads running wild

2015-03-06 Thread A. Skrobov
A. Skrobov added the comment: That's right; and working around this issue, by taming the daemon threads a bit, wasn't too difficult. Still, if daemon threads are part of the language, they shouldn't crash the interpreter process, I suppose.

[issue23592] SIGSEGV on interpreter shutdown, with daemon threads running wild

2015-03-05 Thread A. Skrobov
New submission from A. Skrobov: I'm observing that this line of code: https://hg.python.org/cpython/file/ec9bffc35cad/Python/ceval.c#l3010 -- causes a SIGSEGV on interpreter shutdown, after running some really convoluted Python code with daemon threads running wild. At the time of the crash
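
Not the reporter's "really convoluted" code, but a hedged sketch of the general pattern that exercises this shutdown race: daemon threads that keep executing bytecode while the main thread falls off the end and the interpreter starts tearing itself down. Whether it actually segfaults depends on version and timing.

    import threading
    import time

    def spin():
        while True:
            [x * x for x in range(1000)]   # keep allocating and running bytecode
            time.sleep(0)

    for _ in range(16):
        t = threading.Thread(target=spin)
        t.daemon = True
        t.start()
    # main thread exits here; interpreter shutdown races against the daemons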