On 19 August 2016 at 18:27, Eric V. Smith <e...@trueblade.com> wrote:
> For something else that would become significantly more complicated to
> implement, you need look no further than the stdlib's own tokenize module.
> So Python itself would require changes to the parsers/lexers in Python/ast.c,
> IDLE, and Lib/tokenize.py. In addition, it would require adding tokens to
> Include/token.h and the generated Lib/token.py, and everyone using those
> files would need to adapt.
>
> Not that it's impossible, of course. But don't underestimate the amount of
> work this proposal would cause to the many places in and outside of Python
> that examine Python code.

And if folks want to do something more clever than regex-based,
single-colour string highlighting, Python's own ast module is
available to help them out:

    >>> import ast
    >>> tree = ast.parse("f'example{parsing:formatting}and trailer'")
    >>> ast.dump(tree)
    "Module(body=[Expr(value=JoinedStr(values=[Str(s='example'),
    FormattedValue(value=Name(id='parsing', ctx=Load()), conversion=-1,
    format_spec=Str(s='formatting')), Str(s='and trailer')]))])"

Extracting the location of the field expression for syntax highlighting:

    >>> ast.dump(tree.body[0].value.values[1].value)
    "Name(id='parsing', ctx=Load())"

(I haven't shown it in the example, but AST nodes have lineno and
col_offset fields, so you can locate the corresponding section of the
original source code for processing.)
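
As a rough sketch of how a highlighter might use those fields (the
exact offsets reported for expressions nested inside f-strings have
varied between CPython versions, so treat the printed values as purely
illustrative):

    import ast

    source = "f'example{parsing:formatting}and trailer'"
    tree = ast.parse(source)

    # Report the position of every Name node so a highlighter can map
    # the field expressions back to the original source text.
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            print(node.id, node.lineno, node.col_offset)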

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia
