I would note that the accepted [PEP 626](https://peps.python.org/pep-0626/)
explicitly constrains line-tracing behavior: """Python should guarantee that
when tracing is turned on, “line” tracing events are generated for all lines of
code executed and only for lines of code that are executed."""
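For anyone who wants to watch those events directly, the existing sys.settrace
machinery already shows them; the PEP only constrains when the 'line' events
may fire:

import sys

def tracer(frame, event, arg):
    if event == 'line':
        print('line event at line', frame.f_lineno)
    return tracer            # keep receiving events for this frame

def f():
    x = 1
    y = 2
    return x + y

sys.settrace(tracer)
f()
sys.settrace(None)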
I've heard "evaluation map" for a related mathematical concept: the natural map
from X to (X -> Y) -> Y in some cartesian closed category (whatever that means
:-), like the natural embedding of a vector space into its double dual space,
or like this sort of eval_at function that you can then plug in.
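That is, something like this toy sketch:

def eval_at(x):
    # The "evaluation map": send x to the function that evaluates things at x.
    return lambda f: f(x)

fns = [abs, float, str, lambda n: n * n]
print([eval_at(3)(f) for f in fns])       # [3, 3.0, '3', 9]
print(list(map(eval_at(-5), fns)))        # [5, -5.0, '-5', 25]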
The non-associativity isn't just signed zeros: regular floating point addition
and multiplication of positive numbers are not associative:
from math import nextafter

# smallest double greater than 1.0
x = nextafter(1.0, 2.0)
print((x + x) + 1.0, x + (x + 1.0))    # 3.0000000000000004 3.0

# largest double less than 1.0
x2 = nextafter(1.0, 0.0)
print((x * x2) * x2, x * (x2 * x2))    # 0.9999999999999999 1.0
I don't know how common it is to want to distinguish between string
representations of integers and those of floats. It seems like a function that
could have a million little variations: What if my boss says that now my input
integers can have commas? Or have extra space added? Or if integer-valued
floats like "2.0" should count as integers?
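Concretely, one of those million variations might look like this (arbitrary
name, arbitrary rules, purely for illustration):

def classify_number_string(s):
    # Arbitrary rules: tolerate surrounding whitespace, allow commas in
    # integers, and call everything else that parses as a float a "float".
    s = s.strip()
    try:
        int(s.replace(',', ''))
        return 'int'
    except ValueError:
        pass
    try:
        float(s)
        return 'float'
    except ValueError:
        return 'neither'

print([classify_number_string(t) for t in ['12', ' 1,000 ', '3.5', '2.0', 'spam']])
# ['int', 'int', 'float', 'float', 'neither']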
IIUC a lot of what is being discussed has been implemented by the "placeholder"
package on PyPI
Here: https://pypi.org/project/placeholder/
It allows using things like `min(data, key=_[-1])` or `_.age < 18` (just using
language features instead of adding new syntax).
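I have no idea how the package actually implements it, but the general trick
needs nothing more than an object whose __getattr__/__getitem__/comparisons
build callables. A toy sketch:

from types import SimpleNamespace

class _Expr:
    def __init__(self, fn):
        self._fn = fn
    def __call__(self, obj):
        return self._fn(obj)
    def __lt__(self, other):
        return _Expr(lambda obj: self._fn(obj) < other)

class Placeholder:
    def __getattr__(self, name):
        return _Expr(lambda obj: getattr(obj, name))
    def __getitem__(self, key):
        return _Expr(lambda obj: obj[key])

_ = Placeholder()

data = [('a', 3), ('b', 1), ('c', 2)]
print(min(data, key=_[-1]))                       # ('b', 1)

is_minor = _.age < 18                             # a callable predicate
print(is_minor(SimpleNamespace(age=15)))          # True
print(is_minor(SimpleNamespace(age=30)))          # False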
___
Maybe I'm missing something, but why do you need the SimpleNamespace at all?
Why not make your own mapping, as in:

class StringMapper:
    ...
    def __getitem__(self, s):
        # Whatever arbitrary behavior you want
        # Process suffixes, etc. here, for example:
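Fleshed out, something like this (my own guess at the kind of suffix handling
meant, and using str.format_map only to show the mechanism):

class StringMapper:
    def __init__(self, data):
        self.data = data

    def __getitem__(self, key):
        # Arbitrary example: a "__upper" suffix means "apply str.upper"
        name, _sep, suffix = key.partition('__')
        value = self.data[name]
        return value.upper() if suffix == 'upper' else value

m = StringMapper({'spam': 'spam', 'eggs': 'eggs'})
print('{spam__upper} and {eggs}'.format_map(m))   # SPAM and eggs
print(m['spam__upper'])                           # SPAM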
Whenever you extend the definition of an operation (`__iter__` in this case) to
more existing objects, you lose a little bit of the ability to catch errors
early. Consider the function:
def traverse(something):
    for x in something:
        # do stuff
        ...

If you accidentally pass it the wrong kind of object, you no longer get an
immediate TypeError from the for loop; the mistake only shows up later, if at
all.
Hm maybe there is something worthwhile here. I implemented a Python version
with the same semantics as itertools.product, except only consuming the
iterators when absolutely necessary; still only consuming each iterator once,
but building its pools incrementally. See
https://gist.github.com/sw
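Roughly the shape of the idea (a simplified sketch, not the actual gist code):

import itertools

def lazy_product_sketch(*iterables):
    # Same output order as itertools.product, but each input is consumed
    # lazily and at most once; its values are pooled as they are first seen.
    if not iterables:
        yield ()
        return
    last_iter = iter(iterables[-1])
    last_pool = []                        # grows incrementally
    for front in lazy_product_sketch(*iterables[:-1]):
        for item in last_pool:            # replay already-seen items
            yield front + (item,)
        for item in last_iter:            # consume new items (first pass only)
            last_pool.append(item)
            yield front + (item,)

assert (list(lazy_product_sketch(iter('ab'), iter('xy')))
        == list(itertools.product('ab', 'xy')))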
I'm assuming things like this are what you're talking about:
def lazy_product2(*args, repeat=1):
    "recursive algorithm"
    args = args * repeat
    if not args:
        yield ()
        return
    for all_but_last in lazy_product2(*args[:-1]):
        # note: re-iterates args[-1], so the arguments need to be
        # re-iterable sequences rather than one-shot iterators
        for last in args[-1]:
            yield all_but_last + (last,)
Does this work?
>>> tuple[(int,) * 7 + (str,)]
tuple[int, int, int, int, int, int, int, str]
___
I've thought about this before too. But if I remember correctly, most
real-world path-finding algorithms can use a heap without an index to achieve
basically the same results: with this approach, the heap can now store more
than one copy of a node, but it doesn't need to put all of the nodes into the
heap up front; stale entries are simply skipped when they are popped.
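That is, the standard trick of pushing duplicate entries and skipping stale
ones as they come out. A sketch with heapq, taking the graph as a dict of
{node: [(neighbor, weight), ...]}:

import heapq

def dijkstra(graph, start):
    # No decrease-key and no index into the heap: push duplicates instead.
    dist = {start: 0}
    heap = [(0, start)]
    done = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:                  # stale duplicate entry; ignore it
            continue
        done.add(node)
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float('inf')):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))    # may add a duplicate
    return dist

graph = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}
print(dijkstra(graph, 'a'))               # {'a': 0, 'b': 1, 'c': 3}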
MRAB wrote:
> The assertions could still fail because there's nothing there to say
> that a0, b0 and c0 are strings, or, indeed, that there isn't a comma in
> one of them.
That's true. But that's a weakness of parsing any ambiguous pattern, even with
regular expressions. It would be up to the programmer to handle that ambiguity.
For some explanation, see this StackOverflow answer by Raymond Hettinger:
https://stackoverflow.com/a/16645307/11461120
___
What if the mapping assignment were more harmonious with the pattern matching
PEP? Something like this:
items = {'eggs': 2, 'cheese': 3, 'spam': 1}
{'eggs': eggs, 'spam': i_dont_need_to_name_this_spam, **rest} = items
assert i_dont_need_to_name_this_spam == 1
assert eggs == 2 and rest == {'cheese': 3}
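For comparison, the accepted pattern matching PEP already gives the mapping
half of this via the match statement (Python 3.10):

items = {'eggs': 2, 'cheese': 3, 'spam': 1}
match items:
    case {'eggs': eggs, 'spam': spam, **rest}:
        assert (eggs, spam, rest) == (2, 1, {'cheese': 3})
    case _:
        raise ValueError('items did not match')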
I think it would be easiest to reason about if an exception is always raised
when not everything is assigned to. Just like with other unpacking
assignments, either everything is assigned or there's an error. My apologies
if that wasn't clear from the examples.
I was definitely not proposing "spooky action at a distance". My proposal is to
existing f-strings as the `__setattr__` protocol is to the `__getattr__`
protocol. The template would only ever be hard-coded inline during the f-string
assignment. This is the same restriction that existing f-strings already have.
TL;DR: I propose the following behavior:
>>> s = "She turned me into a newt."
>>> f"She turned me into a {animal}." = s
>>> animal
'newt'
>>> f"A {animal}?" = s
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    f"A {animal}?" = s
ValueError: f-str
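To be explicit about the matching semantics I have in mind, here is roughly
what the assignment would do, written as an ordinary helper function
(hypothetical name and all) instead of new syntax:

import re

def unformat(template, s):
    # Each {name} field captures the text it covers; everything else in the
    # template has to match literally, or we raise.
    parts = re.split(r'\{(\w+)\}', template)    # literal, name, literal, ...
    pattern = ''.join(
        re.escape(part) if i % 2 == 0 else f'(?P<{part}>.+?)'
        for i, part in enumerate(parts)
    )
    m = re.fullmatch(pattern, s)
    if m is None:
        raise ValueError(f'template {template!r} does not match {s!r}')
    return m.groupdict()

s = "She turned me into a newt."
print(unformat("She turned me into a {animal}.", s))    # {'animal': 'newt'}
unformat("A {animal}?", s)                              # raises ValueError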