Change by Jeroen Demeyer :
--
nosy: +jdemeyer
nosy_count: 4.0 -> 5.0
pull_requests: +26438
pull_request: https://github.com/python/cpython/pull/12607
___
Python tracker
<https://bugs.python.org/issu
Jeroen Demeyer added the comment:
I need to check, but I think this is a duplicate of bpo-35983, which still has
PR 12607 open.
--
Jeroen Demeyer added the comment:
> I left some comments on the PR.
I don't see anything. Either I'm doing something wrong or GitHub is messed up.
--
Jeroen Demeyer added the comment:
> It looks like a bug triggered on purpose.
Absolutely. It's one of the many small issues that I found while working on PEP
590 and related things.
--
Jeroen Demeyer added the comment:
> As you say, we currently have only one usage of NotImplemented outside its
> intended purpose.
I know of at least 3 in CPython, so it's not so rare to use NotImplemented for
something other than binary operators:
1. __subclasshook__
2. reducer_o
Jeroen Demeyer added the comment:
For the record: making a public math.as_integer_ratio() function was rejected
in #37822.
--
Change by Jeroen Demeyer :
--
nosy: +jdemeyer
Python tracker
<https://bugs.python.org/issue37934>
Python-bugs-list mailing list
Unsubscribe:
Jeroen Demeyer added the comment:
> FWIW, the entire point of us having recently added as_integer_ratio() methods
> to so many concrete classes is to avoid the need for helper functions in
> favor of a simple try/except around a single call.
But what about PEP 3141? The fractions
Jeroen Demeyer added the comment:
May I propose PR 15327 as an alternative? It solves some of the same issues as the
PR on this issue, in particular supporting arbitrary objects with
as_integer_ratio(). It also improves performance somewhat for certain inputs,
but that's more by acc
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +15093
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/15383
New submission from Jeroen Demeyer :
The special method __length_hint__ can return NotImplemented. In this case, the
result is as if the __length_hint__ method didn't exist at all. This behaviour
is implemented and tested but not documented.
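The behaviour can be observed from Python via operator.length_hint(), which performs the same lookup (a minimal sketch; the Hinted class is invented for illustration):

```python
import operator

class Hinted:
    def __length_hint__(self):
        # NotImplemented means "no hint": the call behaves as if
        # __length_hint__ did not exist at all.
        return NotImplemented

# With no __len__ and no usable hint, length_hint() falls back to
# its default argument.
print(operator.length_hint(Hinted(), 10))  # -> 10
```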
--
assignee: docs@python
compo
Jeroen Demeyer added the comment:
> our needs tend to be much different from end-users
This issue is about fractions and statistics, which are closer to typical user
libraries than CPython libraries. In fact, both could easily be packages on
PyPI instead of part of the standard libr
Jeroen Demeyer added the comment:
> AFAICT, no end-user has ever requested this ever.
What do you mean by "this"?
(A) A public function like math.as_integer_ratio()
(B) Using as_integer_ratio() in the fractions.Fraction() constructor
(C) The optimization of the fract
Jeroen Demeyer added the comment:
> ISTM that small and probably unimportant optimizations shouldn't spill over
> into API feature creep.
The way I see it, the optimization is beside the point here. Regardless of
performance, the added function is a useful feature to have to av
Jeroen Demeyer added the comment:
> Sorry, but I do not understand why adding Fraction.as_integer_ratio()
> prevents adding math.as_integer_ratio().
I also support a public function for that. It seems that we're planning this
"as_integer_ratio" thing to become public A
Jeroen Demeyer added the comment:
> I'm afraid this can slow down the Fraction constructor.
No, it doesn't! It even speeds up the constructor in some cases:
./python -m perf timeit --duplicate 200 -s 'from fractions import Fraction; x =
1' 'Fraction(x)'
BEFORE: M
Jeroen Demeyer added the comment:
> There is a 14% regression in creating a Fraction from an integer
Isn't that the main use case? I suggest keeping the special case for 'int' as a
fast path to avoid this regression.
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
> See issue37884 which uses a C accelerator.
Note that that doesn't replace this issue, because I need to support
as_integer_ratio both in the *numerator* and *denominator*.
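A hypothetical helper (not code from the PR) makes the point concrete: both operands have to be reduced via as_integer_ratio(), whichever position they appear in:

```python
from fractions import Fraction

def ratio_div(num, den):
    # Hypothetical illustration: as_integer_ratio() is needed for the
    # value in *either* position, numerator or denominator.
    a, b = num.as_integer_ratio()
    c, d = den.as_integer_ratio()
    return Fraction(a * d, b * c)

print(repr(ratio_div(0.5, 0.25)))  # -> Fraction(2, 1)
```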
--
Change by Jeroen Demeyer :
--
pull_requests: +15046
pull_request: https://github.com/python/cpython/pull/15328
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +15045
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/15327
Jeroen Demeyer added the comment:
> I'm wary of "%S" used in error messages.
Maybe you're misunderstanding something. The goal is not really to change error
messages, only the way they are produced. For example, we currently have
>>> def f(): pass
>&g
Jeroen Demeyer added the comment:
> I'm wary of making error messages depend on the str representation of a
> function; that would prevent us from changing it later.
Why wouldn't we be able to change anything? Typically, the exact string of an
error message is NOT part
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
pull_requests: +15018
pull_request: https://github.com/python/cpython/pull/15295
Jeroen Demeyer added the comment:
> Maybe repr(func) should be left unchanged, but str(func) can be enhanced?
Yes, that is what I meant.
--
Jeroen Demeyer added the comment:
> My question is if it is safe to let developers "abuse" it. If these macros
> are misused, they can lead to a performance regression.
I expect people using these macros, and PR reviewers, to use good judgement
about when to use them. Ther
Jeroen Demeyer added the comment:
I claim that adding Py_LIKELY/Py_UNLIKELY will reduce the performance
randomness of non-PGO builds. So it would be strange to use that randomness as
an argument *against* this patch.
--
Jeroen Demeyer added the comment:
This is essentially a duplicate of #36048. The example is now deprecated:
>>> from decimal import Decimal
>>> from datetime import datetime
>>> datetime(Decimal("2000.5"), 1, 2)
<stdin>:1: DeprecationWarning: an integer i
Jeroen Demeyer added the comment:
We're not talking about prefetching here. The Py_LIKELY/Py_UNLIKELY macros only
affect which of the two code paths in a branch is the "default" one, i.e. the
one not involving a jmp.
--
Jeroen Demeyer added the comment:
> If it's an optimization, can you show a benchmark to validate that it's
> really faster as expected?
Yes, I did test it. I didn't save the results, but I can redo them if you want.
If you plan to reject the issue anyway, there is no p
Jeroen Demeyer added the comment:
Maybe an even better idea would be to partially inline PyLong_FromLong(). If
the check for small ints in PyLong_FromLong() were inlined, then the
compiler could optimize those checks. This would benefit all users of
PyLong_FromLong() without code
New submission from Jeroen Demeyer :
Currently, the fractions.Fraction constructor honours the .as_integer_ratio()
method only for the specific types "float" and "Decimal". It would be good to
support this for arbitrary classes.
This is part of what was proposed in #37822
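A sketch of the proposed duck typing (the MyNum class and to_fraction helper are illustrative only, not the actual patch):

```python
from fractions import Fraction

class MyNum:
    # Hypothetical type following the float/Decimal protocol.
    def as_integer_ratio(self):
        return (3, 4)

def to_fraction(x):
    # Duck-typed fallback: accept any object with as_integer_ratio(),
    # instead of special-casing float and Decimal.
    try:
        n, d = x.as_integer_ratio()
    except AttributeError:
        raise TypeError(f"cannot interpret {type(x).__name__!r} as a ratio")
    return Fraction(n, d)

print(to_fraction(MyNum()))  # -> 3/4
```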
Jeroen Demeyer added the comment:
> IHMO PGO compilation already defeats the purpose of these macros.
That's certainly true. The question is whether we want to optimize also non-PGO
builds.
--
Jeroen Demeyer added the comment:
Aren't you worried about using the non-special non-reserved attributes like
"as_integer_ratio"? That's the reason why I proposed a dunder name "__ratio__"
instead of "as_integer_ratio".
In my opinion, it was a mista
Jeroen Demeyer added the comment:
These functions are now officially deprecated, see PR 14804. So I think that
this issue can be closed.
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
Another idea I had is to somehow deal with this in PyErr_WriteUnraisable:
whenever PyErr_WriteUnraisable is called for a KeyboardInterrupt, defer that
exception to a later time, for example when _PyEval_EvalFrameDefault() is
called
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14881
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/15144
New submission from Jeroen Demeyer :
Take the LIKELY/UNLIKELY macros out of Objects/obmalloc.c (renaming them of
course). Use them in a few places to micro-optimize vectorcall.
--
components: Interpreter Core
messages: 349108
nosy: Mark.Shannon, inada.naoki, jdemeyer
priority: normal
Jeroen Demeyer added the comment:
I agree with rejecting and closing this issue.
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
> Should we add a note like "if you get a 'SystemError: bad call flags' on
> import, check the descriptor flags of your functions" in What's New in Python
> 3.8?
A better solution would be to change the error message. We
Jeroen Demeyer added the comment:
Please close
--
Python tracker
<https://bugs.python.org/issue37562>
Jeroen Demeyer added the comment:
See
https://discuss.python.org/t/pep-3141-ratio-instead-of-numerator-denominator/2037/24?u=jdemeyer
for a proposal to define a new dunder __ratio__ (instead of as_integer_ratio)
for this.
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
Another solution would be to change the __str__ of various function objects to
a prettier output. For example, we currently have
>>> def f(): pass
>>> print(f)
<function f at 0x...>
We could change this to
>>> def f(): pass
>>> print(f)
f
Jeroen Demeyer added the comment:
> If we want to support other numerical types with loss in double rounding, the
> most reliable way is to represent them as fractions (x.as_integer_ratio() or
> (x.numerator, x.denominator))
See
https://discuss.python.org/t/pep-3141-ratio-i
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14673
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14890
Jeroen Demeyer added the comment:
4. It uses __name__ instead of __qualname__.
--
Python tracker
<https://bugs.python.org/issue37645>
New submission from Jeroen Demeyer :
PyEval_GetFuncName is a bad API because
1. It hardcodes a limited number of function classes (which doesn't even
include all function classes in the core interpreter) instead of supporting
duck-typing.
2. In case of a "function" object,
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14652
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14863
Jeroen Demeyer added the comment:
I support the patch proposed in https://bugs.python.org/file48478/pyport.h.diff
but it's not up to me to make that decision.
--
Jeroen Demeyer added the comment:
> We have some reserved/deprecated/unused fields. Setting them to 0 makes
> the code forward-incompatible.
Good point. tp_print is removed in 3.9.
--
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14627
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14836
New submission from Jeroen Demeyer :
>>> class S(str):
...     __eq__ = int.__eq__
...
>>> S() == S()
True
The expectation is that this raises an exception because int.__eq__() is called
on S instances.
--
components: Interpreter Core
messages: 348108
nosy: jdemeye
Jeroen Demeyer added the comment:
One possible solution would be to have a macro to suppress the tp_print field
in the first place. Something like
#ifndef PY_NO_TP_PRINT
/* bpo-37250: kept for backwards compatibility in CPython 3.8 only */
Py_DEPRECATED(3.8) int (*tp_print)(PyObject
Change by Jeroen Demeyer :
--
pull_requests: +14600
pull_request: https://github.com/python/cpython/pull/14804
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14589
stage: needs patch -> patch review
pull_request: https://github.com/python/cpython/pull/14795
Jeroen Demeyer added the comment:
PR 14782 (backport of PR 13781) fixes the regression for me.
--
Change by Jeroen Demeyer :
--
versions: -Python 3.9
Python tracker
<https://bugs.python.org/issue37562>
Change by Jeroen Demeyer :
--
pull_requests: +14578
pull_request: https://github.com/python/cpython/pull/14782
Change by Jeroen Demeyer :
--
pull_requests: +14579
pull_request: https://github.com/python/cpython/pull/14782
Jeroen Demeyer added the comment:
I did some benchmarks WITHOUT PGO (simply because it's much faster to compile
and therefore easier to test things out).
The command I used for testing is
./python -m perf timeit --duplicate 200 -s 'f = len; x
Jeroen Demeyer added the comment:
See also
https://github.com/python/cpython/pull/14193#pullrequestreview-251630953
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
I will certainly have a look and try a few things, but it will probably be next
week.
--
Jeroen Demeyer added the comment:
Could you please specify:
- which commits are you comparing exactly? From your explanation, I guess
aacc77fbd and its parent, but that's not completely fair since PEP 590 consists
of many commits (see #36974). A better comparison would be master ag
Jeroen Demeyer added the comment:
I understand the arguments for not removing these functions. However, I still
think that we should deprecate them but without planning in advance when they
should be removed. Victor said that we should document these functions as
"please don'
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14492
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14685
New submission from Jeroen Demeyer :
We already have
_PyObject_CallNoArg()
_PyObject_CallOneArg()
_PyObject_CallMethodNoArgs()
so it makes sense to also add
_PyObject_CallMethodOneArg()
--
components: Interpreter Core
messages: 347619
nosy: inada.naoki, jdemeyer, vstinner
priority
Change by Jeroen Demeyer :
--
pull_requests: +14490
pull_request: https://github.com/python/cpython/pull/14684
Change by Jeroen Demeyer :
--
pull_requests: +14488
pull_request: https://github.com/python/cpython/pull/14683
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14487
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14682
New submission from Jeroen Demeyer :
Keyword names in calls are expected to be strings, however it's currently not
clear who should enforce/check this.
I suggest fixing this for vectorcall/METH_FASTCALL and specifying that it's the
caller's job to make sure that keyword names a
Jeroen Demeyer added the comment:
> but it will make d1.update(**d2) slower with a complexity of O(n): d2 must be
> converted to 2 lists
This part is still true and it causes a slow-down of about 23% for
dict.update(**d), see benchmarks at
https://github.com/python/cpython/pull
Jeroen Demeyer added the comment:
> d2 must be converted to 2 lists (kwnames and args) and then a new dict should
> be created.
The last part is not necessarily true. You could do the update directly,
without having that intermediat
Change by Jeroen Demeyer :
--
pull_requests: +14418
pull_request: https://github.com/python/cpython/pull/14603
Jeroen Demeyer added the comment:
Any objections to closing this?
--
Python tracker
<https://bugs.python.org/issue36974>
Jeroen Demeyer added the comment:
One thing that keeps bothering me when using vectorcall for type.__call__ is
that we would have two completely independent code paths for constructing an
object: the new one using vectorcall and the old one using tp_call, which in
turn calls tp_new and
Jeroen Demeyer added the comment:
> Jeroen: hum, you both proposed a similar fix :-)
It seems that I lost the race ;-)
But seriously: if we both independently came up with the same solution, that's
a good sign that the solution is
Change by Jeroen Demeyer :
--
pull_requests: +14415
pull_request: https://github.com/python/cpython/pull/14600
Change by Jeroen Demeyer :
--
pull_requests: +14407
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14589
Change by Jeroen Demeyer :
--
pull_requests: +14406
pull_request: https://github.com/python/cpython/pull/14588
Jeroen Demeyer added the comment:
You are correct that PyDict_Merge() does not need to recompute the hashes of
the keys. However, your example doesn't work because you need string keys for
**kwargs. The "str" class caches its hash, so you would need a dict with a
"str"
Jeroen Demeyer added the comment:
> How can we avoid unpacking dict in case of d1.update(**d2)?
The unpacking is only a problem if you insist on using PyDict_Merge(). It would
be perfectly possible to implement dict merging from a tuple+vector instead of
from a dict. In that case, th
Jeroen Demeyer added the comment:
Above, I meant #37207 or PR 13930.
--
Python tracker
<https://bugs.python.org/issue29312>
Jeroen Demeyer added the comment:
> How can we avoid unpacking dict in case of d1.update(**d2)?
We cannot. However, how common is that call? One could argue that we should
optimize for the more common case of d1.update(d2).
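For context (a quick sketch, not part of any patch here), the two call forms differ in more than speed, since keyword unpacking also restricts the keys to strings:

```python
d1 = {"a": 1}
d2 = {"b": 2}

d1.update(d2)      # the common case: direct merge, d2 is not unpacked
assert d1 == {"a": 1, "b": 2}

# d1.update(**d2) goes through keyword-argument unpacking instead,
# which only works for string keys:
try:
    {}.update(**{1: "x"})  # non-string key -> TypeError
except TypeError:
    print("non-string keys cannot be unpacked")
```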
--
Change by Jeroen Demeyer :
--
pull_requests: +14405
pull_request: https://github.com/python/cpython/pull/11636
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Jeroen Demeyer added the comment:
For the benefit of PR 37207, I would like to re-open this discussion. It may
have been rejected for the wrong reasons. Victor's patch was quite inefficient,
but that's to be expected: msg285744 mentions a two-step process, but during
the disc
Change by Jeroen Demeyer :
--
pull_requests: +14394
pull_request: https://github.com/python/cpython/pull/14575
Jeroen Demeyer added the comment:
Test of stack usage:

from _testcapi import stack_pointer

class D(dict):
    def __missing__(self, key):
        sp = stack_pointer()
        print(f"stack usage = {TOP - sp}")
        return None

d = D()
TOP = stack_pointer()
d[0]
**before**: s
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14393
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14575
New submission from Jeroen Demeyer :
Try to use _PyObject_CallNoArg in all places where a function is called without
arguments.
--
components: Interpreter Core
messages: 347230
nosy: jdemeyer
priority: normal
severity: normal
status: open
title: Use _PyObject_CallNoArg() in a few more
Jeroen Demeyer added the comment:
Victor, what's your opinion on adding PyObject_CallOneArg() to the limited API?
--
Jeroen Demeyer added the comment:
Stefan: I used an underscore by analogy with
PyObject_CallNoArgs()/_PyObject_CallNoArg(), where the first is in the limited
API and the second is an inline function in the cpython API.
But maybe we could revisit that decision
Jeroen Demeyer added the comment:
> Is there any benchmark showing if it's faster
Here is one example:

class D(dict):
    def __missing__(self, key):
        return None

d = D()

and now benchmark d[0]
**before**: Mean +- std dev: 173 ns +- 1 ns
**after**: Mean +- std dev: 162 ns
Jeroen Demeyer added the comment:
> It adds yet another special case underscore function that one cannot use in
> external projects. So I would not say that is simpler.
If you're worried about the underscore, I will make a separate PR to add a
non-underscored version,
Jeroen Demeyer added the comment:
Exactly. I see no reason to prefer PyObject_CallFunctionObjArgs(func, arg,
NULL) over _PyObject_CallOneArg(func, arg)
--
Jeroen Demeyer added the comment:
> PEP 7 uses C99 since Python 3.6:
> https://www.python.org/dev/peps/pep-0007/#c-dialect
That's not what the PEP says: "Python versions greater than or equal to 3.6 use
C89 with several select C99 features"
"several select C99 fe
Jeroen Demeyer added the comment:
> _PyObject_CALL_WITH_ARGS(func, PyDict_GetItem(d, key)); // PyDict_GetItem(d,
> key) is called twice.
Actually, it's not a problem: sizeof() is special, it only looks at the type of
its argument, it doesn't