[Antoine]
I have made two last-minute changes to the PEP:
- addition of the FRAME opcode, as discussed with Tim, and keeping a
fixed 8-byte frame size
Cool!
- addition of the MEMOIZE opcode, courtesy of Alexandre, which replaces
PUT opcodes in protocol 4 and helps shrink the size of
[Alexandre Vassalotti]
Looking at the different options available to us:
1A. Mandatory framing
(+) Allows the internal buffering layer of the Unpickler to rely
on the presence of framing to simplify its implementation.
(-) Forces all implementations of pickle to include
[Christian Heimes]
the buildbots are flaky because two repr() tests for userdict and
functools.partial fail every now and then. The test cases depend on a
fixed order of keyword arguments in the representation of userdict and
partial instances. The improved hash randomization of PEP 456 shows its
[guido]
http://hg.python.org/cpython/rev/6bee0fdcba39
changeset: 87468:6bee0fdcba39
user: Guido van Rossum gu...@python.org
date: Sat Nov 23 15:09:16 2013 -0800
summary:
asyncio: Change bounded semaphore into a subclass, like
threading.[Bounded]Semaphore.
files:
[Brett]
On 2008-12-03, Python 3.0.0 was released by Barry.
Dang - nobody ever tells me anything. Congratulations! It's about
time 3.0.0 was released ;-)
...
Thanks to those in the community who stuck by the dev team and had faith
we knew what we were doing and have continued to help
[Barry]
...
I don't think the API *has* to change in a backward incompatible way either.
The methods could be given **kws with a bit of hackery to figure out whether
the old API was being used (keys: int, default, maxwidth) or the new API was
being used (keys: _int and _maxwidth). Yeah it's
[Daniel Holth]
But who could forget nzjrs' wasp UAV software line 107, using
int=float?
https://github.com/nzjrs/wasp/blob/master/sw/groundstation/wasp/__init__.py#L107
I could forget it ;-) The remarkable thing about the two instances of:
random.randrange(0.0, 1.0, int=float)
in that
[Dan Stromberg]
I keep hearing naysayers naysaying about Python 3.x.
Here's a 9 question, multiple choice survey I put together about
Python 2.x use vs Python 3.x use.
I'd be very pleased if you could take 5 or 10 minutes to fill it out.
If you run Python 3 while filling out the survey,
[Benjamin Peterson]
...
This is the first time I ever installed a version of Python which
caused something called MSIEXEC.EXE
msiexec.exe is not part of the Python download. msiexec.exe is part
of the Windows operating system, and is precisely the program that
installs .msi files (which the
[Bob Hanson]
...
Didn't think this likely, but I have now quintuple-checked
everything again. Everything says I have the real McCoy
msiexec.exe in its proper location -- just upgraded another app
which used MSI installers and it went as per normal.
That sounds most likely to me too ;-)
[Bob Hanson]
Forgive me, but I'm an old man with very poor vision. Using my
magnifying glass, I see it is two very long URLs ending with
something like after the blah-blah: ... akametechnology.com
More precisely, these two IP addresses:
23.59.190.113:80
23.59.190.106:80
So:
[Bob Hanson]
... magnifying glass, I see it is two very long URLs ending with
something like after the blah-blah: ... akametechnology.com
[Stephen J. Turnbull]
I suppose you tried cutting and pasting? Note that you don't need to
be exact as long as you're pretty sure you got the whole thing
The behavior of None in comparisons is intentional in Python 3. You
can agitate to change it, but it will probably die when Guido gets
wind of it ;-)
The catch-all mixed-type comparison rules in Pythons 1 and 2 were only
intended to be arbitrary but consistent. Of course each specific
release
[M.-A. Lemburg]
...
None worked as compares less than all other objects simply due
to the fact that None is a singleton and doesn't implement the
comparison slots (which for all objects not implementing rich
comparisons, meant that the fallback code triggered in Python 2).
And the fallback
[Tim]
Guido wanted to drop all the arbitrary but consistent mixed-type
comparison crud for Python 3.
[Greg Ewing]
Nobody is asking for a return to the arbitrary-but-
[in]consistent mess of Python 2, only to bring
back *one* special case, i.e. None comparing less
than everything else.
Of
[Raymond Hettinger]
I'm hoping that core developers don't get caught up in the "doctests are bad"
meme.
Instead, we should be clear about their primary purpose which is to test
the examples given in docstrings.
I disagree.
In many cases, there is a great deal of benefit to docstrings that
[Mark Dickinson]
It occurs to me that any doctests that depend on the precise form of
repr(x) are, in a sense, already broken, since 2.x makes no guarantees
about repr(x) being consistent across platforms.
The doctest documentation has warned about this forever (look near the
end of the
[Mark Dickinson]
By the way, here's an example of an *almost* real-life use of million digit
calculations.
For an elementary number theory course that I taught a while ago, there
was an associated (optional) computer lab, where the students used
Python to investigate various ideas,
[Mark Dickinson]
I'd like to request that Stefan Krah be granted commit privileges to the
Python
svn repository, for the sole purpose of working on a (yet to be created)
py3k-decimal-in-c branch.
+1. I haven't commented on any of this, but I've watched it, and
Stefan appears easy enough to
[Aahz a...@pythoncraft.com]
[I'm nomail -- Cc me if you care whether I see followups]
https://github.com/BonzaiThePenguin/WikiSort/tree/master
WikiSort is a stable bottom-up in-place merge sort based on the work
described in Ratio based stable in-place merging, by Pok-Son Kim and
There's been a bit of serious study on this. The results are still
open to interpretation, though ;-) Here's a nice summary:
http://whathecode.wordpress.com/2011/02/10/camelcase-vs-underscores-scientific-showdown/
of-course-dashes-are-most-natural-ly y'rs - tim
On Thu, Apr 24, 2014 at 11:25
[Tim]
There's been a bit of serious study on this. The results are still
open to interpretation, though ;-) Here's a nice summary:
http://whathecode.wordpress.com/2011/02/10/camelcase-vs-underscores-scientific-showdown/
[Terry Reedy]
The linked poll is almost evenly split, 52% to 48% for
[Raymond Hettinger]
...
I'm not at all comfortable with the wording of the second sentence.
I was the author of the SystemRandom() class and I only want
to guarantee that it provides access to the operating system's
source of random numbers. It is a bold claim to guarantee that
it is
[Ben Hoyt]
I was emailing someone today about implementing something (for PEP
471, as it happens) and wanted to link to the Zen of Python [1] and
note a particular clause (in this case "If the implementation is hard
to explain, it's a bad idea."). However, there are no clause numbers,
so you
[nha pham phq...@gmail.com]
Statement_1: With an array of size N or less than N, we need at most log2(N)
comparisons to find a value
(or a position, in case the search misses), using the binary search algorithm.
proof: This statement is trivial, and I believe someone out there has already
proved it.
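The claim is easy to check empirically; here is a small sketch (the function name and counting are mine, not from the thread):

```python
import math

def bsearch(a, x):
    """Binary search over sorted list `a`; returns (insert_position,
    number_of_element_comparisons)."""
    lo, hi, compares = 0, len(a), 0
    while lo < hi:
        mid = (lo + hi) // 2
        compares += 1
        if a[mid] < x:
            lo = mid + 1
        else:
            hi = mid
    return lo, compares

a = list(range(100))
for x in (-1, 0, 37, 99, 100):
    pos, c = bsearch(a, x)
    # at most floor(log2(N)) + 1 comparisons for an array of size N
    assert c <= math.floor(math.log2(len(a))) + 1
```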
[nha pham phq...@gmail.com]
Thank you very much. I am very happy that I got a reply from Tim Peters.
My pleasure to speak with you too :-)
You are correct, my mistake.
The python code should be:
for i in range(low+1, high):  # because we already added data[low]
x =
[Tim]
1. Merge 2 at a time instead of just 1. That is, first sort the
next 2 elements to be merged (1 compare and a possible swap). Then
binary search to find where the smaller belongs, and a shorter binary
search to find where the larger belongs. Then shift both into place.
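A rough sketch of that pair-at-a-time idea (my code, not Tim's; it uses bisect for the binary searches, and Python-level list inserts where the real thing would memmove):

```python
from bisect import bisect_right, insort

def binary_insertion_sort_pairs(a):
    """Sort by inserting two elements at a time: one compare orders the
    pair, a full binary search places the larger, and a shorter search
    over the prefix places the smaller."""
    out = []
    i = 0
    while i + 1 < len(a):
        x, y = a[i], a[i + 1]
        if y < x:
            x, y = y, x                     # one compare, possible swap
        j = bisect_right(out, y)            # where the larger belongs
        out.insert(j, y)
        k = bisect_right(out, x, 0, j)      # shorter search for the smaller
        out.insert(k, x)
        i += 2
    if i < len(a):                          # odd leftover element
        insort(out, a[i])
    return out
```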
[Armin]
Good
[delightful new insight elided, all summarized by what remains ;-) ]
[Tim]
What somedatetime+timedelta really does is simpler than that: it
adds the number of microseconds represented by the timedelta to
somedatetime,
[Lennart]
No it doesn't.
Lennart, I wrote the code. Both the Python
[Ronald Oussoren]
I totally agree with that, having worked on applications
that had to deal with time a lot and including some where the
end of a day was at 4am the following day. That app never
had to deal with DST because not only are the transitions at
night, they are also during the
[Lennart Regebro rege...@gmail.com]
Of course, I meant datetime objects.
In everything else, I stand by my original claim. If you want naive
datetime objects, you should use naive datetime objects.
That's tautological (if you want X, you should use X). I'm not sure
what you intended to say.
[Lennart Regebro]
I have yet to see a use case for that.
[Tim]
Of course you have. When you address them, you usually dismiss them
as calendar operations (IIRC).
[Lennart]
Those are not usecases for this broken behaviour.
I agree there is a usecase for where you want to add one day to
[Tim]
timedelta objects only store days, seconds, and microseconds,
[Lennart Regebro rege...@gmail.com]
Except that they don't actually store days. They store 24 hour
periods,
Not really. A timedelta is truly an integer number of microseconds,
and that's all. The internal division into
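That normalization is easy to see interactively; this is standard datetime behavior:

```python
from datetime import timedelta

d = timedelta(days=1)
# days/seconds/microseconds are just a normalized display of one quantity:
assert d == timedelta(hours=24) == timedelta(seconds=86400)
total_us = (d.days * 86400 + d.seconds) * 10**6 + d.microseconds
assert total_us == 86_400_000_000
```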
[Łukasz Rekucki lreku...@gmail.com]
Maybe instead of trying to decide who is wrong and which approach is
broken, Python just needs a more clear separation between timezone
aware objects and naive ones?
[Lennart Regebro rege...@gmail.com]
Well, the separation is pretty clear already.
I
[Ronald Oussoren ronaldousso...@mac.com]
IMHO “+ 1 days” and “+ 24 hours” are two different things.
Date arithmetic is full of messy things like that.
But it's a fact that they _are_ the same in naive time, which Python's
datetime single-timezone arithmetic implements:
- A minute is exactly 60
The days attribute here is indeed confusing as it doesn't mean 1 day,
it means 24 hours.
Which, in naive arithmetic, are exactly the same thing.
[Terry Reedy]
I think using the word 'naive' is both inaccurate and a mistake. The issue
is civil or legal time versus STEM time, where the
[Chris Barker]
...
and in fact, everything Tim said can also apply to UTC time. We've had a lot
of discussion on the numpy list about the difference between UTC and naive
times, but for practical purposes, they are exactly the same -- until you
try to convert to a known time zone anyway.
[Tim]
But it's a fact that they _are_ the same in naive time, which Python's
datetime single-timezone arithmetic implements:
- A minute is exactly 60 seconds.
...
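In code, naive (single-zone) arithmetic means exactly that:

```python
from datetime import datetime, timedelta

dt = datetime(2015, 3, 7, 12, 0)  # naive: no tzinfo attached
# In naive arithmetic these identities hold unconditionally:
assert dt + timedelta(days=1) == dt + timedelta(hours=24)
assert dt + timedelta(minutes=1) == dt + timedelta(seconds=60)
```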
[Chris Angelico ros...@gmail.com]
No leap second support, presumably. Also feature?
Absolutely none, and absolutely a feature,
[Terry Reedy tjre...@udel.edu]
To me, having 1 day be 23 or 25 hours of elapsed time on the DST transition
days, as in Paul's alarm example, hardly ignores the transition point.
It's 2:56PM. What time will it be 24 hours from now? If your answer
is "not enough information to say", but it will
[Paul Moore]
...
I think the following statements are true. If they aren't, I'd
appreciate clarification. I'm going to completely ignore leap seconds
in the following - I hope that's OK, I don't understand leap seconds
*at all* and I don't work in any application areas where they are
[Tim]
Python didn't implement timezone-aware arithmetic at all within a
single time zone. Read what I wrote just above. It implements naive
arithmetic within a single time zone.
[Jon Ribbens jon+python-...@unequivocal.co.uk]
This usage of time zone is confusing.
Ha! _All_ usages of time
[Brett Cannon br...@python.org]
Alexander and Tim, you okay with moving this conversation to a datetime-sig
if we got one created?
Fine by me!
[Tres Seaver tsea...@palladion.com]
Naive alarm clocks (those which don't know from timezones) break human
expectations twice a year, because their users have to be awake to fix
them (or make the clock itself out-of-whack with real civil time for the
hours between fixing and the actual
[Ronald Oussoren]
Treating time as UTC with conversions at the application edge might
be cleaner in some sense, but can make code harder to read for
application domain experts.
It might be nice to have time zone aware datetime objects with the
right(TM) semantics, but those can and should
[Mark Lawrence breamore...@yahoo.co.uk]
To me a day is precisely 24 hours, no more, no less. I have no interest in
messing about with daylight savings of 30 minutes, one hour, two hours or
any other variant that I've not heard about.
In my mission critical code, which I use to predict my
[Paul Moore]
[Tim]
Guido will never allow any aspect of leap seconds into the core,
[Chris Barker chris.bar...@noaa.gov]
really? that is a shame (and odd) -- it's tricky, because we don't know
what leap seconds will be needed in the future, but other than that, it's
not really any
[Paul Moore p.f.mo...@gmail.com]
I think the current naive semantics are useful and should not be
discarded lightly. At an absolute minimum, there should be a clear,
documented way to get the current semantics under any changed
implementation.
Realistically, default arithmetic behavior can't
[Tim]
The Python docs also are quite clear about that all arithmetic within
a single timezone is naive. That was intentional. The _intended_
way to do aware arithmetic was always to convert to UTC, do the
arithmetic, then convert back.
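A sketch of that intended pattern. ToyDST is a hypothetical zone of my own (UTC-5 with a crude April-through-October DST rule), just to make the shift visible; real code would use a tz database:

```python
from datetime import datetime, timedelta, timezone, tzinfo

class ToyDST(tzinfo):
    """Hypothetical zone: UTC-5 standard, UTC-4 (DST) April..October.
    A crude stand-in for a real zone, for illustration only."""
    def utcoffset(self, dt):
        return timedelta(hours=-5) + self.dst(dt)
    def dst(self, dt):
        return timedelta(hours=1) if 4 <= dt.month <= 10 else timedelta(0)
    def tzname(self, dt):
        return "TOY"

tz = ToyDST()
start = datetime(2015, 3, 31, 12, 0, tzinfo=tz)   # still standard time

# Naive single-zone arithmetic: the wall clock moves exactly 24 hours.
naive_next = start + timedelta(days=1)
assert naive_next.hour == 12

# The intended aware arithmetic: convert to UTC, add, convert back.
aware_next = (start.astimezone(timezone.utc)
              + timedelta(days=1)).astimezone(tz)
assert aware_next.hour == 13   # our toy DST started, so the clock shifted
```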
[Lennart]
We can't explicitly implement incorrect
[Paul Moore p.f.mo...@gmail.com]
As an example, consider an alarm clock. I want it to go off at 7am
each morning. I'd feel completely justified in writing
tomorrows_alarm = todays_alarm + timedelta(days=1).
[Lennart Regebro rege...@gmail.com]
That's a calendar operation made with a
[ISAAC J SCHWABACHER ischwabac...@wisc.edu]
...
I think the right perspective is that a time zone *is* the function that its
`fromutc()` method implements,
[Tim]
Fine by me ;-)
[Isaac]
My issue is that you're computing `fromutc()`, which is a function, in
terms of `dst()` and
[Tim]
However, the _body_ of the PEP said nothing whatsoever about altering
arithmetic. The body of the PEP sounds like it's mainly just
proposing to fold the pytz package into the core. Perhaps doing
_just_ that much would get this project unstuck? Hope springs eternal :-)
[Lennart Regebro rege...@gmail.com]
And I would want to remind everyone again that this is not a question
of the problem being impossible. It's just really complex to get right
in all cases, and that always having the UTC timestamp around gets rid
of most of that complexity.
[Tim]
Could you
[ISAAC J SCHWABACHER ischwabac...@wisc.edu]
...
I disagree with the view Tim had of time zones when he wrote that comment
(and that code). It sounds like he views US/Eastern and US/Central as time
zones (which they are), but thinks of the various America/Indiana zones as
switching back and
[Tim]
Sure. But, honestly, who cares? Riyadh Solar Time was so
off-the-wall that even the Saudis gave up on it 25 years ago (after a
miserable 3-year experiment with it). Practicality beats purity.
Heh. It's even sillier than that - the Saudis never used Riyadh
Solar Time, and it's been
[Lennart Regebro rege...@gmail.com]
And I would want to remind everyone again that this is not a question
of the problem being impossible. It's just really complex to get right
in all cases, and that always having the UTC timestamp around gets rid
of most of that complexity.
Could you please
[Tim]
The formulas only produced approximations, and then
rounded to 5-second boundaries because the tz data format didn't have
enough bits.
[ISAAC J SCHWABACHER ischwabac...@wisc.edu]
Little known fact: if you have a sub-minute-resolution UTC offset when a
leap second hits, it rips open a
[Steven D'Aprano]
>> ...
>> I think it is fair to say that out of the three functions, there is
>> consensus that randbelow has the most useful functionality in a crypto
>> context. Otherwise, people seem roughly equally split between the three
>> functions. There doesn't seem to be any use-case
[Chris Angelico ]
> What I'd like to hear (but maybe this won't be possible) would be
> "less-than is transitive if and only if <X>", where <X> might be
> something like "all of the datetimes are in the same timezone" or
> "none of the datetimes fall within a fold" or something. That
[Random832 ]
> ...
>
> Also, can someone explain why this:
> >>> ET = pytz.timezone("America/New_York")
> >>> datetime.strftime(datetime.now(ET) + timedelta(days=90),
> ... "%Y%m%d %H%M%S %Z %z")
> returns '20151210 214526 EDT -0400'
pytz lives in its own
[Nick Coghlan ]
> ...
> Sorry, what I wrote in the code wasn't what I wrote in the text, but I
> didn't notice until Guido pointed out the discrepancy. To get the
> right universal invariant, I should have normalised the LHS, not the
> RHS:
>
>
[Guido]
>>> I think we should change this in the PEP, except I can't find where
>>> the PEP says == should raise an exception in this case.
[Tim]
>> It doesn't - the only comparison behavior changed by the PEP is in
>> case of interzone comparison when at least one comparand is a "problem
>>
[Tim]
>>> ...
>>> The
>>> top-level operation on the RHS is datetime.fromtimestamp(). However,
>>> it didn't pass a tzinfo, so it creates a naive datetime. Assuming dt
>>> was aware to begin with, the attempt to compare will always (gap or
>>> not) raise an exception.
[Tim]
>> Oops! In current
[Tim]
>> Sure - no complaint. I was just saying that in the specific,
>> complicated, contrived expression Nick presented, that it always
>> returns False (no matter which aware datetime he starts with) would be
>> more of a head-scratcher than if it raised a "can't compare naive and
>> aware
[Guido]
>> it is broken, due to the confusion about classic vs. timeline arithmetic
>> -- these have different needs but there's only one > operator.
[Alex]
> I feel silly trying to defend a design against its author. :-)
"Design" may be an overstatement in this specific case ;-)
I remember
[Nick Coghlan]
>>> Based on the UTC/local diagram from the "Mind the Gap" section, am I
>>> correct in thinking that the modified invariant that also covers times
>>> in a gap is:
>>>
>>> dt ==
>>> datetime.fromtimestamp(dt.astimezone(utc).astimezone(dt.tzinfo).timestamp())
>>>
>>> That is,
[Tim]
> ...
> The
> top-level operation on the RHS is datetime.fromtimestamp(). However,
> it didn't pass a tzinfo, so it creates a naive datetime. Assuming dt
> was aware to begin with, the attempt to compare will always (gap or
> not) raise an exception.
Oops! In current Python, comparing
[Alexander Belopolsky]
>> ...
>> but I would really like to see a change in the repr of negative
>> timedeltas:
>>
>> >>> timedelta(minutes=-1)
>> datetime.timedelta(-1, 86340)
>>
>> And str() is not much better:
>>
>> >>> print(timedelta(minutes=-1))
>> -1 day, 23:59:00
>>
>> The above does not
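The normalized form being discussed, for reference: only `days` may be negative, while `seconds` and `microseconds` are always non-negative.

```python
from datetime import timedelta

t = timedelta(minutes=-1)
# The value is fine; only the display is surprising:
assert (t.days, t.seconds, t.microseconds) == (-1, 86340, 0)
assert str(t) == "-1 day, 23:59:00"
assert t == -timedelta(minutes=1)
```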
[Tim]
>> But I wouldn't change repr() - the internal representation is fully
>> documented, and it's appropriate for repr() to reflect documented
>> internals as directly as possible.
[Alex]
> Note that in the case of float repr, the consideration of user convenience
> did win over "reflect
[David Mertz]
> OK. My understanding is that Guido ruled out introducing an os.getrandom()
> API in 3.5.2. But would you be happy if that interface is added to 3.6?
>
> It feels to me like the correct spelling in 3.6 should probably be
> secrets.getrandom() or something related to that.
[Random832]
> So, I have a question. If this "weakness" in /dev/urandom is so
> unimportant to 99% of situations... why isn't there a flag that can be
> passed to getrandom() to allow the same behavior?
Isn't that precisely the purpose of the GRND_NONBLOCK flag?
[Tim]
>> secrets.token_bytes() is already the way to spell "get a string of
>> messed-up bytes", and that's the dead obvious (according to me) place
>> to add the potentially blocking implementation.
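For reference, the secrets spelling looks like this (Python 3.6+):

```python
import secrets

token = secrets.token_bytes(16)       # 16 bytes from the OS CSPRNG
assert isinstance(token, bytes) and len(token) == 16
hex_token = secrets.token_hex(16)     # same idea, hex-encoded
assert len(hex_token) == 32
```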
[Sebastian Krause]
> I honestly didn't think that this was the dead obvious function to
> use. To
[Nikolaus Rath]
>> Aeh, what the tin says is "return random bytes".
[Larry Hastings]
> What the tin says is "urandom", which has local man pages that dictate
> exactly how it behaves. On Linux the "urandom" man page says:
>
> A read from the /dev/urandom device will not block waiting for
[Guido]
> ...
> An alternative would be to keep the secrets module linked to SystemRandom,
> and improve the latter. Its link with os.random() is AFAIK undocumented. Its
> API is clumsy but for code that needs some form of secret-ish bytes and
> requires platform and Python version independence it
[Sebastian Krause]
> ...
> Ideally I would only want to use the random module for
> non-secure and (in 3.6) the secrets module (which could block) for
> secure random data and never bother with os.urandom (and knowing how
> it behaves). But then those modules should probably get new
> functions to
[Greg Ewing ]
> The Mersenne Twister is no longer regarded as quite state-of-the art
> because it can get into states that produce long sequences that are
> not very random.
>
> There is a variation on MT called WELL that has better properties
> in this regard. Does
[Brett Cannon]
>> And if we didn't keep its count accurately it would eventually hit
>> zero and constantly have its dealloc function checked for.
[Armin Rigo]
> I think the idea is really consistency. If we wanted to avoid all
> "Py_INCREF(Py_None);", it would be possible: we
[Facundo Batista ]
> I'm seeing that our code increases the reference counting to Py_None,
> and I find this a little strange: isn't Py_None eternal and will never
> die?
Yes, but it's immortal in CPython because its reference count never
falls to 0 (it's created with a
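The effect is visible from pure Python; the exact number varies by version (recent CPythons report a fixed sentinel value for immortal objects):

```python
import sys

# None is referenced all over the interpreter, so its count is huge;
# CPython never lets it fall to zero.
refs = sys.getrefcount(None)
assert refs > 2
```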
[Brett Cannon ]
> Can someone disable this person's subscription?
Done.
> On Mon, 25 Apr 2016 at 14:15 Kenny via Python-Dev
> wrote:
>>
>>
>> fopen Terminal.app.python.
>> 3.5.0.()
>>
>> def fopen Termina.app.python.3.5.0.()
>>
>>
[Tim Golden, on Kenny the "thingy" guy]
> Not subscribed; probably via gmane.
They were subscribed, but I already did the unsub.
> I've added him to a hold list via spam filter. See if that works.
So now we're doubly safe ;-)
You may be interested in this seemingly related bug report:
http://bugs.python.org/issue26601
[Neil Schemenauer ]
> I was running Python 2.4.11 under strace and I noticed some odd
> looking system calls:
>
> mmap(NULL, 262144, PROT_READ|PROT_WRITE, MAP_PRIVATE|MAP_ANONYMOUS,
[Guido]
> After a fruitful discussion on python-ideas I've decided that it's fine to
> break lines *before* a binary operator. It looks better and Knuth recommends
> it.
> ...
> Therefore it is permissible to break before or
> after a binary operator, as long as the convention is consistent
>
[redirected from python-dev, to python-ideas;
please send followups only to python-ideas]
[Elliot Gorokhovsky ]
> ...
> TL;DR: Should I spend time making list.sort() detect if it is sorting
> lexicographically and, if so, use a much faster algorithm?
It will be fun to
[Terry Reedy <tjre...@udel.edu>]
>> Tim Peters investigated and empirically determined that an
>> O(n*n) binary insort, as he optimized it on real machines, is faster
>> than O(n*logn) sorting for up to around 64 items.
[Nikolaus Rath <nikol...@rath.org>]
> Out of
[Guido]
> Wouldn't attempting to reuse DUMMY entries be expensive? You'd have to
> search forward in the array. Just keeping a count of DUMMY entries and
> compacting when there are too many seems better somehow.
I haven't looked at the code, but presumably one of the members of a
DUMMY key/value
[please restrict follow-ups to python-ideas]
Let's not get hung up on meta-discussion here - I always thought "massive
clusterf**k" was a precise technical term anyway ;-)
While timing certainly needs to be done more carefully, it's obvious to me
that this approach _should_ pay off significantly
[Terry Reedy]
>
> This seems like a generic issue with timing mutation methods
> ...
> But I am sure Tim worked this out in his test code, which should be
> reused, perhaps updated with Viktor's perf module to get the most
> stable timings possible.
sortperf.py is older than me ;-) It's not at
[Giampaolo Rodola' ]
>
> To be entirely honest, I'm not even sure why they need to be forcefully
> declared upfront in the first place, instead of just having a first-class
> function (builtin?) written in C:
>
> >>> ntuple(x=1, y=0)
> (x=1, y=0)
>
> ...or even a literal
[Max Moroz ]
> What would be the disadvantage of implementing collections.deque as a
> circular array (rather than a doubly linked list of blocks)? ...
You answered it yourself ;-)
> ...
> Of course when the circular array is full, it will need to be reallocated,
> but the
[INADA Naoki ]
> ...
> Since the current pool size is 4KB and there is a pool_header in each pool,
> we can't allocate a 4KB block from a pool.
> And if we support 1KB blocks, only 3KB of the 4KB can actually be used.
> I think 512 bytes / 4KB (1/8) is good ratio.
>
> Do you mean increase pool
[Tim]
>> While I would like to increase the pool size, it's fraught with
>> danger.
[Antoine Pitrou ]
> What would be the point of increasing the pool size? Apart from being
> able to allocate 4KB objects out of it, I mean.
>
> Since 4KB+ objects are relatively uncommon (I
[Larry Hastings ]
> ...
> Yet CPython's memory consumption continues to grow. By the time a current
> "trunk" build of CPython reaches the REPL prompt it's already allocated 16
> arenas.
I'd be surprised if that's true ;-) The first time `new_arena()` is
called, it allocates
For fun, let's multiply everything by 256:
- A "pool" becomes 1 MB.
- An "arena" becomes 64 MB.
As briefly suggested before, then for any given size class a pool
could pass out hundreds of times more objects before needing to fall
back on the slower code creating new pools or new arenas.
As an
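The arithmetic behind that multiplier, using the sizes discussed in the thread (4 KB pools, 256 KB arenas):

```python
POOL = 4 * 1024            # pool size discussed in the thread
ARENA = 256 * 1024         # arena size discussed in the thread
assert POOL * 256 == 1024 * 1024            # a pool becomes 1 MB
assert ARENA * 256 == 64 * 1024 * 1024      # an arena becomes 64 MB
assert ARENA // POOL == 64                  # pools per arena is unchanged
```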
[Tim]
>> ... That is, it's up to the bit vector implementation
>> to intelligently represent what's almost always going to be a
>> relatively tiny slice of a theoretically massive address space.
[Antoine]
> True. That works if the operating system doesn't go too wild in
> address space
[Tim]
>> A virtual address space span of a terabyte could hold 1M pools, so
>> would "only" need a 1M/8 = 128KB bit vector. That's minor compared to
>> a terabyte (one bit per megabyte).
[Antoine]
> The virtual address space currently supported by x86-64 is 48 bits
> wide (spanning an address
[Larry]
> ...
> Oh! I thought it also allocated the arenas themselves, in a loop. I
> thought I saw that somewhere. Happy to be proved wrong...
There is a loop in `new_arena()`, but it doesn't do what a casual
glance may assume it's doing ;-) It's actually looping over the
newly-allocated
[Richard Hinerfeld ]
> Compiling Python-3.6.3 on Linux fails two tests: test_math and test_cmath
Precisely which version of Linux? The same failure has already been
reported on OpenBSD here:
https://bugs.python.org/issue31630
[Tim]
>> In that case, it's because Python
>> _does_ mutate the objects' refcount members under the covers, and so
>> the OS ends up making fresh copies of the memory anyway.
[Greg Ewing ]
> Has anyone ever considered addressing that by moving the
> refcounts out of
[Neil Schemenauer ]
> Python objects that participate in cyclic GC (things like lists, dicts,
> sets but not strings, ints and floats) have extra memory overhead. I
> think it is possible to mostly eliminate this overhead. Also, while
> the GC is running, this GC state is
[Eric Snow ]
> Does that include preserving order after deletion?
Given that we're blessing current behavior:
- At any moment, iteration order is from oldest to newest. So, "yes"
to your question.
- While iteration starts with the oldest, .popitem() returns the
[Peter Ludemann]
> Does it matter whether the dict order after pop/delete is explicitly
> specified, or just specified that it's deterministic?
Any behavior whatsoever becomes essential after it becomes known ;-)
For example, dicts as currently ordered easily support LRU (least
recently used)
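A minimal sketch of that LRU use of insertion order (the helper name is mine):

```python
def lru_touch(cache, key):
    """Move key to the 'newest' end by re-inserting it; this is the
    LRU bookkeeping insertion-ordered dicts make easy."""
    cache[key] = cache.pop(key)

cache = {"a": 1, "b": 2, "c": 3}
lru_touch(cache, "a")            # "a" becomes most recently used
assert list(cache) == ["b", "c", "a"]
oldest = next(iter(cache))       # iteration starts with the oldest
assert oldest == "b"
cache.popitem()                  # removes the newest ("a")
assert list(cache) == ["b", "c"]
```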
[Chris Angelico]
> With current semantics, you can easily prove that a list comp is
> implemented with a function by looking at how it interacts with other
> scopes (mainly class scope), but Tim's proposal may change that.
Absolutely not. I haven't considered for a