Bug Report
https://github.com/mdn/content/issues/8036
___
Python-ideas mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived
Matsuoka Takuo :
>
> Now, is "1,2," more boxed up than "*(1,2)," is? The *current* rule
> surely says the former is a tuple at some places and the latter
> is not,
Actually, this was wrong. First of all,
>>> *(1,2),
(1, 2)
Moreover, while the Language Reference says
return_stmt ::= "return"
Hello,
before posting to python-dev I thought it best to discuss this
here first. And I assume that someone else has had the same idea before me.
Maybe you can point me to the relevant discussion/ticket.
I read about Intel's hybrid CPUs. It means there are multiple cores, e.g.
8 high-speed c
The error mentioned in the title is this.
>>> *()
File "", line 1
SyntaxError: can't use starred expression here
According to the Language Reference
https://docs.python.org/3/reference/expressions.html#expression-lists
it's not really a starred expression. In the context of defining the
notio
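A short sketch of the contrast being discussed: a bare starred expression is rejected on its own, but the same unpacking is legal inside a tuple display or a call (names below are illustrative).

```python
# A bare starred expression is not a standalone expression:
#   >>> *()
#   SyntaxError: can't use starred expression here
# It is legal only inside contexts such as a tuple display or a call.
t = *(), 1        # fine: unpacking the empty tuple inside a tuple display
assert t == (1,)

def f(*args):
    return args

assert f(*()) == ()   # fine: unpacking in a function call
```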
Dear Guido van Rossum,
Thank you for bringing the PEPs to my attention.
The idea of PEP 637 on a[*x] is different from my idea. The PEP's idea
appears to be making subscription analogous to function calls. In the end,
a[*x] would have been equivalent to
a[tuple(x)]
if the PEP had been adopted. a[*
So,
It is out of the scope of Python's multiprocessing, and, as I perceive it, of
the stdlib as a whole, to be able to allocate specific cores for each
subprocess -
that is automatically done by the O.S. (and of course, the O.S. having an
interface
for it, one can write a specific Python library which wo
Dear Joao,
Am 18.08.2021 14:36 schrieb Joao S. O. Bueno:
As it stands, however, you simply have to change your approach:
instead of dividing your workload into different cores before starting, the
common approach is to set up worker processes, one per core or
per processor thread,
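The worker-process approach being suggested can be sketched with the stdlib pool; the `process_row` function and the `rows` data here are placeholders for whatever the real per-item work is.

```python
# Minimal sketch of the worker-pool approach: one process per core, with
# the pool distributing items among the workers. The OS schedules the
# worker processes onto cores; the program does not pin them itself.
import multiprocessing

def process_row(row):
    # Stand-in for the real per-row computation.
    return sum(row)

if __name__ == "__main__":
    rows = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
    with multiprocessing.Pool(processes=multiprocessing.cpu_count()) as pool:
        results = pool.map(process_row, rows)
    print(results)  # [6, 15, 24]
```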
On Wed, Aug 18, 2021 at 10:37 PM Joao S. O. Bueno wrote:
>
> So,
> It is out of the scope of Python's multiprocessing, and, as I perceive it, of
> the stdlib as a whole, to be able to allocate specific cores for each
> subprocess -
> that is automatically done by the O.S. (and of course, the O.S. havin
On 18.08.2021 15:58, Chris Angelico wrote:
> On Wed, Aug 18, 2021 at 10:37 PM Joao S. O. Bueno
> wrote:
>>
>> So,
>> It is out of the scope of Python's multiprocessing, and, as I perceive it, of
>> the stdlib as a whole, to be able to allocate specific cores for each
>> subprocess -
>> that is automat
On Thu, Aug 19, 2021 at 12:52 AM Marc-Andre Lemburg wrote:
>
> On 18.08.2021 15:58, Chris Angelico wrote:
> > On Wed, Aug 18, 2021 at 10:37 PM Joao S. O. Bueno
> > wrote:
> >>
> >> So,
> >> It is out of the scope of Python's multiprocessing, and, as I perceive it, of
> >> the stdlib as a whole, to be
> On 18 Aug 2021, at 16:03, Chris Angelico wrote:
>
> On Thu, Aug 19, 2021 at 12:52 AM Marc-Andre Lemburg wrote:
>>
>>> On 18.08.2021 15:58, Chris Angelico wrote:
>>> On Wed, Aug 18, 2021 at 10:37 PM Joao S. O. Bueno
>>> wrote:
So,
It is out of the scope of Python's multiprocessin
The worker pool approach is probably the way to go, but there is a fair bit
of overhead to creating a multiprocessing job. So fewer, larger jobs are
faster than many small jobs.
So you do want to make the jobs as large as you can without wasting CPU
time.
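The batching trade-off described above maps onto the pool's `chunksize` parameter; the `work` function and the sizes here are illustrative.

```python
# Sketch of "fewer, larger jobs": Pool.map's chunksize batches many small
# items into one inter-process message, amortizing the per-task overhead.
import multiprocessing

def work(item):
    # Cheap per-item work; dispatch overhead dominates if sent one at a time.
    return item * item

if __name__ == "__main__":
    items = range(100_000)
    with multiprocessing.Pool() as pool:
        # chunksize=1 would ship one item per round trip; a larger chunksize
        # sends batches, so each worker message carries more useful work.
        results = pool.map(work, items, chunksize=2_000)
    assert results[10] == 100
```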
-CHB
On Wed, Aug 18, 2021 at 9:09 AM Bar
Let's imagine I have an algorithm which depends on a context variable.
I write an algorithm (elided below for space) which depends on it. Then I
realize that I can improve the performance of my algorithm by using
concurrent.futures. But my algorithm will change its behaviour because it does
not
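The behaviour change being described appears to be that executor workers do not see context variables set by the caller; a minimal sketch of the symptom and one workaround, with illustrative names, might look like this:

```python
# Worker threads do not inherit the submitter's context, so the algorithm
# sees the ContextVar's default instead of the value set by the caller.
# Running the callable inside a copied context restores the behaviour.
import contextvars
from concurrent.futures import ThreadPoolExecutor

precision = contextvars.ContextVar("precision", default=2)

def algorithm(x):
    return round(x, precision.get())

precision.set(4)
with ThreadPoolExecutor() as ex:
    # Worker thread has its own (empty) context: default precision applies.
    naive = ex.submit(algorithm, 3.14159).result()
    # Copy the caller's context and run the algorithm inside it.
    ctx = contextvars.copy_context()
    fixed = ex.submit(ctx.run, algorithm, 3.14159).result()

assert naive == 3.14
assert fixed == 3.1416
```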
Jack DeVries wrote:
> Hi All!
> We are trying to replace a link in the official docs which is now
> broken, but used to link to this article:
> https://web.archive.org/web/20210613191914/https://developer.mozilla.org/en-...
> Can you offer a suggestion for a replacement? The bad link has already
>
On 18Aug2021 16:30, Paul Prescod wrote:
>Let's imagine I have an algorithm which depends on a context variable.
>
>I write an algorithm (elided below for space) which depends on it. Then I
>realize that I can improve the performance of my algorithm by using
>concurrent.futures. But my algorithm
Christopher Barker writes:
> The worker pool approach is probably the way to go, but there is a fair bit
> of overhead to creating a multiprocessing job. So fewer, larger jobs are
> faster than many small jobs.
True, but processing those rows would have to be awfully fast for the
increase in o
Would a work-stealing approach work better for you here? Then the only
signalling overhead would be when a core runs out of work.
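A rough approximation of that pull-based idea is available in the stdlib: with `chunksize=1`, each worker fetches its next task only when it finishes the current one, so uneven task durations balance out without up-front partitioning (the `work` function is illustrative).

```python
# Pull-based dispatch sketch: workers take the next item on demand rather
# than receiving a fixed slice of the workload up front.
import multiprocessing

def work(n):
    return n * n

if __name__ == "__main__":
    with multiprocessing.Pool() as pool:
        results = sorted(pool.imap_unordered(work, range(8), chunksize=1))
    assert results == [n * n for n in range(8)]
```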
On Thu, 19 Aug 2021, 05:36 Stephen J. Turnbull, <
[email protected]> wrote:
> Christopher Barker writes:
>
> > The worker pool approach is probably t