[Python-Dev] Re: Virtual machine bleeds into generator implementation?

2020-04-26 Thread Guido van Rossum
It looks like it is using this in place of a separate, explicit bit that says
"we left this frame by yielding rather than by returning". Note that
f_stacktop is cleared when the frame is re-entered:

stack_pointer = f->f_stacktop;
assert(stack_pointer != NULL);
f->f_stacktop = NULL;   /* remains NULL unless yield suspends frame */
f->f_executing = 1;
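
If that bit were made explicit instead of being encoded in whether
f_stacktop is NULL, it might look roughly like this (purely a sketch:
nothing like this exists in CPython today, and frame_state/f_state are
invented names):

/* Hypothetical: record how the frame was last left in an explicit
 * field instead of reusing f_stacktop == NULL as the signal. */
typedef enum {
    FRAME_EXECUTING,   /* currently running in the eval loop */
    FRAME_SUSPENDED,   /* left via YIELD_VALUE */
    FRAME_RETURNED     /* left via RETURN_VALUE or an unhandled exception */
} frame_state;

/* YIELD_VALUE would set    f->f_state = FRAME_SUSPENDED;
 * frame exit would set     f->f_state = FRAME_RETURNED;
 * and gen_send_ex() would treat the generator as exhausted when
 * f->f_state == FRAME_RETURNED. */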

On Sun, Apr 26, 2020 at 7:57 PM Skip Montanaro wrote:

> This is more an observation and question than anything else, but
> perhaps it will stimulate some ideas from the experts. Consider this
> trivial generator function:
>
> def gen(a):
>     yield a
>
> When the YIELD_VALUE instruction is executed, the eval loop runs this code
> (in the non-async case):
>
> retval = POP();
> f->f_stacktop = stack_pointer;
> goto exiting;
>
> This is fine as far as it goes. However, execution eventually leads to
> Objects/genobject.c where we hit this code (I think after falling off
> the YIELD_VALUE instruction, but perhaps virtual machine execution
> reaches RETURN_VALUE):
>
> /* If the generator just returned (as opposed to yielding), signal
>  * that the generator is exhausted. */
> if (result && f->f_stacktop == NULL) {
>
> There are several other references to f->f_stacktop in genobject.c.
> I've not yet investigated all of them.
>
> As I'm working on a register-based virtual machine implementation, I
> don't fiddle with the stack at all, so it's a bit problematic that the
> generator implementation is so intimate with the stack. As this is an
> area of the core which is completely new to me, I wonder if someone
> can suggest alternate ways of achieving the same effect without
> relying on the state of the stack. It seems to me that from within
> PyEval_EvalFrameDefault the implementations of relevant instructions
> could reference the generator object via f->f_gen and call some (new?)
> private API on generators which toggles the relevant bit of state in
> the generator.
>
> I think it's worse than this, though, as it seems that in gen_send_ex()
> it actually pushes a value onto the stack. That can't be solved by
> simply adding a state attribute to the generator object struct.
>
> Skip


-- 
--Guido van Rossum (python.org/~guido)
Pronouns: he/him (why is my pronoun here?)



[Python-Dev] Virtual machine bleeds into generator implementation?

2020-04-26 Thread Skip Montanaro
This is more an observation and question than anything else, but
perhaps it will stimulate some ideas from the experts. Consider this
trivial generator function:

def gen(a):
    yield a

When the YIELD_VALUE instruction is executed, the eval loop runs this code
(in the non-async case):

retval = POP();
f->f_stacktop = stack_pointer;
goto exiting;

This is fine as far as it goes. However, execution eventually leads to
Objects/genobject.c where we hit this code (I think after falling off
the YIELD_VALUE instruction, but perhaps virtual machine execution
reaches RETURN_VALUE):

/* If the generator just returned (as opposed to yielding), signal
 * that the generator is exhausted. */
if (result && f->f_stacktop == NULL) {

There are several other references to f->f_stacktop in genobject.c.
I've not yet investigated all of them.

As I'm working on a register-based virtual machine implementation, I
don't fiddle with the stack at all, so it's a bit problematic that the
generator implementation is so intimate with the stack. As this is an
area of the core which is completely new to me, I wonder if someone
can suggest alternate ways of achieving the same effect without
relying on the state of the stack. It seems to me that from within
PyEval_EvalFrameDefault the implementations of relevant instructions
could reference the generator object via f->f_gen and call some (new?)
private API on generators which toggles the relevant bit of state in
the generator.
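
Concretely, the kind of thing I have in mind (hypothetical names only;
none of these functions exist today) would be something like:

#include <Python.h>   /* for PyGenObject */

/* Hypothetical private API; the eval loop would call these instead of
 * poking f->f_stacktop directly. */
void _PyGen_MarkSuspended(PyGenObject *gen);   /* from YIELD_VALUE */
void _PyGen_MarkExhausted(PyGenObject *gen);   /* when the frame exits */
int  _PyGen_IsExhausted(PyGenObject *gen);     /* queried by gen_send_ex() */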

I think it's worse than this, though, as it seems that in gen_send_ex()
it actually pushes a value onto the stack. That can't be solved by
simply adding a state attribute to the generator object struct.
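
One way around that (again just a sketch; f_sendval is an invented slot,
and regs/dst are placeholders for whatever the register file ends up
looking like) would be to park the value being sent in a dedicated field
on the frame and have the resuming instruction move it into the yield
expression's destination register, rather than having gen_send_ex() push
it onto the value stack:

/* In gen_send_ex(), instead of  *(f->f_stacktop++) = result;  park the
 * sent value in a hypothetical slot on the frame (slot owns the ref): */
f->f_sendval = result;

/* The instruction that resumes after a yield in the register VM then
 * moves the sent value into its destination register and clears the slot: */
regs[dst] = f->f_sendval;
f->f_sendval = NULL;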

Skip


[Python-Dev] Re: PEP 554 for 3.9 or 3.10?

2020-04-26 Thread Carl Shapiro
On Tue, Apr 21, 2020 at 7:33 AM Victor Stinner  wrote:
> IMHO it's worth it to explore the subinterpreters approach. I don't
> think that it's going to be a major maintenance burden: the code is
> already written and tested. The PEP is only about adding a public
> Python API.

While this PEP may not create a maintenance burden for CPython, it
does have the effect of raising the complexity bar for an alternative
Python implementation.

> Even today, asyncio didn't replace threads, multiprocessing,
> concurrent.futures, etc. There are even competitor projects like
> Twisted, trio and curio! (Also eventlet and gevent based on greenlet
> which is a different approach). Only very recently have I started to see
> projects like httpx which support both blocking and asynchronous APIs.

Could it be that, because asyncio was implemented as a library, its
uptake was lower than it might have been because the demand was already
fulfilled, at least in part, by those third-party libraries?

A thought that may have already been mentioned elsewhere: perhaps the
PEP could be made more acceptable by de-scoping it to expose a minimal
set of C-API hooks that would let third-party libraries provide the
sub-interpreter feature, rather than providing that feature in the
standard library?
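
For what it's worth, something close to the raw primitive already exists
at the C level: Py_NewInterpreter() and Py_EndInterpreter() have been
around for a long time, and are roughly what such a third-party library
would wrap. A minimal embedding sketch, with error handling mostly
elided:

#include <Python.h>

int main(void)
{
    Py_Initialize();
    PyThreadState *main_ts = PyThreadState_Get();

    /* Create a sub-interpreter; it becomes the current thread state
     * and has its own module namespace. */
    PyThreadState *sub_ts = Py_NewInterpreter();
    if (sub_ts == NULL) {
        fprintf(stderr, "could not create sub-interpreter\n");
        return 1;
    }

    /* Runs in the sub-interpreter, not the main one. */
    PyRun_SimpleString("import sys; print(len(sys.modules), 'modules here')");

    /* Tear the sub-interpreter down and restore the main thread state. */
    Py_EndInterpreter(sub_ts);
    PyThreadState_Swap(main_ts);

    Py_FinalizeEx();
    return 0;
}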