[Python-Dev] Re: Draft PEP: Remove wstr from Unicode

2020-06-22 Thread Inada Naoki
On Tue, Jun 23, 2020 at 6:58 AM Victor Stinner  wrote:
>
> Hi INADA-san,
>
> First of all, thanks for writing down a PEP!
>
> On Thu, Jun 18, 2020 at 11:42, Inada Naoki  wrote:
> > To support legacy Unicode objects created by
> > ``PyUnicode_FromUnicode(NULL, length)``, many Unicode APIs have a
> > ``PyUnicode_READY()`` check.
>
> I don't see PyUnicode_READY() removal in the specification section.
> When can we remove these calls and the function itself?
>

The legacy Unicode representation uses wstr, so legacy Unicode support
is removed together with wstr.
PyUnicode_READY() will become a no-op once wstr is removed, and calls
to PyUnicode_READY() can be removed from that point on.

I think we can deprecate PyUnicode_READY() when wstr is removed.
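
For context, a minimal sketch of the kind of check that many C API functions
currently perform (count_spaces is only an illustrative helper, not an
existing API):

    #include <Python.h>

    static Py_ssize_t
    count_spaces(PyObject *str)
    {
        /* A legacy (wstr-based) object must be converted to the compact
           PEP 393 representation before its contents can be read. */
        if (PyUnicode_READY(str) < 0) {
            return -1;  /* conversion failed, exception set */
        }
        Py_ssize_t count = 0;
        Py_ssize_t len = PyUnicode_GET_LENGTH(str);
        for (Py_ssize_t i = 0; i < len; i++) {
            if (PyUnicode_READ_CHAR(str, i) == ' ') {
                count++;
            }
        }
        return count;
    }

Once the legacy representation is gone, the PyUnicode_READY() call above does
nothing and can eventually be dropped.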

>
> > Support for legacy Unicode objects makes the Unicode implementation complex.
> > Until we drop legacy Unicode objects, it is very hard to try other Unicode
> > implementations, such as the UTF-8 based implementation in PyPy.
>
> I'm not sure if it should be in the scope of the PEP or not, but there
> are also other C API functions which are too close to the PEP 393
> concrete implementation. For example, I'm not sure that
> PyUnicode_MAX_CHAR_VALUE(str) would be relevant/efficient if Python
> str is reimplemented to use UTF-8 internally. Should we deprecate it
> as well? Do you think that it should be addressed in a separated PEP?
>

I don't like optimizations that rely heavily on CPython implementation
details, but I think it is too early to deprecate it.
We should just recommend a UTF-8 based approach for now.


> In fact, a large part of the Unicode C API is based on the current
> implementation of the Python str type. For example, I'm not sure that
> PyUnicode_New(size, max_char) would still make sense if we change the
> code to store strings as UTF-8 internally.
>
> In an ideal world, I would prefer to have a "string builder" API, like
> the current _PyUnicodeWriter C API, to create a string, and only never
> allow to modify a string in-place.

I completely agree with you.  But the current _PyUnicodeWriter is tightly
coupled with PEP 393 and it is not UTF-8 based.  I am not sure that
we should make it public and stable starting in Python 3.10.

For now, I think we should recommend `PyUnicode_FromStringAndSize(utf8, utf8_len)`
to avoid coupling too tightly with PEP 393.
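
A minimal sketch of that recommendation, assuming the extension can produce
its text as a UTF-8 buffer first (make_greeting is only illustrative):

    #include <Python.h>
    #include <stdio.h>

    static PyObject *
    make_greeting(const char *name)
    {
        char buf[256];
        /* Build the text as plain UTF-8 in C, without touching the
           PEP 393 internals. */
        int n = snprintf(buf, sizeof(buf), "Hello, %s!", name);
        if (n < 0 || (size_t)n >= sizeof(buf)) {
            PyErr_SetString(PyExc_ValueError, "name is too long");
            return NULL;
        }
        /* The buffer is decoded from UTF-8 in a single call. */
        return PyUnicode_FromStringAndSize(buf, n);
    }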

Regards,

-- 
Inada Naoki  
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/3MT3ZHA66PW7K7OLZERTDLFQEDFPYHQI/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: PEP 620: Hide implementation details from the C API

2020-06-22 Thread Neil Schemenauer
Hi Victor,

Thanks for putting work into this.  I support the idea of slowly
evolving the C API.  It must be done carefully so as to not
unnecessarily break 3rd party extensions.  Changes must be made for
well founded reasons and not just because we think it makes a
"cleaner" API.  I believe you are following those principles.

One aspect of the API that could be improved is memory management
for PyObjects.  The current API is quite a mess and for no good
reason except legacy, IMHO.  The original API design allowed
extension types to use their own memory allocator.  E.g. they could
call their own malloc()/free() implementation and the rest of the
CPython runtime would handle that.  One consequence is that
Py_DECREF() cannot call PyObject_Free() but instead has to call
tp_dealloc().  There was supposed to be multiple layers of
allocators, PyMem vs PyObject, but since the layering was not
enforced, we ended up with a bunch of aliases to the same underlying
function.

Perhaps there are a few cases when the flexibility to use a custom
object allocator is useful.  I think in practice it is very rare
that an extension needs to manage memory itself.  To achieve
something similar, a PyObject could hold a reference to some
externally managed resource, and the tp_del method would take
care of freeing it.  IMHO, the Python runtime should be in charge of
allocating and freeing PyObject memory.
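
A minimal sketch of that idea with a heap type, where external_handle and
release_handle are placeholders for whatever C library the extension wraps,
and the resource is released from the deallocation hook (tp_dealloc here;
tp_finalize/tp_del could play a similar role):

    #include <Python.h>

    typedef struct external_handle external_handle;   /* placeholder */
    extern void release_handle(external_handle *h);    /* placeholder */

    typedef struct {
        PyObject_HEAD
        external_handle *handle;   /* owned by this object */
    } WrapperObject;

    static void
    Wrapper_dealloc(PyObject *op)
    {
        WrapperObject *self = (WrapperObject *)op;
        if (self->handle != NULL) {
            release_handle(self->handle);   /* free the external resource */
        }
        /* The runtime frees the PyObject memory itself. */
        PyTypeObject *tp = Py_TYPE(op);
        tp->tp_free(op);
        Py_DECREF(tp);  /* heap type instances hold a reference to their type */
    }

    static PyType_Slot Wrapper_slots[] = {
        {Py_tp_dealloc, Wrapper_dealloc},
        {0, NULL},
    };

    static PyType_Spec Wrapper_spec = {
        .name = "example.Wrapper",
        .basicsize = sizeof(WrapperObject),
        .flags = Py_TPFLAGS_DEFAULT,
        .slots = Wrapper_slots,
    };

Creating the type with PyType_FromSpec(&Wrapper_spec) also avoids a statically
allocated PyTypeObject, which ties in with the next point.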

I believe fixing this issue is not tricky, just tedious.  The
biggest hurdle might be dealing with statically allocated objects.
IMHO, they should go away and there should only be heap allocated
PyObjects (created and freed by calling CPython API functions).
That change would affect most extensions, unfortunately.

Another place for improvement is that the C API is unnecessarily
large.  E.g. we don't really need PyList_GetItem(),
PyTuple_GetItem(), and PyObject_GetItem().  Every extra API is a
potential leak of implementation details and a burden for
alternative VMs.  Maybe we should introduce something like
WIN32_LEAN_AND_MEAN that hides all the extra stuff.  The
Py_LIMITED_API define doesn't really mean the same thing since it
tries to give ABI compatibility.  It would make sense to cooperate
with the HPy project on deciding what parts are unnecessary.  Things
like Cython might still want to use the larger API, to extract every
bit of performance.  The vast majority of C extensions don't require
that.
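
As an illustration of the first point, a minimal sketch of fetching an item
through the generic protocol rather than the type-specific calls (get_item is
only illustrative):

    #include <Python.h>

    /* Works for lists, tuples, dicts and anything else that supports
       subscription; no knowledge of the concrete type is needed. */
    static PyObject *
    get_item(PyObject *obj, Py_ssize_t i)
    {
        PyObject *key = PyLong_FromSsize_t(i);
        if (key == NULL) {
            return NULL;
        }
        PyObject *item = PyObject_GetItem(obj, key);   /* new reference */
        Py_DECREF(key);
        return item;
    }

For sequences, PySequence_GetItem(obj, i) offers the same genericity without
building an index object.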

One final comment: I think even if we manage to clean up the API and
make it friendly for other Python implementations, there is going to
be a fair amount of overhead.  If you look at other "managed
runtimes" that just seems unavoidable (e.g. Java, CLR, V8, etc).
You want to design the API so that you maximize the amount of useful
work done with each API call.  Using something like
PyList_GET_ITEM() to iterate over a list is not a good pattern.  So
keep in mind that an extension API is going to have some overhead.
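
A minimal sketch of that spirit: hand the whole job to the runtime in one
call instead of making one PyList_GET_ITEM() call per element (sum_items is
only illustrative and simply delegates to the builtin sum()):

    #include <Python.h>

    static PyObject *
    sum_items(PyObject *iterable)
    {
        PyObject *builtins = PyImport_ImportModule("builtins");
        if (builtins == NULL) {
            return NULL;
        }
        /* One API call; the per-item iteration happens inside the runtime. */
        PyObject *result = PyObject_CallMethod(builtins, "sum", "O", iterable);
        Py_DECREF(builtins);
        return result;
    }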


Regards,

  Neil
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/EH5DXCR4QTFLOVJTQWSJ6QBK6HS7Y65U/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: Draft PEP: Remove wstr from Unicode

2020-06-22 Thread Victor Stinner
Hi INADA-san,

First of all, thanks for writing down a PEP!

On Thu, Jun 18, 2020 at 11:42, Inada Naoki  wrote:
> To support legacy Unicode objects created by
> ``PyUnicode_FromUnicode(NULL, length)``, many Unicode APIs have a
> ``PyUnicode_READY()`` check.

I don't see PyUnicode_READY() removal in the specification section.
When can we remove these calls and the function itself?


> Support for legacy Unicode objects makes the Unicode implementation complex.
> Until we drop legacy Unicode objects, it is very hard to try other Unicode
> implementations, such as the UTF-8 based implementation in PyPy.

I'm not sure if it should be in the scope of the PEP or not, but there
are also other C API functions which are too close to the PEP 393
concrete implementation. For example, I'm not sure that
PyUnicode_MAX_CHAR_VALUE(str) would be relevant/efficient if Python
str is reimplemented to use UTF-8 internally. Should we deprecate it
as well? Do you think that it should be addressed in a separated PEP?

In fact, a large part of the Unicode C API is based on the current
implementation of the Python str type. For example, I'm not sure that
PyUnicode_New(size, max_char) would still make sense if we change the
code to store strings as UTF-8 internally.

In an ideal world, I would prefer to have a "string builder" API, like
the current _PyUnicodeWriter C API, to create a string, and only never
allow to modify a string in-place.

CPython "almost" immutable str "if reference count is equal to 1" has
corner cases and can be misused. But again, I don't think that it
should be part of this PEP :-) Sorry for being off-topic ;-)

> Specification
> =============
>
> Affected APIs
> -------------
>
> From the Unicode implementation, ``wstr`` and ``wstr_length`` members are
> removed.
>
> Macros and functions to be removed:
>
> * PyUnicode_GET_SIZE
> * PyUnicode_GET_DATA_SIZE
> * Py_UNICODE_WSTR_LENGTH
> * PyUnicode_AS_UNICODE
> * PyUnicode_AS_DATA
> * PyUnicode_AsUnicode
> * PyUnicode_AsUnicodeAndSize

Which ones are already deprecated?

> Behaviors to be removed:
>
> * PyUnicode_FromUnicode -- ``PyUnicode_FromUnicode(NULL, size)`` where
>   ``size > 0`` causes RuntimeError instead of creating a legacy Unicode
>   object. While this API is deprecated by PEP 393, it will be kept
>   when ``wstr`` is removed and will be removed later.

I'm not sure that it's worth keeping PyUnicode_FromUnicode()
when PyUnicode_FromWideChar() already has a clean API (it uses wchar_t*,
not Py_UNICODE*). I also suggest disallowing PyUnicode_FromUnicode(NULL,
0) as well.
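
A minimal sketch of the wchar_t-based replacement (from_wide is only
illustrative):

    #include <Python.h>

    static PyObject *
    from_wide(const wchar_t *text)
    {
        /* PyUnicode_FromWideChar() takes a wchar_t* buffer; passing -1 as
           the size makes it compute the length with wcslen() itself. */
        return PyUnicode_FromWideChar(text, -1);
    }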

By the way, when can we finally remove the Py_UNICODE type?

I would prefer to remove Py_UNICODE and PyUnicode_FromUnicode().


> * PyUnicode_FromStringAndSize -- Like PyUnicode_FromUnicode,
>   ``PyUnicode_FromStringAndSize(NULL, size)`` causes RuntimeError
>   instead of creating a legacy Unicode object.



> All APIs to be changed should raise DeprecationWarning for the behavior to be
> removed. Note that ``PyUnicode_FromUnicode`` has both a compiler deprecation
> warning and a runtime DeprecationWarning. [3]_, [4]_.

Every function scheduled for removal? Even PyUnicode_GET_SIZE()? I'm
not sure that C extensions are prepared for PyUnicode_GET_SIZE()
raising an exception when using -Werror.


> All deprecations will be implemented in Python 3.10.
> Some deprecations will be backported in Python 3.9.
>
> Actual removal will happen in Python 3.12.

Many functions have already been declared with Py_DEPRECATED() for a long
time. Would it make sense to remove those functions earlier?


Victor
-- 
Night gathers, and now my watch begins. It shall not end until my death.
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/M7JFI5TWLM7KOYVSBFFTPQS5HHO4DF2M/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Eryk Sun
On 6/22/20, Steve Dower  wrote:
>
> What is likely happening here is that _sqlite3.pyd is being imported
> before _mapscript, and so there is already a SQLITE3 module in memory.
> Like Python, Windows will not attempt to import a second module with the
> same name, but will return the original one.

Qualified DLL loads won't interfere with each other, but dependent
DLLs are loaded by base name only. In these cases a SxS assembly
allows loading multiple DLLs that have the same base name. If the
assembly is referenced by a DLL, embed the manifest in the DLL as
resource 2. For example:

>>> import ctypes
>>> test1 = ctypes.CDLL('./test1')
>>> test2 = ctypes.CDLL('./test2')
>>> test1.call_spam.restype = None
>>> test2.call_spam.restype = None

>>> test1.call_spam()
spam v1.0
>>> test2.call_spam()
spam v2.0

>>> import win32process, win32api
>>> names = [win32api.GetModuleFileName(x)
... for x in win32process.EnumProcessModules(-1)]
>>> spams = [x for x in names if 'spam' in x]
>>> print(*spams, sep='\n')
C:\Temp\test\c\spam.dll
C:\Temp\test\c\spam_assembly\spam.dll

Source

spam1.c (spam.dll):

#include <stdio.h>

void __declspec(dllexport) spam()
{
printf("spam v1.0\n");
}


test1.c (test1.dll):

#pragma comment(lib, "spam")
void __declspec(dllimport) spam();

void __declspec(dllexport) call_spam()
{
spam();
}

---

spam_assembly/spam_assembly.manifest:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
<assemblyIdentity type="win32" name="spam_assembly" version="2.0.0.0"
    processorArchitecture="amd64"/>
<file name="spam.dll"/>
</assembly>

spam2.c (spam_assembly/spam.dll):

#include <stdio.h>

void __declspec(dllexport) spam()
{
printf("spam v2.0\n");
}


test2.c (test2.dll -- link with /manifest:embed,id=2):

#pragma comment(lib, "spam")
#pragma comment(linker, "/manifestdependency:\"\
type='win32' \
name='spam_assembly' \
version='2.0.0.0' \
processorArchitecture='amd64' \"")

void __declspec(dllimport) spam();

void __declspec(dllexport) call_spam()
{
spam();
}
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/5PDVL7KOBCCIVRSYQH4WXHBCZ23KYKG3/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Emily Bowman
It's still a problem, even if it's a problem in the opposite direction from
the one you first thought (Python has a newer sqlite, rather than an older
one). Updating your API fixes the problem now, but you still need to decide
how you check for and handle newer, potentially incompatible library versions.

On Mon, Jun 22, 2020 at 1:13 PM Seth G  wrote:

> Thanks Ned.
> I did double-check the docs for sqlite3 after posting and wondered why
> the versions were so different.
> I guess the clue should have been the sqlite-3 !
> Reading the history of the module I presume sqlite3 has its own module
> version number as it was integrated from a separate project.
>
> Seth
>
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/NYFXECAXMGZM73DSB5AF74O5FV4TFXOL/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Seth G
Thanks Ned. 
I did double-check the docs for sqlite3 after posting and wondered why the 
versions were so different. 
I guess the clue should have been the sqlite-3 !
Reading the history of the module I presume sqlite3 has its own module version 
number as it was integrated from a separate project. 

Seth

--
web:http://geographika.co.uk
twitter: @geographika

On Mon, Jun 22, 2020, at 6:41 PM, Ned Deily wrote:
> On Jun 22, 2020, at 05:39, Seth G  wrote:
> > However, there is one feature of using the Windows PATH that I can't seem 
> > to replicate with add_dll_directory.
> > The MapServer DLL builds are linked to sqlite3 3.24.0, whereas Python 3.8.2 
> > is linked to 2.6.0. Building with matching versions is not something I can 
> > easily change.
> 
> For what it's worth, you're looking at the wrong value (it's easy to 
> do!). '2.6.0' is the version number of the sqlite3 module itself, not 
> of the sqlite3 library.
> 
> 
> >>> sqlite3.version
> '2.6.0'
> >>> sqlite3.sqlite_version
> '3.31.1'
> 
> --
>   Ned Deily
>   n...@python.org -- []
> 
>
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/GMW6VL6MY5CUNAMPGQHSI6JB3PL2KLK7/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Seth G
Thanks Steve for the clarifications. 
As a workaround, placing the older sqlite3.dll in the same folder as the
_mapscript.pyd file, while leaving all the other DLLs in a different folder
referenced by add_dll_directory, allows for a successful import.
For a less hacky fix - after checking the version numbers for sqlite3 again,
they are not too different, so it may be easiest to update the upstream builds
to a matching version.
Apologies for the noise - I thought this was related to the new 
add_dll_directory function, but I can break old versions using PATH and 
mismatched sqlite3 .dlls too!

Seth

--
web:http://geographika.co.uk
twitter: @geographika

On Mon, Jun 22, 2020, at 5:46 PM, Steve Dower wrote:
> On 22Jun2020 1039, Seth G wrote:
> > However, there is one feature of using the Windows PATH that I can't seem 
> > to replicate with add_dll_directory.
> > The MapServer DLL builds are linked to sqlite3 3.24.0, whereas Python 3.8.2 
> > is linked to 2.6.0. Building with matching versions is not something I can 
> > easily change.
> > 
> > When using Windows PATH the folder with the newer version could be added to 
> > the front of the list to take priority and import worked correctly. This 
> > does not seem to be possible with add_dll_directory - the Python 
> > sqlite3.dll in C:\Python38\DLLs always takes priority leading to:
> > 
> > ImportError: DLL load failed while importing _mapscript: The specified 
> > procedure could not be found.
> > 
> > I presume I can't remove the C:\Python38\DLLs path, is there another 
> > solution to this issue?
> 
> DLLs should not be in the search path at all - it's searched by sys.path 
> when importing .pyd files, which are loaded by absolute path and their 
> dependencies found adjacent.
> 
> What is likely happening here is that _sqlite3.pyd is being imported 
> before _mapscript, and so there is already a SQLITE3 module in memory. 
> Like Python, Windows will not attempt to import a second module with the 
> same name, but will return the original one.
> 
> So your best option here is probably to rebuild sqlite3.dll with as much 
> of the version number (or some unique string) in the name as you need. 
> Or you can statically link it into your extension module, assuming you 
> aren't relying on shared state with other modules in your package. The 
> latter is probably easier.
> 
> Cheers,
> Steve
>
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/SO37EWIVDOKEIQO6MPSR2EVL5EDPLV4J/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Ned Deily
On Jun 22, 2020, at 05:39, Seth G  wrote:
> However, there is one feature of using the Windows PATH that I can't seem to 
> replicate with add_dll_directory.
> The MapServer DLL builds are linked to sqlite3 3.24.0, whereas Python 3.8.2 
> is linked to 2.6.0. Building with matching versions is not something I can 
> easily change.

For what it's worth, you're looking at the wrong value (it's easy to do!). 
'2.6.0' is the version number of the sqlite3 module itself, not of the sqlite3 
library.


>>> sqlite3.version
'2.6.0'
>>> sqlite3.sqlite_version
'3.31.1'

--
  Ned Deily
  n...@python.org -- []
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/WILPNTOKCGWLNEPWCQXO47A444XL6USJ/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Steve Dower

On 22Jun2020 1646, Steve Dower wrote:
> DLLs should not be in the search path at all - it's searched by sys.path
> when importing .pyd files, which are loaded by absolute path and their
> dependencies found adjacent.


To clarify this - by "DLLs" I meant the DLLs directory, not DLLs in 
general (hence the singular "it's").


Cheers,
Steve
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/YIIMGEADBMT2CTOMIOPNNWZZ7UBE5JWC/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: os.add_dll_directory and DLL search order

2020-06-22 Thread Steve Dower

On 22Jun2020 1039, Seth G wrote:

> However, there is one feature of using the Windows PATH that I can't seem to
> replicate with add_dll_directory.
> The MapServer DLL builds are linked to sqlite3 3.24.0, whereas Python 3.8.2 is
> linked to 2.6.0. Building with matching versions is not something I can easily
> change.
>
> When using Windows PATH the folder with the newer version could be added to the
> front of the list to take priority and import worked correctly. This does not
> seem to be possible with add_dll_directory - the Python sqlite3.dll in
> C:\Python38\DLLs always takes priority leading to:
>
> ImportError: DLL load failed while importing _mapscript: The specified
> procedure could not be found.
>
> I presume I can't remove the C:\Python38\DLLs path, is there another solution
> to this issue?


DLLs should not be in the search path at all - it's searched by sys.path 
when importing .pyd files, which are loaded by absolute path and their 
dependencies found adjacent.


What is likely happening here is that _sqlite3.pyd is being imported 
before _mapscript, and so there is already a SQLITE3 module in memory. 
Like Python, Windows will not attempt to import a second module with the 
same name, but will return the original one.


So your best option here is probably to rebuild sqlite3.dll with as much 
of the version number (or some unique string) in the name as you need. 
Or you can statically link it into your extension module, assuming you 
aren't relying on shared state with other modules in your package. The 
latter is probably easier.


Cheers,
Steve
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/PMBOWXNIXBGPPOOTDWU3LGR6QSP73AXW/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: Cython and incompatible C API changes

2020-06-22 Thread Stefan Behnel
Victor Stinner wrote on 17.06.20 at 13:25:
> On Wed, Jun 17, 2020 at 12:38, Petr Viktorin wrote:
>>> There is an ongoing discussion about always requiring to run Cython
>>> when installing a C extension which uses Cython.
>>
>> Do you have a link to that discussion?

Yeah, I was wondering, too. :)


> Hum, I forgot where the discussion happened. Maybe it wasn't a proper
> "discussion", but just a few tweets:
> https://twitter.com/tacaswell/status/1266472526806474752
> 
> Thomas A Caswell wrote: "also, if you use cython please make it a
> build-time dependency and please don't put the generated c code in the
> sdists. cython can only handle the changes in the CPython c-api if you
> let it!"

So much for random opinions on the Internet. ;-)

I still recommend generating the C code on the maintainer side and then
shipping it. Both approaches have their pros and cons, but that's
definitely what I recommend.

First of all, making Cython a build time dependency and then pinning an
exact Cython version with it is entirely useless, because the C code that
Cython outputs is deterministic and you can just generate it on your side
and ship it. One dependency less, lots of user side complexity avoided. So,
the only case we're talking about here is allowing different (usually
newer) Cython versions to build your code.

If you ship the C file, then you know what you get and you don't depend on
whatever Cython version users have installed on their side. You avoid the
maintenance burden of having to respond to bug reports for seemingly
unrelated C code lines or bugs in certain Cython versions. The C code that
Cython generates is very intentionally adaptive to where you compile it and
we work hard to do all environment specific adaptations in the C code and
not in the code generator that creates it. It's the holy cow of "generate
once, compile everywhere". But obviously, it cannot take as-of-now unknown
future environmental changes into account, such as changes to the CPython
C-API.

If, instead, you use Cython at package build time, then you risk build
failures on the user side due to users having a buggy Cython version installed
(which may not have existed when you shipped the package, so you couldn't
exclude it), or your code failing to compile with the installed Cython due
to incompatible language changes. However, if those (somewhat exceptional)
cases don't happen, then you may end up with a setting in which your code
adapts also to newer environments, by using a recent Cython version
automatically. That is definitely an advantage.

Basically, for maintained packages, I consider shipping the generated C
code the right way. Less hassle, easier debugging, better user experience.
For unmaintained packages, regenerating the C code at build time *can*
extend the lifetime of the package to newer environments for as long as it
does not run into failures due to Cython compiler changes (so you trade one
compatibility level for another one).

The question is whether the point at which a package becomes unmaintained
can ever be clear enough to make the switch. Regardless of which way you
choose, at some point in the future someone will have to do something,
either to your code or to your build setup, in order to prevent fatal bitrot.

Stefan
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/2K3IKBD4K7INMVV3LK6SJY6EXDDNC2M2/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] os.add_dll_directory and DLL search order

2020-06-22 Thread Seth G
Hi list,

I've been trying to get new Python 3.8 builds for the MapServer project - these 
are SWIG-generated bindings that require many DLLs to import in Python. Up 
until now this worked by adding a folder to the top of the Windows PATH. 

I've read through the discussions at https://bugs.python.org/issue36085 and 
understand the reasoning behind the change. 

Is the recommended solution, in the case of Python wrappers, to create a
project-specific environment variable that users can set, which the library
then passes to add_dll_directory?

However, there is one feature of using the Windows PATH that I can't seem to 
replicate with add_dll_directory.
The MapServer DLL builds are linked to sqlite3 3.24.0, whereas Python 3.8.2 is 
linked to 2.6.0. Building with matching versions is not something I can easily 
change. 

When using Windows PATH the folder with the newer version could be added to the 
front of the list to take priority and import worked correctly. This does not 
seem to be possible with add_dll_directory - the Python sqlite3.dll in 
C:\Python38\DLLs always takes priority leading to:

ImportError: DLL load failed while importing _mapscript: The specified 
procedure could not be found.

I presume I can't remove the C:\Python38\DLLs path, is there another solution 
to this issue?

Thanks in advance for any help,

Seth

--
web:http://geographika.co.uk
twitter: @geographika
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/I6JAXBKV4Y7MVF42RJ4XQIPYQMAVXHOF/
Code of Conduct: http://python.org/psf/codeofconduct/


[Python-Dev] Re: PEP 620: Hide implementation details from the C API

2020-06-22 Thread Dong-hee Na
Thanks, Victor, for the awesome PEP.

I am a big +1 on this proposal, since some of the core developers
already need the C API to evolve.
I believe this proposal is not only for alternative Python compiler
implementations but also gives us a chance to enhance CPython
performance.
And I love this proposal because it does not propose breaking
everything in one shot but pursues incremental change.

The one thing that we have to do for this PEP is to communicate the
changes very well to third-party libraries.
With well-written change documentation, I believe most of the
impactful third-party libraries which sustain the Python library
community will have enough time to prepare for the changes.

I believe that if there is no change, there will be no evolution.
Let's make CPython faster, even if it means some temporary suffering.


Regards from Korea,
Dong-hee

On Mon, Jun 22, 2020 at 9:13 PM, Victor Stinner wrote:
>
> Hi,
>
> PEP available at: https://www.python.org/dev/peps/pep-0620/
>
> 
> This PEP is the result of 4 years of research work on the C API:
> https://pythoncapi.readthedocs.io/
>
> It's the third version. The first version (2017) proposed to add a
> "new C API" and advised C extensions maintainers to opt-in for it: it
> was basically the same idea as PEP 384 limited C API but in a
> different color. Well, I had no idea of what I was doing :-) The
> second version (April 2020) proposed to add a new Python runtime built
> from the same code base as the regular Python runtime but in a
> different build mode; the regular Python would continue to be fully
> compatible.
>
> I wrote the third version, the PEP 620, from scratch. It now gives an
> explicit and concrete list of incompatible C API changes, and has
> better motivation and rationale sections. The main PEP novelty is the
> new pythoncapi_compat.h header file distributed with Python to provide
> new C API functions to old Python versions; the second novelty is the
> process to reduce the number of broken C extensions.
>
> Whereas PEPs are usually implemented in a single Python version, the
> implementation of this PEP is expected to be done carefully over
> multiple Python versions. The PEP lists many changes which are already
> implemented in Python 3.7, 3.8 and 3.9. It defines a process to reduce
> the number of broken C extensions when introducing the incompatible C
> API changes listed in the PEP. The process dictates the rhythm of
> these changes.
> 
>
>
> PEP: 620
> Title: Hide implementation details from the C API
> Author: Victor Stinner 
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 19-June-2020
> Python-Version: 3.10
>
> Abstract
> ========
>
> Introduce C API incompatible changes to hide implementation details.
>
> Once most implementation details are hidden, the evolution of CPython
> internals will be less limited by C API backward compatibility issues.
> It will be way easier to add new features.
>
> It becomes possible to experiment with more advanced optimizations in CPython
> than just micro-optimizations, like tagged pointers.
>
> Define a process to reduce the number of broken C extensions.
>
> The implementation of this PEP is expected to be done carefully over
> multiple Python versions. It already started in Python 3.7 and most
> changes are already completed. The `Process to reduce the number of
> broken C extensions`_ dictates the rhythm.
>
>
> Motivation
> ==========
>
> The C API blocks CPython evolutions
> -----------------------------------
>
> Adding or removing members of C structures is causing multiple backward
> compatibility issues.
>
> Adding a new member breaks the stable ABI (PEP 384), especially for
> types declared statically (e.g. ``static PyTypeObject MyType =
> {...};``). In Python 3.4, the PEP 442 "Safe object finalization" added
> the ``tp_finalize`` member at the end of the ``PyTypeObject`` structure.
> For ABI backward compatibility, a new ``Py_TPFLAGS_HAVE_FINALIZE`` type
> flag was required to announce if the type structure contains the
> ``tp_finalize`` member. The flag was removed in Python 3.8 (`bpo-32388
> <https://bugs.python.org/issue32388>`_).
>
> The ``PyTypeObject.tp_print`` member, deprecated since Python 3.0
> released in 2008, has been removed in the Python 3.8 development cycle.
> But the change broke too many C extensions and had to be reverted before
> 3.8 final release. Finally, the member was removed again in Python 3.9.
>
> C extensions rely on the ability to access structure members, either
> indirectly through the C API or even directly. Modifying structures
> like ``PyListObject`` cannot even be considered.
>
> The ``PyTypeObject`` structure is the one which evolved the most, simply
> because there was no other way to evolve CPython than modifying it.
>
> In the C API, all Python objects are passed as ``PyObject*``: a pointer
> to a ``PyObject`` structure. Experimenting with tagged pointers in CPython is
> blocked by the fact that a C extension can technically dereference a
> 

[Python-Dev] PEP 620: Hide implementation details from the C API

2020-06-22 Thread Victor Stinner
Hi,

PEP available at: https://www.python.org/dev/peps/pep-0620/


This PEP is the result of 4 years of research work on the C API:
https://pythoncapi.readthedocs.io/

It's the third version. The first version (2017) proposed to add a
"new C API" and advised C extensions maintainers to opt-in for it: it
was basically the same idea as PEP 384 limited C API but in a
different color. Well, I had no idea of what I was doing :-) The
second version (April 2020) proposed to add a new Python runtime built
from the same code base as the regular Python runtime but in a
different build mode; the regular Python would continue to be fully
compatible.

I wrote the third version, the PEP 620, from scratch. It now gives an
explicit and concrete list of incompatible C API changes, and has
better motivation and rationale sections. The main PEP novelty is the
new pythoncapi_compat.h header file distributed with Python to provide
new C API functions to old Python versions; the second novelty is the
process to reduce the number of broken C extensions.
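
To give an idea of the approach, here is a minimal sketch of the kind of shim
such a header can carry, taking as an example a backport of the Py_NewRef()
function added in Python 3.10:

    #include <Python.h>

    /* Make Py_NewRef() available to extensions compiled against older
       Python headers, where it does not exist yet. */
    #if PY_VERSION_HEX < 0x030A0000 && !defined(Py_NewRef)
    static inline PyObject *
    _compat_Py_NewRef(PyObject *obj)
    {
        Py_INCREF(obj);
        return obj;
    }
    #define Py_NewRef(obj) _compat_Py_NewRef(obj)
    #endif

Extensions can then use the newer, implementation-hiding calls uniformly
across all supported Python versions.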

Whereas PEPs are usually implemented in a single Python version, the
implementation of this PEP is expected to be done carefully over
multiple Python versions. The PEP lists many changes which are already
implemented in Python 3.7, 3.8 and 3.9. It defines a process to reduce
the number of broken C extensions when introducing the incompatible C
API changes listed in the PEP. The process dictates the rhythm of
these changes.



PEP: 620
Title: Hide implementation details from the C API
Author: Victor Stinner 
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 19-June-2020
Python-Version: 3.10

Abstract
========

Introduce C API incompatible changes to hide implementation details.

Once most implementation details are hidden, the evolution of CPython
internals will be less limited by C API backward compatibility issues.
It will be way easier to add new features.

It becomes possible to experiment with more advanced optimizations in CPython
than just micro-optimizations, like tagged pointers.

Define a process to reduce the number of broken C extensions.

The implementation of this PEP is expected to be done carefully over
multiple Python versions. It already started in Python 3.7 and most
changes are already completed. The `Process to reduce the number of
broken C extensions`_ dictates the rhythm.


Motivation
==========

The C API blocks CPython evolutions
-----------------------------------

Adding or removing members of C structures is causing multiple backward
compatibility issues.

Adding a new member breaks the stable ABI (PEP 384), especially for
types declared statically (e.g. ``static PyTypeObject MyType =
{...};``). In Python 3.4, the PEP 442 "Safe object finalization" added
the ``tp_finalize`` member at the end of the ``PyTypeObject`` structure.
For ABI backward compatibility, a new ``Py_TPFLAGS_HAVE_FINALIZE`` type
flag was required to announce if the type structure contains the
``tp_finalize`` member. The flag was removed in Python 3.8 (`bpo-32388
<https://bugs.python.org/issue32388>`_).

The ``PyTypeObject.tp_print`` member, deprecated since Python 3.0
released in 2008, has been removed in the Python 3.8 development cycle.
But the change broke too many C extensions and had to be reverted before
3.8 final release. Finally, the member was removed again in Python 3.9.

C extensions rely on the ability to access structure members, either
indirectly through the C API or even directly. Modifying structures
like ``PyListObject`` cannot even be considered.

The ``PyTypeObject`` structure is the one which evolved the most, simply
because there was no other way to evolve CPython than modifying it.

In the C API, all Python objects are passed as ``PyObject*``: a pointer
to a ``PyObject`` structure. Experimenting with tagged pointers in CPython is
blocked by the fact that a C extension can technically dereference a
``PyObject*`` pointer and access ``PyObject`` members. Small "objects"
can be stored as a tagged pointer with no concrete ``PyObject``
structure.

Replacing the Python garbage collector with a tracing garbage collector
would also require removing the ``PyObject.ob_refcnt`` reference counter,
whereas currently the ``Py_INCREF()`` and ``Py_DECREF()`` macros access
``PyObject.ob_refcnt`` directly.

Same CPython design since 1990: structures and reference counting
-----------------------------------------------------------------

When the CPython project was created, it was written with one principle:
keep the implementation simple enough so it can be maintained by a
single developer. CPython complexity grew a lot and many
micro-optimizations have been implemented, but CPython core design has
not changed.

Members of ``PyObject`` and ``PyTupleObject`` structures have not
changed since the "Initial revision" commit (1990)::

    #define OB_HEAD \
        unsigned int ob_refcnt; \
        struct _typeobject *ob_type;

    typedef struct _object {
        OB_HEAD

[Python-Dev] Re: Should we be making so many changes in pursuit of PEP 554?

2020-06-22 Thread Nick Coghlan
On Thu., 18 Jun. 2020, 6:06 am Eric Snow, 
wrote:

> On Wed, Jun 17, 2020 at 11:42 AM Emily Bowman 
> wrote:
> > So most likely there wouldn't be any way to share something like a
> > bytearray or another buffer interface-compatible type for some time.
> > That's too bad, I was hoping to have shared arrays that I could put a
> > memoryview on in each thread/interpreter and deal with locking if I
> > need to,
>
> Earlier versions of PEP 554 did have a "SendChannel.send_buffer()"
> method for this but we tabled it in the interest of simplifying.  That
> said, I expect we'll add something like that separately later.
>

Right, buffers are different because the receiving interpreter can set up a
memoryview that refers to storage allocated by the source interpreter.

So the Python objects aren't shared (avoiding refcounting complications),
but the expensive data copying step can still be avoided.


> > Packages like NumPy have had their own opaque C types and C-only
> > routines to handle all the big threading outside of Python as a
> > workaround for a long time now.
>
> As a workaround for what?  This sounds interesting. :)
>

For the GIL - lots of NumPy operations are in pure C or FORTRAN and will
happily use as many CPUs as you have available.

Cheers,
Nick.



> -eric
>
___
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/VXLNERRZLDAFX725JCYEQ5WX3MK7DEE3/
Code of Conduct: http://python.org/psf/codeofconduct/