Hello,

Nowadays we have an official mechanism for third-party C extensions to
be binary-compatible across feature releases of Python: the stable ABI.

But for C extensions that do not use the stable ABI, there are also
mechanisms in place to *try* to ensure binary compatibility.  One of
them is the way in which we add tp_ slots to the PyTypeObject structure.

Typically, when adding a tp_XXX slot, you also need to add a
Py_TPFLAGS_HAVE_XXX type flag to mark the static type structures that
have been compiled against a recent enough PyTypeObject definition.
This way, extensions compiled against Python N-1 are supposed to
"still work": since they don't have Py_TPFLAGS_HAVE_XXX set, the core
Python runtime won't try to access the (non-existent) tp_XXX member.
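To make the mechanism concrete, here is a minimal sketch of the guard
pattern, where tp_xxx and Py_TPFLAGS_HAVE_XXX are hypothetical
placeholders and the slot's signature is just an assumption:

    #include <Python.h>

    /* Hypothetical slot and flag names; only the guard pattern matters. */
    static int
    maybe_call_new_slot(PyObject *obj)
    {
        PyTypeObject *type = Py_TYPE(obj);
        if (type->tp_flags & Py_TPFLAGS_HAVE_XXX) {
            /* The type was compiled against a PyTypeObject definition
               that actually contains tp_xxx, so reading it is safe. */
            if (type->tp_xxx != NULL)
                return type->tp_xxx(obj);
        }
        /* Extension compiled against Python N-1: the member does not
           exist in its struct, so fall back to the old behaviour. */
        return 0;
    }

Newly compiled static types would normally pick the flag up for free
through Py_TPFLAGS_DEFAULT, while old binaries simply keep taking the
fallback path.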

However, besides the internal code complication, it means we need to
add a new Py_TPFLAGS_HAVE_XXX bit each time we add a slot.  Since we
have only 32 such bits available (many of them already taken), it is a
very limited resource.  Is it worth it? (*)  Can an extension compiled
against Python N-1 really claim to be compatible with Python N, despite
other possible differences?

(*) we can't extend the tp_flags field to 64 bits, precisely because of
the binary compatibility problem...
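For illustration only (bit positions quoted from memory, not
authoritative), each flag is just one bit of the tp_flags word, so
every new HAVE flag permanently consumes one of the 32:

    /* Values illustrative; see Include/object.h for the real ones. */
    #define Py_TPFLAGS_HAVE_GC          (1UL << 14)
    #define Py_TPFLAGS_HAVE_VERSION_TAG (1UL << 18)
    /* A hypothetical new slot would need yet another free bit: */
    #define Py_TPFLAGS_HAVE_XXX         (1UL << 21)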

Regards

Antoine.

