On Thu, Sep 19, 2019 at 10:56 Antoine Pitrou <anto...@python.org> wrote:
> Le 19/09/2019 à 19:52, Zhuo Peng a écrit :
> >
> > The problems are only potential and theoretical, and won't bite anyone
> > until they occur, and they are more likely to happen with pip/wheel than
> > with conda.
> >
> > But anyway, this idea is still nice. I could imagine that, at least in
> > Arrow's Python C API, there would be a
> >
> > PyObject* pyarrow_array_from_c_protocol(ArrowArray*);
> >
> > This way the C++ APIs can be avoided while still allowing arrays to be
> > created in C/C++ and used in Python.
>
> Adding a Python C API function is a nice idea.
> However, I *still* don't understand how it will solve your problem. The
> Cython modules comprising PyArrow will still call the C++ APIs, with the
> ABI problems that entails.

Those calls are internal to libarrow.so and libarrow_python.so, which always
agree on the ABI. That is different from a client library having to create an
arrow::Array that may contain, say, a std::vector<int> built with gcc5, and
then pass it to an Arrow C++ API exposed by a libarrow.so whose definition of
std::vector<int> comes from gcc7.

> Regards
>
> Antoine.