[jira] [Created] (ARROW-5693) [Go] skip IPC integration test for Decimal128
Sebastien Binet created ARROW-5693:
-----------------------------------

             Summary: [Go] skip IPC integration test for Decimal128
                 Key: ARROW-5693
                 URL: https://issues.apache.org/jira/browse/ARROW-5693
             Project: Apache Arrow
          Issue Type: Bug
          Components: Go
            Reporter: Sebastien Binet
            Assignee: Sebastien Binet

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[jira] [Created] (ARROW-5692) [Go] implement minimal JSON support for Decimal128
Sebastien Binet created ARROW-5692:
-----------------------------------

             Summary: [Go] implement minimal JSON support for Decimal128
                 Key: ARROW-5692
                 URL: https://issues.apache.org/jira/browse/ARROW-5692
             Project: Apache Arrow
          Issue Type: Improvement
          Components: Go
            Reporter: Sebastien Binet
            Assignee: Sebastien Binet
[jira] [Created] (ARROW-5691) [C++] Relocate src/parquet/arrow code to src/arrow/dataset/parquet
Wes McKinney created ARROW-5691:
--------------------------------

             Summary: [C++] Relocate src/parquet/arrow code to src/arrow/dataset/parquet
                 Key: ARROW-5691
                 URL: https://issues.apache.org/jira/browse/ARROW-5691
             Project: Apache Arrow
          Issue Type: Improvement
          Components: C++
            Reporter: Wes McKinney

I think it may make sense to continue developing and maintaining this code in the same place as the other file format <-> Arrow serialization code and dataset handling routines (e.g. schema normalization). Under this scheme, libparquet becomes a link-time dependency of libarrow_dataset.
Go becomes the 4th language to be a part of the Arrow integration tests (!)
I'm excited to announce that Go has become the 4th language to officially participate in the Arrow binary protocol integration tests, after Java, C++, and JavaScript: https://github.com/apache/arrow/commit/4ba2763150459c9eb4139e5954d9b5526b8ef0ee This is a huge milestone toward making Go a first-class citizen in the Apache Arrow world. Congrats to Sebastien, Stuart, Alexandre, and the rest of the Go contributors! - Wes
[jira] [Created] (ARROW-5690) [Packaging] macOS wheels broken: libprotobuf.18.dylib missing
Philipp Moritz created ARROW-5690:
----------------------------------

             Summary: [Packaging] macOS wheels broken: libprotobuf.18.dylib missing
                 Key: ARROW-5690
                 URL: https://issues.apache.org/jira/browse/ARROW-5690
             Project: Apache Arrow
          Issue Type: Improvement
            Reporter: Philipp Moritz

If I build macOS arrow wheels with crossbow from the latest master (a77257f4790c562dcb74724fc4a22c157ab36018) and install them, importing pyarrow gives the following error message:

{code:java}
In [1]: import pyarrow
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1> in <module>
----> 1 import pyarrow

~/anaconda3/lib/python3.6/site-packages/pyarrow/__init__.py in <module>
     47 import pyarrow.compat as compat
     48
---> 49 from pyarrow.lib import cpu_count, set_cpu_count
     50 from pyarrow.lib import (null, bool_,
     51                          int8, int16, int32, int64,

ImportError: dlopen(/Users/pcmoritz/anaconda3/lib/python3.6/site-packages/pyarrow/lib.cpython-36m-darwin.so, 2): Library not loaded: /usr/local/opt/protobuf/lib/libprotobuf.18.dylib
  Referenced from: /Users/pcmoritz/anaconda3/lib/python3.6/site-packages/pyarrow/libarrow.14.dylib
  Reason: image not found
{code}
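The dlopen failure above can be reproduced in isolation: asking the dynamic loader for the missing dylib directly via ctypes surfaces the same loader error without importing pyarrow at all, which helps confirm the problem is the missing protobuf library rather than pyarrow itself. A minimal diagnostic sketch (the library name is taken from the traceback above; on a machine where that dylib is installed the load would simply succeed):

```python
import ctypes

def try_load(libname):
    """Attempt to dlopen a shared library; return the loader's
    error message on failure, or None if it loaded fine."""
    try:
        ctypes.CDLL(libname)
        return None
    except OSError as exc:
        return str(exc)  # e.g. "... image not found" on macOS

# The dylib the traceback says is missing; on a broken wheel install
# this prints the dynamic loader's error message.
err = try_load("libprotobuf.18.dylib")
print(err or "library loaded successfully")
```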
Re: How should a Python/C++ project depend on Arrow (issues with ABI and wheel)?
> Zhuo Peng wrote:
> > fundamental libraries [5]. Newer GCC versions could be backported to work
> > with older libraries [6].

That would be great, but it would require a design agreement between the major compilers (GCC, Intel, LLVM, Portland, etc.).

> then you need to use the same toolchain
> (or an ABI-compatible toolchain, but I'm afraid there's no clear
> specification of ABI compatibility in g++ / libstdc++ land).

A structural deficiency (and flaw) of Unix, where the SDK is the live system. Red Hat did some work in that area by providing packages, but they guaranteed compatibility only across a limited number of versions, and only for RHEL.

> You know, I wish the scientific communities would stop producing wheels
> and instead encourage users to switch to conda. The wheel paradigm is
> conceptually antiquated and is really a nuisance to package developers
> and maintainers.

I cannot agree more, and it is not restricted to the scientific community (I would consider myself working in a hybrid environment). Maybe we should embrace it. We could agree on a common "package" format (bin/lib/include/data) between pip and conda to begin with (much like rpm, which is based on cpio instead of zip). While there might be disagreements about the tooling around it (the build system and the dependency resolver), at least we could pin down a single format and a single installer (with bare minimal logic in it).
Seminal projects in this space are, IMHO:

https://build.opensuse.org/ (a CD/CI system before it became "fashionable"). Basically, it allows creating a package for each "binary" platform automatically, instead of a single package that rules them all (embrace it if you cannot beat it).

https://github.com/QuantStack/mamba (a step in the right direction: it splits the install part from the dependency resolution; see https://medium.com/@wolfv/making-conda-fast-again-4da4debfb3b7)

Please let me know if that's something interesting.

PS: rpm settled on a main package (bin+libs), -devel (for headers + static libraries), and -debug (to include symbols for debugging).