Neville, I think we should be able to fix the two bugs you mentioned within
this week. I'll take a look. It would be great if you could provide more
details in the JIRAs (e.g., a test case to reproduce). Array currently
doesn't expose a bitmask API, and I don't think we need specialized
Hi Wes,
In Rust, we have 2 bugs (https://issues.apache.org/jira/browse/ARROW-4914,
https://issues.apache.org/jira/browse/ARROW-4886) both related to array
slicing.
In summary:
* ARROW-4914, the bitmask of the original array is used to determine the
validity of the sliced array, but offsets
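For readers unfamiliar with validity bitmaps, here is a minimal pure-Python sketch (helper names are illustrative, not the Rust crate's actual API) of the offset handling ARROW-4914 is about: a slice must apply its offset when reading the parent array's bitmap.

```python
def bit_is_set(bitmap: bytes, i: int) -> bool:
    """Read bit i of a little-endian validity bitmap."""
    return (bitmap[i // 8] >> (i % 8)) & 1 == 1

def is_valid(bitmap: bytes, slice_offset: int, i: int) -> bool:
    """Validity of element i of a slice: index the parent bitmap
    at slice_offset + i, not at i."""
    return bit_is_set(bitmap, slice_offset + i)

bitmap = bytes([0b00000100])  # only parent index 2 is valid

# A slice starting at parent index 2: its element 0 is parent index 2.
assert is_valid(bitmap, 2, 0) is True
# A buggy implementation that ignores the offset reads parent index 0:
assert bit_is_set(bitmap, 0) is False
```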
> File is at offset: 144
> Message length: 140
> About to read body, file at offset: 288
> Read message body, file at offset: 320
> Opening a Message flatbuffer with size 140
> File is at offset: 320
>
> So it seems the Flatbuffers library recognizes bytes 4 through 144 as a
> Message
>
> I put my branch here:
> https://github.com/wesm/arrow/tree/ipc-debug-print-20190318
>
> The test.py is here
> https://gist.github.com/wesm/dd40aa3196cd138e883d94c574d154f9
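The offsets in the quoted log are consistent with a framing of a 4-byte little-endian length prefix followed by a Flatbuffers Message of that length (this framing is inferred from the numbers, not quoted from the reader code); a small sketch:

```python
import struct

def read_message(buf: bytes):
    # 4-byte little-endian length prefix, then the Message bytes.
    (length,) = struct.unpack_from("<i", buf, 0)
    return length, buf[4 : 4 + length]

# Numbers from the log: a 140-byte Message after the 4-byte prefix
# ends at offset 144 ("recognizes bytes 4 through 144 as a Message").
frame = struct.pack("<i", 140) + bytes(140)
length, message = read_message(frame)
assert length == 140
assert 4 + length == 144
assert len(message) == 140
```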
BTW can you comment on
https://github.com/ExpandingMan/Arrow.jl/issues/28? I would like to
see a Julia implementation inside the Apache Arrow project.
Thanks
Wes
On Mon, Mar
Philipp Moritz created ARROW-4958:
-
Summary: [C++] Purely static linking broken
Key: ARROW-4958
URL: https://issues.apache.org/jira/browse/ARROW-4958
Project: Apache Arrow
Issue Type:
hi folks,
I think we're basically at the 0.13 end game here. There are some more
patches that can get in, but do we all think we can cut an RC by the end of
the week? What are the blocking issues?
Thanks
Wes
On Sat, Mar 16, 2019 at 9:57 PM Kouhei Sutou wrote:
>
> Hi,
>
> > Submitted the packaging
Hello all, I am working on a pure Julia implementation of the Arrow standard.
Currently I am working on ingesting the metadata, and it seems to me that the
output I'm creating with `pyarrow` does not match the format, so I'm trying to
figure out where I've misunderstood it.
I've written some
Andy Grove created ARROW-4957:
-
Summary: [Rust] [DataFusion] Implement get_supertype correctly
Key: ARROW-4957
URL: https://issues.apache.org/jira/browse/ARROW-4957
Project: Apache Arrow
Issue
Prashanth Govindarajan created ARROW-4956:
-
Summary: Allow ArrowBuffers to wrap external Memory in C#
Key: ARROW-4956
URL: https://issues.apache.org/jira/browse/ARROW-4956
Project: Apache
We are having a discussion on
https://github.com/apache/arrow/pull/3925#issuecomment-473605919 about the
`MemoryPool` class in the C# library.
In reality, `MemoryPool` as designed in C# is more of a
"MemoryAllocator": it just allocates or reallocates memory. There is no API
for
Kouhei Sutou created ARROW-4955:
---
Summary: [GLib] Add garrow_file_is_closed()
Key: ARROW-4955
URL: https://issues.apache.org/jira/browse/ARROW-4955
Project: Apache Arrow
Issue Type: New
hi folks,
If I'm reading the tea leaves correctly, there is increased interest
in developing a full-fledged "embeddable" analytical query engine
written in C++ within the Apache Arrow project. I have long been a
major proponent of this and have written and spoken publicly about it on
numerous
Antoine Pitrou created ARROW-4954:
-
Summary: [Python] test failure with Flight enabled
Key: ARROW-4954
URL: https://issues.apache.org/jira/browse/ARROW-4954
Project: Apache Arrow
Issue Type:
Dominic Sisneros created ARROW-4953:
---
Summary: [Ruby] Not loading libarrow-glib
Key: ARROW-4953
URL: https://issues.apache.org/jira/browse/ARROW-4953
Project: Apache Arrow
Issue Type: Bug
Antoine Pitrou created ARROW-4952:
-
Summary: Equals / ApproxEquals behaviour undefined on FP NaNs
Key: ARROW-4952
URL: https://issues.apache.org/jira/browse/ARROW-4952
Project: Apache Arrow
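The ambiguity ARROW-4952 points at comes straight from IEEE 754: NaN compares unequal to itself, so element-wise value comparison and bitwise buffer comparison disagree. A minimal Python illustration:

```python
import math

nan = float("nan")

# IEEE 754: NaN compares unequal to everything, including itself.
assert nan != nan
assert not (nan == nan)
assert math.isnan(nan)

# So for two arrays holding [nan], "equal" is ambiguous: the buffers can
# be bit-identical while per-element comparison reports "not equal".
left, right = [nan], [float("nan")]
elementwise_equal = all(a == b for a, b in zip(left, right))
assert elementwise_equal is False
```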
Krisztian Szucs created ARROW-4951:
--
Summary: [C++] Turn off cpp benchmarks in cpp docker images
Key: ARROW-4951
URL: https://issues.apache.org/jira/browse/ARROW-4951
Project: Apache Arrow
Krisztian Szucs created ARROW-4949:
--
Summary: [CI] Add C# docker image to the docker-compose setup
Key: ARROW-4949
URL: https://issues.apache.org/jira/browse/ARROW-4949
Project: Apache Arrow
Uwe L. Korn created ARROW-4948:
--
Summary: [JS] Nightly test failing with "Cannot assign to read
only property"
Key: ARROW-4948
URL: https://issues.apache.org/jira/browse/ARROW-4948
Project: Apache Arrow
I would advise against mixing these environments. Either use only packages from
defaults or only from conda-forge: things like boost from defaults and boost from
conda-forge can be installed at the same time but then lead to segfaults.
Uwe
On Mon, Mar 18, 2019, at 2:10 PM, Antoine Pitrou wrote:
>
I'd prefer Make / MakeUnsafe as well.
Regards
Antoine.
On 11/03/2019 at 15:15, Francois Saint-Jacques wrote:
> I can settle with Micah's proposal of `MakeValidate`, though I'd prefer
> that Make be safe by default and responsible users use MakeUnsafe :). I'll
> settle with the pragmatic
Well, in the meantime I can just use the conda-forge packages.
(though there are regular issues when updating packages where conda
switches back and forth from Anaconda and conda-forge packages)
Regards
Antoine.
On 18/03/2019 at 13:59, Uwe L. Korn wrote:
> Hello Antoine,
>
> you're
David Li created ARROW-4947:
---
Summary: [Flight][C++/Python] Remove redundant schema parameter in
DoGet
Key: ARROW-4947
URL: https://issues.apache.org/jira/browse/ARROW-4947
Project: Apache Arrow
XXH3 (by the xxhash author) was recently presented, though it's still
experimental for now:
https://fastcompression.blogspot.com/2019/03/presenting-xxh3.html
It is claimed to be significantly faster than xxhash, on all message sizes.
Regards
Antoine.
On 06/03/2019 at 07:06, Micah Kornfield wrote:
Hello Antoine,
you're running into https://github.com/ContinuumIO/anaconda-issues/issues/10731
I would rather have Anaconda fix this, but we can also add alternative detection.
I've opened https://issues.apache.org/jira/browse/ARROW-4946; I can
then look into this the next
David Li created ARROW-4945:
---
Summary: [Flight] Enable Flight integration tests in Travis
Key: ARROW-4945
URL: https://issues.apache.org/jira/browse/ARROW-4945
Project: Apache Arrow
Issue Type:
I see, thanks.
Regards
Antoine.
On 18/03/2019 at 13:57, Wes McKinney wrote:
> See https://issues.apache.org/jira/browse/ARROW-4879
>
> CMake support for Flatbuffers was only very recently made available in
> the 1.10.1 conda-forge package (see [1]); it will have to be fixed in
>
Uwe L. Korn created ARROW-4946:
--
Summary: [C++] Support detection of flatbuffers without
FlatbuffersConfig.cmake
Key: ARROW-4946
URL: https://issues.apache.org/jira/browse/ARROW-4946
Project: Apache
See https://issues.apache.org/jira/browse/ARROW-4879
CMake support for Flatbuffers was only very recently made available in
the 1.10.1 conda-forge package (see [1]); it will have to be fixed in
Anaconda's repo. In the meantime if you want to use Anaconda packages
instead of conda-forge (not
hi Antoine,
I have addressed this issue in the documentation patch I just did
https://github.com/apache/arrow/blob/master/docs/source/developers/cpp.rst#individual-dependency-resolution
passing
-Ddouble-conversion_SOURCE=BUNDLED
should do the trick
- Wes
On Mon, Mar 18, 2019 at 7:51 AM
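Putting the two approaches from this thread side by side (the build directory is illustrative; treat this as a sketch, not verified against the current CMake setup):

```shell
# Previously: an empty *_HOME override forced a from-source build of
# double-conversion while other deps came from the Conda environment.
export DOUBLE_CONVERSION_HOME=

# Now: per-dependency source selection does the same thing explicitly.
cmake .. -Ddouble-conversion_SOURCE=BUNDLED
```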
Ok, so I have a problem. I had the following line:
export DOUBLE_CONVERSION_HOME=
which was used to force double-conversion to be built from source
despite other dependencies being taken from the Conda environment. Now
it doesn't work anymore, and I haven't found how to emulate it.
Hi Uwe,
I am working on integrating uriparser into Arrow and I am using the
CMake support indeed:
https://github.com/apache/arrow/compare/master...pitrou:ARROW-4697-cpp-uri-parsing
Regards
Antoine.
On Mon, 18 Mar 2019 05:58:26 -0400
"Uwe L. Korn" wrote:
> I have looked into packaging
Uwe L. Korn created ARROW-4944:
--
Summary: [C++] Raise minimal required thrift-cpp to 0.11 in conda
environment
Key: ARROW-4944
URL: https://issues.apache.org/jira/browse/ARROW-4944
Project: Apache Arrow
vanderliang created ARROW-4943:
--
Summary: pyarrow.lib.HadoopFileSystem._connect failed due to
TypeError
Key: ARROW-4943
URL: https://issues.apache.org/jira/browse/ARROW-4943
Project: Apache Arrow
I have looked into packaging uriparser for conda-forge to have it available in
our build system, and it feels to me that we should rather wait for their next
release, where they switch the build system over to CMake. This will make
packaging in conda and the ExternalProject much easier. Not sure
Kenta Murata created ARROW-4942:
---
Summary: [Ruby] Remove needless omits
Key: ARROW-4942
URL: https://issues.apache.org/jira/browse/ARROW-4942
Project: Apache Arrow
Issue Type: Test