Hi All,

We are working with the Arrow 0.9.0 C++ libraries in conjunction with a
separate parquet-cpp 1.4.0 build.

Questions:

  1.  Does this zlib issue affect all clients of the Arrow C++ libraries, or
just the Python (PyArrow) bindings?
  2.  Does this zlib compression issue also affect the Arrow 0.9.0 C++
libraries (before parquet-cpp was merged in), or only the latest Arrow
0.11.0 C++ libraries (with parquet-cpp merged in)?

Best Regards,

Kevin Gurney

________________________________
From: Krisztián Szűcs <szucs.kriszt...@gmail.com>
Sent: Thursday, October 18, 2018 5:31:01 AM
To: dev@arrow.apache.org
Subject: Re: Making a bugfix 0.11.1 release

I've added the two zlib issues to the 0.11.1 version:
https://issues.apache.org/jira/projects/ARROW/versions/12344316

On Wed, Oct 17, 2018 at 10:51 PM Wes McKinney <wesmck...@gmail.com> wrote:

> Got it, thank you for clarifying. It wasn't clear whether the bug
> would occur in the build environment (CentOS 5 + devtoolset-2) as well
> as other Linux environments.
> On Wed, Oct 17, 2018 at 4:16 PM Antoine Pitrou <anto...@python.org> wrote:
> >
> >
> > Le 17/10/2018 à 20:38, Wes McKinney a écrit :
> > > hi folks,
> > >
> > > Since the Python wheels are being installed 10,000 times per day or
> > > more, I don't think we should allow them to be broken for much longer.
> > >
> > > What additional patches need to be done before an RC can be cut? Since
> > > I'm concerned about the broken packages undermining the project's
> > > reputation, I can adjust my priorities to start a release vote later
> > > today or first thing tomorrow morning. It seems like
> > > https://issues.apache.org/jira/browse/ARROW-3535 might be the last
> > > item, and I can prepare a maintenance branch with the cherry-picked
> > > fixes.
> > >
> > > Was there a determination as to why our CI systems did not catch the
> > > blocker ARROW-3514?
> >
> > Because it was not exercised by the test suite.  My take is that the bug
> > would only happen with specific data, e.g. tiny and/or entirely
> > incompressible.  I don't think general gzip compression of Parquet files
> > was broken.
> >
> > Regards
> >
> > Antoine.
>
