Re: [DataFusion] Introduce qualified alias expressions

2023-11-08 Thread Jeremy Dyer
From an Arrow DataFusion Python consumer standpoint this would make using
DataFusion so much easier. I can't speak to the ramifications for upstream or
other projects, but I would love to see this and help if needed.

Thanks,
Jeremy Dyer

From: Marko Grujic 
Sent: Wednesday, November 8, 2023 3:45:56 AM
To: dev@arrow.apache.org 
Subject: [DataFusion] Introduce qualified alias expressions

Hi all,

I would like to propose extending the `datafusion_expr::Expr` enum to support
aliases that can map to qualified schemas/fields. Currently this is not
possible, since the existing `Expr::Alias` always maps to an unqualified
field[1].

The detailed reasoning is laid out in the accompanying issue I've opened[2],
and has to do with a problem I encountered while working on a recent PR[3]. In
brief, when a plan gets transformed during optimization into some other plan(s)
involving aggregation, there is currently no way to preserve the original
(qualified) schema, which is asserted as a sanity check between optimizer
passes[4].

I thus propose either adding a new enum variant (e.g.
`Expr::QualifiedAlias` in parallel with `Expr::QualifiedWildcard`), or
extending the existing `Expr::Alias` with an optional relation/qualifier.
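
To make the two options concrete, below is a rough sketch of the shapes they
could take. The names and fields are illustrative only (e.g. `TableQualifier`
stands in for DataFusion's table reference type) and do not reproduce the
actual `datafusion_expr` definitions:

    // Hypothetical sketch of the two proposed options; not the real
    // datafusion_expr code.

    // Stand-in for DataFusion's table qualifier / table reference type.
    type TableQualifier = String;

    #[allow(dead_code)]
    enum Expr {
        // Option 1: keep Expr::Alias unqualified and add a qualified variant,
        // mirroring the existing Wildcard / QualifiedWildcard split.
        Alias(Box<Expr>, String),
        QualifiedAlias(Box<Expr>, TableQualifier, String),

        // Option 2 would instead extend the existing variant with an optional
        // qualifier, roughly:
        //     Alias { expr: Box<Expr>, relation: Option<TableQualifier>, name: String }

        // ... other variants elided ...
        Wildcard,
        QualifiedWildcard(TableQualifier),
    }

    fn main() {}

Either shape would let an optimizer rule re-alias an expression back to its
original qualified name, so the transformed plan's schema still matches the
one asserted by the optimizer's sanity check.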

I would love to know any thoughts on this, and whether this discussion is
perhaps better suited to the Discord channel or someplace else.

Thanks,
Marko

[1]
https://github.com/apache/arrow-datafusion/blob/656c6a93fadcec7bc43a8a881dfaf55388b0b5c6/datafusion/expr/src/expr_schema.rs#L285-L305
[2] https://github.com/apache/arrow-datafusion/issues/8008
[3] https://github.com/apache/arrow-datafusion/pull/7981
[4]
https://github.com/apache/arrow-datafusion/blob/724bafd4de98eff8d6ffd67942d29d0f9faf2aa3/datafusion/optimizer/src/optimizer.rs#L427-L452


Re: [VOTE] Release Apache Arrow ADBC 0.8.0 - RC0

2023-11-08 Thread David Li
With 3 binding and 2 non-binding +1 votes and no -1 votes, the vote passes. 
I'll begin the post-release tasks.

On Tue, Nov 7, 2023, at 14:01, Dane Pitkin wrote:
> +1 (non-binding)
>
> Verified on M1 MacOS 13 with:
>
> USE_CONDA=1 TEST_YUM=0 TEST_APT=0 ./dev/release/verify-release-candidate.sh
> 0.8.0 0
>
> On Tue, Nov 7, 2023 at 9:10 AM Dewey Dunnington
>  wrote:
>
>> +1!
>>
>> I ran: TEST_APT=0 TEST_YUM=0 USE_CONDA=1
>> dev/release/verify-release-candidate.sh 0.8.0 0
>>
>> On Fri, Nov 3, 2023 at 12:18 PM David Li  wrote:
>> >
>> > Hello,
>> >
>> > I would like to propose the following release candidate (RC0) of Apache
>> Arrow ADBC version 0.8.0. This is a release consisting of 42 resolved
>> GitHub issues [1].
>> >
>> > This release candidate is based on commit:
>> 95f13231f49494bcf78df45de1f65aa25620981b [2]
>> >
>> > The source release rc0 is hosted at [3].
>> > The binary artifacts are hosted at [4][5][6][7][8].
>> > The changelog is located at [9].
>> >
>> > Please download, verify checksums and signatures, run the unit tests,
>> and vote on the release. See [10] for how to validate a release candidate.
>> >
>> > See also a verification result on GitHub Actions [11].
>> >
>> > The vote will be open for at least 72 hours.
>> >
>> > [ ] +1 Release this as Apache Arrow ADBC 0.8.0
>> > [ ] +0
>> > [ ] -1 Do not release this as Apache Arrow ADBC 0.8.0 because...
>> >
>> > Note: to verify APT/YUM packages on macOS/AArch64, you must `export
>> DOCKER_DEFAULT_PLATFORM=linux/amd64`. (Or skip this step by `export
>> TEST_APT=0 TEST_YUM=0`.)
>> >
>> > Note: it is not currently possible to verify with Conda and Python 3.12
>> (some test dependencies do not yet have a Python 3.12 build available). The
>> verification script defaults to Python 3.11. Binary artifacts are available
>> for 3.12.
>> >
>> > [1]:
>> https://github.com/apache/arrow-adbc/issues?q=is%3Aissue+milestone%3A%22ADBC+Libraries+0.8.0%22+is%3Aclosed
>> > [2]:
>> https://github.com/apache/arrow-adbc/commit/95f13231f49494bcf78df45de1f65aa25620981b
>> > [3]:
>> https://dist.apache.org/repos/dist/dev/arrow/apache-arrow-adbc-0.8.0-rc0/
>> > [4]: https://apache.jfrog.io/artifactory/arrow/almalinux-rc/
>> > [5]: https://apache.jfrog.io/artifactory/arrow/debian-rc/
>> > [6]: https://apache.jfrog.io/artifactory/arrow/ubuntu-rc/
>> > [7]:
>> https://repository.apache.org/content/repositories/staging/org/apache/arrow/adbc/
>> > [8]:
>> https://github.com/apache/arrow-adbc/releases/tag/apache-arrow-adbc-0.8.0-rc0
>> > [9]:
>> https://github.com/apache/arrow-adbc/blob/apache-arrow-adbc-0.8.0-rc0/CHANGELOG.md
>> > [10]:
>> https://arrow.apache.org/adbc/main/development/releasing.html#how-to-verify-release-candidates
>> > [11]: https://github.com/apache/arrow-adbc/actions/runs/6746653191
>>


Re: [VOTE][RUST][DataFusion] Release Apache Arrow DataFusion 33.0.0 RC1

2023-11-08 Thread Andrew Lamb
As a follow-up here, I am working on an Arrow 48.0.1 patch release [1] to
fix some outstanding issues and unblock the DataFusion 33.0.0 release.
Please comment on the ticket if you have any questions or other things you
would like to include in the release. Otherwise I plan to make an RC
tomorrow.

[1] https://github.com/apache/arrow-rs/issues/5050#issuecomment-1802574764

On Mon, Nov 6, 2023 at 2:02 PM Andy Grove  wrote:

> I filed https://github.com/apache/arrow-datafusion/issues/8069
>
> On Mon, Nov 6, 2023 at 11:59 AM Andy Grove  wrote:
>
> > I see the same error when I run on my M1 Macbook Air with 16 GB RAM.
> >
> >  aggregates::tests::run_first_last_multi_partitions stdout 
> > Error: ResourcesExhausted("Failed to allocate additional 632 bytes for
> > GroupedHashAggregateStream[0] with 1829 bytes already allocated - maximum
> > available is 605")
> >
> > It worked fine on my workstation with 128 GB RAM.
> >
> >
> >
> > On Mon, Nov 6, 2023 at 11:23 AM L. C. Hsieh  wrote:
> >
> >> Hmm, ran verification script and got one failure:
> >>
> >> failures:
> >>
> >>  aggregates::tests::run_first_last_multi_partitions stdout 
> >> Error: ResourcesExhausted("Failed to allocate additional 632 bytes for
> >> GroupedHashAggregateStream[0] with 1829 bytes already allocated -
> >> maximum available is 605")
> >>
> >> failures:
> >> aggregates::tests::run_first_last_multi_partitions
> >>
> >> test result: FAILED. 557 passed; 1 failed; 1 ignored; 0 measured; 0
> >> filtered out; finished in 2.21s
> >>
> >>
> >>
> >> On Mon, Nov 6, 2023 at 6:57 AM Andy Grove 
> wrote:
> >> >
> >> > Hi,
> >> >
> >> > I would like to propose a release of Apache Arrow DataFusion
> >> Implementation,
> >> > version 33.0.0.
> >> >
> >> > This release candidate is based on commit:
> >> > 262f08778b8ec231d96792c01fc3e051640eb5d4 [1]
> >> > The proposed release tarball and signatures are hosted at [2].
> >> > The changelog is located at [3].
> >> >
> >> > Please download, verify checksums and signatures, run the unit tests,
> >> and
> >> > vote
> >> > on the release. The vote will be open for at least 72 hours.
> >> >
> >> > Only votes from PMC members are binding, but all members of the
> >> community
> >> > are
> >> > encouraged to test the release and vote with "(non-binding)".
> >> >
> >> > The standard verification procedure is documented at
> >> >
> >>
> https://github.com/apache/arrow-datafusion/blob/main/dev/release/README.md#verifying-release-candidates
> >> > .
> >> >
> >> > [ ] +1 Release this as Apache Arrow DataFusion 33.0.0
> >> > [ ] +0
> >> > [ ] -1 Do not release this as Apache Arrow DataFusion 33.0.0
> because...
> >> >
> >> > Here is my vote:
> >> >
> >> > +1
> >> >
> >> > [1]:
> >> >
> >>
> https://github.com/apache/arrow-datafusion/tree/262f08778b8ec231d96792c01fc3e051640eb5d4
> >> > [2]:
> >> >
> >>
> https://dist.apache.org/repos/dist/dev/arrow/apache-arrow-datafusion-33.0.0-rc1
> >> > [3]:
> >> >
> >>
> https://github.com/apache/arrow-datafusion/blob/262f08778b8ec231d96792c01fc3e051640eb5d4/CHANGELOG.md
> >>
> >
>


[VOTE][FORMAT] Bulk ingestion support for Flight SQL

2023-11-08 Thread David Li
Hello,

Joel Lubi has proposed adding bulk ingestion support to Arrow Flight SQL [1]. 
This provides a path for uploading an Arrow dataset to a Flight SQL server to 
create or append to a table, without having to know the specifics of the SQL or 
Substrait support on the server. It mirrors the bulk ingestion functionality 
already offered by ADBC.
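
As a purely hypothetical illustration of what this enables on the client side
(this is not the proposed Flight SQL message shape, nor an existing
arrow-flight API; the trait and names below are invented for the sketch):

    // Hypothetical client-side sketch only; the proposal defines Flight SQL
    // protocol messages, not this API.
    use arrow::record_batch::RecordBatch;

    /// How the target table should be handled.
    #[allow(dead_code)]
    enum IngestMode {
        /// Create the table from the uploaded data's schema.
        Create,
        /// Append to an existing table.
        Append,
    }

    /// Invented trait showing the shape of a bulk-ingest call: upload Arrow
    /// data into a named table without writing SQL or Substrait by hand.
    #[allow(dead_code)]
    trait BulkIngest {
        type Error;

        fn ingest(
            &mut self,
            target_table: &str,
            mode: IngestMode,
            batches: Vec<RecordBatch>,
        ) -> Result<u64, Self::Error>; // rows ingested
    }

    fn main() {}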

Joel has provided reference implementations of this for C++ and Go at [2], 
along with an integration test.

The vote will be open for at least 72 hours.

[ ] +1 Accept this proposal
[ ] +0
[ ] -1 Do not accept this proposal because...

[1]: https://lists.apache.org/thread/mo98rsh20047xljrbfymrks8f2ngn49z
[2]: https://github.com/apache/arrow/pull/38256

Thanks,
David


Re: [DISCUSS][MATLAB] Proposal for incremental point releases of the MATLAB interface

2023-11-08 Thread Kevin Gurney
Hi Kou and Dewey,

Thank you very much for your thorough and detailed responses to all of our 
questions. This is extremely valuable feedback, and the points you made make a 
lot of sense.

Sarah and I talked this over a bit more, and we think that sticking with the 
overall apache/arrow project release cycle (i.e. staying in line with 15.0.0) 
makes the most sense in the long term.

@Dewey - thanks very much for highlighting the pros and cons of creating a 
separate repository. We also really appreciate the community being willing to 
try to support our development needs. That being said, we think it is probably 
best to stay aligned with the main apache/arrow release process for the time 
being rather than creating a separate repository for the MATLAB interface.

To address some related points and questions:

> Can we just mention "This is not stable yet!!!" in the documentation instead 
> of using isolated version?

Yes. This is a good point, and we already have a disclaimer in the README.md [1] 
for the MATLAB interface which says: "Warning: The MATLAB interface is under 
active development and should be considered experimental."

> It's better that we use CI for this like other binary packages such as 
> .deb/.rpm/.wheel/.jar/...

This makes sense and we agree. We will follow up with PRs to add the necessary 
MATLAB packaging scripts and CI workflow files.

> Does the MLTBX file include Apache Arrow C++ binaries too like .wheel/.jar?

Yes. The MLTBX file will package the Apache Arrow C++ binaries, similar to the 
Java JARs / Python wheels.

> MATLAB doesn't provide the official package repository such as PyPI for 
> Python and https://rubygems.org/ for Ruby, right?

The equivalent to pypi.org or rubygems.org for MATLAB would be the MathWorks 
File Exchange [2].

> If the official package repository for MATLAB doesn't exist, JFrog is better 
> because the MLTBX file will be large (Apache Arrow C++ binaries are large).

As noted above, the "official package repository" for MATLAB would be the 
MathWorks File Exchange. File Exchange has tight integration with GitHub [3]. 
When a new release is available in GitHub Releases, the associated File 
Exchange entry will be automatically updated.

We believe we could leverage this integration between File Exchange and GitHub 
Releases to automate the MATLAB interface release process. This approach might 
look like:

1. Upload MLTBX to JFrog Artifactory
2. Run a post-release script that would:
   2.1 Download the MLTBX from JFrog Artifactory
   2.2 Upload it to GitHub Releases (e.g. apache/arrow-matlab - see discussion below)
   2.3 The linked File Exchange entry is then updated automatically

One open question about this approach: which GitHub repository should we use 
for hosting the MLTBX via GitHub Releases?

We don't think using the main apache/arrow GitHub Releases area is the right 
approach. So, would it make sense to create a separate "bridge" repository just 
for hosting the latest MLTBX files? Should this be an ASF associated repository 
like apache/arrow-matlab or would a MathWorks associated repository like 
mathworks/arrow-matlab be OK? We aren't sure what makes the most sense here, 
but welcome any suggestions.

> We may want to use the status page for it: 
> https://arrow.apache.org/docs/status.html

Thanks for highlighting this. This makes sense, and we can follow up with a PR 
to add MATLAB to the status page.

> How about creating https://arrow.apache.org/docs/matlab/ ? We can use Sphinx 
> like the Python docs https://arrow.apache.org/docs/python/ or another 
> documentation tools like the R docs https://arrow.apache.org/docs/r/ . If we 
> use Sphinx, we can create 
> https://github.com/apache/arrow/tree/main/docs/source/matlab/

This makes sense, and we eventually want comprehensive, Sphinx-based 
documentation in line with the other language bindings. In addition, we were 
hoping to host release notes in a place that is easily accessible from the 
MLTBX download location. File Exchange entries have a "Version History" which 
includes release notes from the "backing" GitHub Releases area, so that would 
probably be a sensible location for the release notes. Including MATLAB updates 
in the Apache Arrow release blog posts (e.g. 
https://arrow.apache.org/blog/2023/11/01/14.0.0-release/) may also be helpful.

--

We really appreciate all of the community's guidance on navigating the release 
process!

We will get started on integrating with the existing release tooling.

[1] https://github.com/apache/arrow/tree/main/matlab#status
[2] https://www.mathworks.com/matlabcentral/fileexchange
[3] https://www.mathworks.com/matlabcentral/content/fx/about.html#Why_GitHub

Best Regards,

Kevin Gurney

From: Dewey Dunnington 
Sent: Tuesday, November 7, 2023 8:53 PM
To: dev@arrow.apache.org 
Cc: Sarah Gilmore ; Lei Hou 
Subject: Re: [DISCUSS][MATLAB] Proposal for incremental point releases of the MATLAB interface

[DISCUSS] [DataFusion] Unify Function Interface (remove BuiltInScalarFunction)

2023-11-08 Thread Andrew Lamb
I would like to bring wider visibility to a proposal[1] to unify the
function interface in DataFusion.

The TL;DR is to remove `BuiltInScalarFunction` and ensure all functions can
be implemented as `ScalarUDF`.

There is a small API change to ScalarUDF proposed as well[2].
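
For context, here is a minimal sketch of what defining a function as a
`ScalarUDF` looks like today, assuming the existing `create_udf` helper and the
`ColumnarValue`-based function signature; exact import paths and signatures
vary across DataFusion versions (and may change further with this proposal), so
treat it as illustrative rather than exact:

    // Minimal ScalarUDF sketch: a scalar function that adds one to an Int64
    // column. Import paths are assumed from recent DataFusion versions.
    use std::sync::Arc;

    use datafusion::arrow::array::{ArrayRef, Int64Array};
    use datafusion::arrow::datatypes::DataType;
    use datafusion::error::{DataFusionError, Result};
    use datafusion::logical_expr::{create_udf, ColumnarValue, Volatility};
    use datafusion::prelude::SessionContext;

    fn main() -> Result<()> {
        // The function body: add one to every value of an Int64 column.
        let add_one = Arc::new(|args: &[ColumnarValue]| -> Result<ColumnarValue> {
            // For brevity this sketch only handles the array case (no scalar inputs).
            let ColumnarValue::Array(array) = &args[0] else {
                return Err(DataFusionError::Execution(
                    "add_one: expected an array argument".to_string(),
                ));
            };
            let input = array
                .as_any()
                .downcast_ref::<Int64Array>()
                .ok_or_else(|| DataFusionError::Execution("add_one: expected Int64".into()))?;
            let output: Int64Array = input.iter().map(|v| v.map(|x| x + 1)).collect();
            Ok(ColumnarValue::Array(Arc::new(output) as ArrayRef))
        });

        // Wrap it as a ScalarUDF and register it so `add_one(...)` can be used
        // in SQL and DataFrame expressions like any built-in function.
        let udf = create_udf(
            "add_one",
            vec![DataType::Int64],
            Arc::new(DataType::Int64),
            Volatility::Immutable,
            add_one,
        );
        let ctx = SessionContext::new();
        ctx.register_udf(udf);
        Ok(())
    }

Under the proposal, the functions currently enumerated in
`BuiltInScalarFunction` would be expressed through this same `ScalarUDF`
interface rather than through a separate built-in enum.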

If you are interested, please leave comments on the ticket and/or PR.

Thank you,
Andrew

[1] https://github.com/apache/arrow-datafusion/issues/8045
[2] https://github.com/apache/arrow-datafusion/pull/8079


CVE-2023-47248: PyArrow, PyArrow: Arbitrary code execution when loading a malicious data file

2023-11-08 Thread Antoine Pitrou
Severity: critical

Affected versions:

- PyArrow 0.14.0 through 14.0.0

Description:

Deserialization of untrusted data in IPC and Parquet readers in PyArrow 
versions 0.14.0 to 14.0.0 allows arbitrary code execution. An application is 
vulnerable if it reads Arrow IPC, Feather or Parquet data from untrusted 
sources (for example user-supplied input files).

This vulnerability only affects PyArrow, not other Apache Arrow implementations 
or bindings.

It is recommended that users of PyArrow upgrade to 14.0.1. Similarly, it is 
recommended that downstream libraries upgrade their dependency requirements to 
PyArrow 14.0.1 or later. PyPI packages are already available, and we hope that 
conda-forge packages will be available soon.

If it is not possible to upgrade, we provide a separate package 
`pyarrow-hotfix` that disables the vulnerability on older PyArrow versions. See 
https://pypi.org/project/pyarrow-hotfix/ for instructions.

References:

https://arrow.apache.org/
https://www.cve.org/CVERecord?id=CVE-2023-47248



Re: [RESULT][VOTE] Release Apache Arrow 14.0.0 - RC2

2023-11-08 Thread Raúl Cumplido
Thanks Kou!

All the post-release tasks I usually do have been finished for 14.0.0:
[Done] Update the released milestone Date and set to "Closed" on GitHub
[Done] Merge changes on release branch to maintenance branch for patch releases
[Done] Add the new release to the Apache Reporter System
[Done] Upload source
[Done] Upload binaries
[Done] Update website --> https://github.com/apache/arrow-site/pull/424
[Done] Upload C# packages
[Done] Upload wheels/sdist to pypi
[Done] Publish Maven artifacts
[Done] Bump versions
[Done] Update tags for Go modules
[Done] Update Homebrew packages --> Community updated
https://github.com/Homebrew/homebrew-core/pull/153020
[Done] Upload JavaScript packages
[Done] Publish release blog posts
[Done] Announce the new release
[Done] Announce the release on Twitter (I've tweeted from my personal
account as I don't have access to the Arrow one)
[Done] Update docs --> https://github.com/apache/arrow-site/pull/425
[Done] Update MSYS2 package -->
https://github.com/msys2/MINGW-packages/pull/19028
[Done] Update vcpkg port --> https://github.com/microsoft/vcpkg/pull/34944
[Done] Update conda recipes -->
https://github.com/conda-forge/arrow-cpp-feedstock/pull/1201
[Done] Update version in Apache Arrow Cookbook
[Done] Update Conan recipe -->
https://github.com/conan-io/conan-center-index/pull/20980
[Done] Upload RubyGems
[Done] Remove old artifacts

The following ones are still missing but are usually done by the R and
Parquet maintainers:
[TODO] Update R packages
[TODO] Make the CPP PARQUET related version as "RELEASED" on JIRA
[TODO] Start the new version on JIRA for the related CPP PARQUET version

Thanks everyone!
Raúl

On Wed, Nov 8, 2023 at 10:28, Sutou Kouhei () wrote:
>
> Hi,
>
> > [TODO] Upload RubyGems
>
> I've done this because I need this for Apache Arrow Flight
> SQL adapter for PostgreSQL.
>
> Thanks,
> --
> kou
>
> In 
>   "Re: [RESULT][VOTE] Release Apache Arrow 14.0.0 - RC2" on Mon, 6 Nov 2023 
> 19:01:15 +0100,
>   Raúl Cumplido  wrote:
>
> > Hi,
> >
> > I am currently working on the post-release tasks. This is the current 
> > status:
> >
> > [Done] Update the released milestone Date and set to "Closed" on GitHub
> > [Done] Merge changes on release branch to maintenance branch for patch 
> > releases
> > [Done] Add the new release to the Apache Reporter System
> > [Done] Upload source
> > [Done] Upload binaries
> > [Done] Update website --> https://github.com/apache/arrow-site/pull/424
> > [Done] Upload C# packages
> > [Done] Upload wheels/sdist to pypi
> > [Done] Publish Maven artifacts
> > [Done] Bump versions
> > [Done] Update tags for Go modules
> > [Done] Update Homebrew packages --> Community updated
> > https://github.com/Homebrew/homebrew-core/pull/153020
> > [Done] Upload JavaScript packages
> >
> > [In progress] Update MSYS2 package -->
> > https://github.com/msys2/MINGW-packages/pull/19028
> > [In progress] Update docs --> https://github.com/apache/arrow-site/pull/425
> > [In progress] Update vcpkg port -->
> > https://github.com/microsoft/vcpkg/pull/34944
> >
> > [TODO] Upload RubyGems
> > [TODO] Update Conan recipe
> > [TODO] Update version in Apache Arrow Cookbook
> > [TODO] Announce the new release
> > [TODO] Publish release blog posts
> > [TODO] Announce the release on Twitter
> > [TODO] Remove old artifacts
> >
> > As usual I am going to need help with the following tasks:
> > [TODO] Update R packages
> > [TODO] Make the CPP PARQUET related version as "RELEASED" on JIRA
> > [TODO] Start the new version on JIRA for the related CPP PARQUET version
> > [In progress] Update conda recipes -->
> > https://github.com/conda-forge/arrow-cpp-feedstock/pull/1201
> >
> > On Tue, Oct 31, 2023 at 15:19, Raúl Cumplido () wrote:
> >>
> >> Hi,
> >>
> >> Thanks everyone.
> >>
> >> The result of the vote was successful with 3 +1 binding votes, 3 +1
> >> non-binding votes and no -1 votes.
> >> I will start the post release tasks for 14.0.0 [1].
> >>
> >> Thanks,
> >> Raúl
> >>
> >> [1] 
> >> https://arrow.apache.org/docs/dev/developers/release.html#post-release-tasks
> >>
> >> On Mon, Oct 30, 2023 at 18:20, Jean-Baptiste Onofré () wrote:
> >> >
> >> > Hi,
> >> >
> >> > I just verified the Java part on Mac, not Go.
> >> >
> >> > I can do a try.
> >> >
> >> > Regards
> >> > JB
> >> >
> >> > On Fri, Oct 27, 2023 at 5:07 PM Dane Pitkin
> >> >  wrote:
> >> > >
> >> > > Has anyone fully verified on MacOS yet? Last time I tried, I was still
> >> > > getting stuck on the original Go issue after syncing Kou's latest 
> >> > > changes
> >> > > (I might need to try a fresh install of arrow).
> >> > >
> >> > > On Wed, Oct 25, 2023 at 9:34 AM David Li  wrote:
> >> > >
> >> > > > +1 (binding)
> >> > > >
> >> > > > Tested on Debian 12 'bookworm' / x86_64
> >> > > >
> >> > > > I had some trouble with the wheels but creating a fresh ARROW_TMPDIR
> >> > > > resolved it. I filed [1] as well since testing binaries with the 
> >> > > > apparent
> >> > > > rate-limit is quite frustrating.

Re: [RESULT][VOTE] Release Apache Arrow 14.0.0 - RC2

2023-11-08 Thread Sutou Kouhei
Hi,

> [TODO] Upload RubyGems

I've done this because I need this for Apache Arrow Flight
SQL adapter for PostgreSQL.

Thanks,
-- 
kou

In 
  "Re: [RESULT][VOTE] Release Apache Arrow 14.0.0 - RC2" on Mon, 6 Nov 2023 
19:01:15 +0100,
  Raúl Cumplido  wrote:

> Hi,
> 
> I am currently working on the post-release tasks. This is the current status:
> 
> [Done] Update the released milestone Date and set to "Closed" on GitHub
> [Done] Merge changes on release branch to maintenance branch for patch 
> releases
> [Done] Add the new release to the Apache Reporter System
> [Done] Upload source
> [Done] Upload binaries
> [Done] Update website --> https://github.com/apache/arrow-site/pull/424
> [Done] Upload C# packages
> [Done] Upload wheels/sdist to pypi
> [Done] Publish Maven artifacts
> [Done] Bump versions
> [Done] Update tags for Go modules
> [Done] Update Homebrew packages --> Community updated
> https://github.com/Homebrew/homebrew-core/pull/153020
> [Done] Upload JavaScript packages
> 
> [In progress] Update MSYS2 package -->
> https://github.com/msys2/MINGW-packages/pull/19028
> [In progress] Update docs --> https://github.com/apache/arrow-site/pull/425
> [In progress] Update vcpkg port -->
> https://github.com/microsoft/vcpkg/pull/34944
> 
> [TODO] Upload RubyGems
> [TODO] Update Conan recipe
> [TODO] Update version in Apache Arrow Cookbook
> [TODO] Announce the new release
> [TODO] Publish release blog posts
> [TODO] Announce the release on Twitter
> [TODO] Remove old artifacts
> 
> As usual I am going to need help with the following tasks:
> [TODO] Update R packages
> [TODO] Make the CPP PARQUET related version as "RELEASED" on JIRA
> [TODO] Start the new version on JIRA for the related CPP PARQUET version
> [In progress] Update conda recipes -->
> https://github.com/conda-forge/arrow-cpp-feedstock/pull/1201
> 
> On Tue, Oct 31, 2023 at 15:19, Raúl Cumplido () wrote:
>>
>> Hi,
>>
>> Thanks everyone.
>>
>> The result of the vote was successful with 3 +1 binding votes, 3 +1
>> non-binding votes and no -1 votes.
>> I will start the post release tasks for 14.0.0 [1].
>>
>> Thanks,
>> Raúl
>>
>> [1] 
>> https://arrow.apache.org/docs/dev/developers/release.html#post-release-tasks
>>
>> On Mon, Oct 30, 2023 at 18:20, Jean-Baptiste Onofré () wrote:
>> >
>> > Hi,
>> >
>> > I just verified the Java part on Mac, not Go.
>> >
>> > I can do a try.
>> >
>> > Regards
>> > JB
>> >
>> > On Fri, Oct 27, 2023 at 5:07 PM Dane Pitkin
>> >  wrote:
>> > >
>> > > Has anyone fully verified on MacOS yet? Last time I tried, I was still
>> > > getting stuck on the original Go issue after syncing Kou's latest changes
>> > > (I might need to try a fresh install of arrow).
>> > >
>> > > On Wed, Oct 25, 2023 at 9:34 AM David Li  wrote:
>> > >
>> > > > +1 (binding)
>> > > >
>> > > > Tested on Debian 12 'bookworm' / x86_64
>> > > >
>> > > > I had some trouble with the wheels but creating a fresh ARROW_TMPDIR
>> > > > resolved it. I filed [1] as well since testing binaries with the 
>> > > > apparent
>> > > > rate-limit is quite frustrating
>> > > >
>> > > > [1]: https://github.com/apache/arrow/issues/38442
>> > > >
>> > > > On Wed, Oct 25, 2023, at 04:58, Raúl Cumplido wrote:
>> > > > > +1 (non-binding)
>> > > > >
>> > > > > I've successfully run:
>> > > > > TEST_DEFAULT=0 TEST_SOURCE=1 dev/release/verify-release-candidate.sh
>> > > > 14.0.0 2
>> > > > > TEST_DEFAULT=0 TEST_APT=1 dev/release/verify-release-candidate.sh 
>> > > > > 14.0.0
>> > > > 2
>> > > > > TEST_DEFAULT=0 TEST_BINARY=1 dev/release/verify-release-candidate.sh
>> > > > 14.0.0 2
>> > > > > TEST_DEFAULT=0 TEST_JARS=1 dev/release/verify-release-candidate.sh
>> > > > 14.0.0 2
>> > > > > TEST_DEFAULT=0 TEST_WHEELS=1 dev/release/verify-release-candidate.sh
>> > > > 14.0.0 2
>> > > > > TEST_DEFAULT=0 TEST_YUM=1 dev/release/verify-release-candidate.sh 
>> > > > > 14.0.0
>> > > > 2
>> > > > >
>> > > > > I have been able to successfully run the amazon linux 2023 yum 
>> > > > > package
>> > > > > by removing all my old docker images locally and pulling them again.
>> > > > >
>> > > > > With:
>> > > > >   * Python 3.10.12
>> > > > >   * gcc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
>> > > > >   * NVIDIA CUDA Build cuda_11.5.r11.5/compiler.30672275_0
>> > > > >   * openjdk 17.0.8.1 2023-08-24
>> > > > >   * ruby 3.0.2p107 (2021-07-07 revision 0db68f0233) 
>> > > > > [x86_64-linux-gnu]
>> > > > >   * dotnet 7.0.112
>> > > > >   * Ubuntu 22.04 LTS
>> > > > >
>> > > > > On Wed, Oct 25, 2023 at 9:24, Sutou Kouhei () wrote:
>> > > > >>
>> > > > >> Hi,
>> > > > >>
>> > > > >> https://github.com/apache/arrow/pull/38450 should fix it.
> >> > > > >> This is not a blocker because this is a verification script
> >> > > > >> problem, not a source archive problem.
>> > > > >>
>> > > > >>
>> > > > >> Thanks,
>> > > > >> --
>> > > > >> kou
>> > > > >>
>> > > > >> In 
>> > > > >> 
>> > > > >>   "Re: [VOTE] Release Apache Arrow 14.0.0 - RC2" on Tue, 24 Oct 2023
>> > > > 20:13:57