I wanted to update everyone on where things stand with cutting a release
candidate for 0.10.0.

The last remaining task is to be able to use the new packaging tool
(crossbow.py) to build binary artifacts from a source archive. This means
we'll have to move the release scripts into crossbow and upload the source
archive either to GitHub releases (so that people without write permissions
to the SVN repo can test building binaries against a source archive) or to
SVN (for the actual release candidate).
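
For context, crossbow today is driven from a git ref rather than from a
source archive; from memory the invocation looks roughly like the
following (the subcommand and -g group flags are an assumption on my
part -- dev/tasks/README is the source of truth):

  # submit packaging builds for the conda and wheel task groups
  python dev/tasks/crossbow.py submit -g conda -g wheel

The remaining work is to teach those same tasks to consume the uploaded
source archive instead of a git checkout.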

If we're concerned about the timeline, I will cut the release candidate
without building against a source archive.

I'm only available to work on this outside of work hours, so I'll be
cranking away later today.

FYI, it will probably take me a few tries to get the release scripts
working with SVN + binary artifacts, so there may be some churn (adding
and removing large-ish files) in the SVN history. My SVN skills are
woefully out of date.
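
In case anyone wants to follow along (or correct my rusty SVN), the
upload dance I have in mind is roughly the following, assuming the
usual dist.apache.org dev dist layout (the RC directory name below is
hypothetical):

  # check out just the Arrow dev dist area
  svn checkout https://dist.apache.org/repos/dist/dev/arrow arrow-dist
  cd arrow-dist
  # stage the source archive and binary artifacts
  svn add apache-arrow-0.10.0-rc0
  svn commit -m "Apache Arrow 0.10.0 RC0"
  # if a batch of artifacts turns out to be broken, back it out and retry
  svn delete apache-arrow-0.10.0-rc0
  svn commit -m "Remove broken Apache Arrow 0.10.0 RC0 artifacts"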

On Fri, Jul 27, 2018 at 6:43 PM Bryan Cutler <cutl...@gmail.com> wrote:

> Sure, added to dev/README
>
> On Fri, Jul 27, 2018 at 1:16 PM, Wes McKinney <wesmck...@gmail.com> wrote:
>
> > OK, can you add instructions to the README, or we can also use the
> > developer wiki as a lighter-weight option (not requiring PRs)? We need
> > to be more consistently organized about developer documentation since
> > the project is expanding rapidly.
> >
> > On Fri, Jul 27, 2018 at 3:25 PM, Bryan Cutler <cutl...@gmail.com> wrote:
> > > The commands to run are in the Dockerfile for right now; however, you
> > > do need to use the patched version of Spark from my branch I linked in
> > > ARROW-2914, because there have been Java API changes.
> > >
> > > On Fri, Jul 27, 2018 at 12:18 PM, Wes McKinney <wesmck...@gmail.com>
> > > wrote:
> > >
> > >> I dealt with some of the remaining unassigned issues. There are
> > >> several left where the PR author does not have a JIRA account or I
> > >> wasn't able to find their ID to add them to the "contributor" role.
> > >>
> > >> @Li, I think you can run (Bryan, is this right?)
> > >>
> > >> dev/run_docker_compose.sh spark_integration
> > >>
> > >> This needs to be documented here:
> > >>
> > >> https://github.com/apache/arrow/tree/master/dev#integration-testing
> > >>
> > >> On Fri, Jul 27, 2018 at 1:12 PM, Bryan Cutler <cutl...@gmail.com>
> > >> wrote:
> > >> > Li, I ran the Spark integration after ARROW-2914, but I'll run it
> > >> > again later today since there might be some changes.
> > >> >
> > >> > On Fri, Jul 27, 2018, 7:11 AM Li Jin <ice.xell...@gmail.com> wrote:
> > >> >
> > >> >> @Bryan I noticed there are integration tests set up with Spark.
> > >> >> We should probably run those before cutting the RC, but I don't
> > >> >> know how those are run. If you can help run them or show me how
> > >> >> to run them, it would be great!
> > >> >>
> > >>
> >
>
