Sure, added to dev/README
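
For anyone following along, the invocation that went into the README is
roughly what Wes suggests below (a sketch only; the exact README wording may
differ, and it assumes the patched Spark branch from ARROW-2914 is picked up
by the Dockerfile, per Bryan's note):

    # From the arrow repo root: build the images and run the Spark
    # integration tests via docker-compose
    dev/run_docker_compose.sh spark_integration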

On Fri, Jul 27, 2018 at 1:16 PM, Wes McKinney <wesmck...@gmail.com> wrote:

> OK, can you add instructions to the README, or we can also use the
> developer wiki as a lighter-weight option (not requiring PRs)? We need
> to be more consistently organized about developer documentation since
> the project is expanding rapidly
>
> On Fri, Jul 27, 2018 at 3:25 PM, Bryan Cutler <cutl...@gmail.com> wrote:
> > The commands to run are in the Dockerfile for now; however, you do
> > need to use the patched version of Spark from the branch I linked in
> > ARROW-2914, because there have been Java API changes.
> >
> > On Fri, Jul 27, 2018 at 12:18 PM, Wes McKinney <wesmck...@gmail.com> wrote:
> >
> >> I dealt with some of the remaining unassigned issues. There are
> >> several left where the PR author does not have a JIRA account or I
> >> wasn't able to find their ID to add them to the "contributor" role.
> >>
> >> @Li, I think you can run (Bryan, is this right?)
> >>
> >> dev/run_docker_compose.sh spark_integration
> >>
> >> This needs to be documented here:
> >>
> >> https://github.com/apache/arrow/tree/master/dev#integration-testing
> >>
> >> On Fri, Jul 27, 2018 at 1:12 PM, Bryan Cutler <cutl...@gmail.com> wrote:
> >> > @Li, I ran the Spark integration after ARROW-2914, but I'll run it
> >> > again later today since there might be some changes.
> >> >
> >> > On Fri, Jul 27, 2018, 7:11 AM Li Jin <ice.xell...@gmail.com> wrote:
> >> >
> >> >> @Bryan I noticed there are integration tests set up with Spark. We
> >> >> should probably run those before cutting the RC, but I don't know
> >> >> how they are run. If you can help run them or show me how to run
> >> >> them, that would be great!
> >> >>
> >>
>
