On Tue, Aug 26, 2014 at 2:02 AM, Patrick Wendell <pwend...@gmail.com> wrote:

> I'd prefer if we took the approach of politely explaining why in the
> current form the patch isn't acceptable and closing it (potentially w/ tips
> on how to improve it or narrow the scope).


Amen to this. Aiming for such a culture would set Spark apart from other
projects in a great way.

> I've proposed several different solutions to ASF infra to streamline the
> process, but thus far they haven't been open to any of my ideas:


I've added myself as a watcher on those 2 INFRA issues. Sucks that the only
solution on offer right now basically requires polluting the commit history.

Short of moving Spark's repo to a non-ASF-managed GitHub account, do you
think another bot could help us manage the number of stale PRs?

I'm thinking a solution along the following lines might be very helpful (a
rough sketch of the triage logic follows the list):

   - Extend Spark QA / Jenkins to run on a weekly schedule and check for
   stale PRs. Let's say a stale PR is an open one that hasn't been updated in
   N months.
   - Spark QA maintains a list of known committers on its side.
   - During its weekly check of stale PRs, Spark QA takes the following
   action:
      - If the last person to comment on a PR was a committer, post to the
      PR asking for an update from the contributor.
      - If the last person to comment on a PR was a contributor, add the PR
      to a list. Email this list of *hanging PRs* out to the dev list on a
      weekly basis and ask committers to update them.
      - If the last person to comment on a PR was Spark QA asking the
      contributor to update it, then add the PR to a list. Email this list
      of *abandoned PRs* to the dev list for the record (or for closing, if
      that becomes possible in the future).
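
To make that concrete, here is a minimal Python sketch of what the weekly
pass could look like against the public GitHub REST API. Everything in it
is an assumption for illustration: BOT_LOGIN, the empty COMMITTERS set, and
the 3-month window are placeholders rather than real Spark QA config, and
auth, pagination, and the actual commenting/emailing steps are omitted.

# Hypothetical weekly triage pass; not the real Spark QA implementation.
from datetime import datetime, timedelta

import requests

REPO = "apache/spark"
BOT_LOGIN = "SparkQA"  # hypothetical login the bot comments under
COMMITTERS = set()     # filled from the committer list Spark QA maintains
STALE_AFTER = timedelta(days=90)  # "N months"; N = 3 chosen arbitrarily

def last_commenter(pr_number):
    """Return the login of the most recent commenter on a PR, or None."""
    url = "https://api.github.com/repos/%s/issues/%d/comments" % (REPO, pr_number)
    comments = requests.get(url).json()  # GitHub returns oldest first
    return comments[-1]["user"]["login"] if comments else None

def weekly_triage():
    """Bucket stale open PRs by who spoke last: committer, contributor, or bot."""
    ping_contributor, hanging, abandoned = [], [], []
    prs = requests.get("https://api.github.com/repos/%s/pulls" % REPO,
                       params={"state": "open"}).json()
    now = datetime.utcnow()
    for pr in prs:
        updated = datetime.strptime(pr["updated_at"], "%Y-%m-%dT%H:%M:%SZ")
        if now - updated < STALE_AFTER:
            continue  # still active; skip
        author = last_commenter(pr["number"])
        if author == BOT_LOGIN:
            abandoned.append(pr["html_url"])         # we already asked, no reply
        elif author in COMMITTERS:
            ping_contributor.append(pr["html_url"])  # ask contributor for an update
        else:
            hanging.append(pr["html_url"])           # waiting on a committer
    return ping_contributor, hanging, abandoned

The three buckets map directly onto the three actions above: comment on the
PR, email the *hanging PRs* list to dev@, and email the *abandoned PRs*
list for the record.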

This doesn't solve the problem of not being able to close PRs, but it does
help make sure no PR is left hanging for long.

What do you think? I'd be interested in implementing this solution if we
like it.

Nick
