That works! Thank you.

On Tue, Jul 22, 2014 at 12:28 AM, Reynold Xin <r...@databricks.com> wrote:

> I missed that bullet point. I removed that and just pointed it towards the
> instruction.
>
>
> On Mon, Jul 21, 2014 at 9:20 PM, Nicholas Chammas <
> nicholas.cham...@gmail.com> wrote:
>
> > Looks good. Does sbt/sbt test cover the same tests as ./dev/run-tests?
> >
> > I’m looking at step 5 under “Contributing Code”. Someone contributing to
> > PySpark will want to be directed to run something in addition to (or
> > instead of) sbt/sbt test, I believe.
> >
> > Nick
> >
> >
> > On Mon, Jul 21, 2014 at 11:43 PM, Reynold Xin <r...@databricks.com>
> wrote:
> >
> > > I added an automated testing section:
> > >
> > > https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-AutomatedTesting
> > >
> > > Can you take a look to see if it is what you had in mind?
> > >
> > >
> > >
> > > On Mon, Jul 21, 2014 at 3:54 PM, Nicholas Chammas <
> > > nicholas.cham...@gmail.com> wrote:
> > >
> > > > For the record, the triggering discussion is here
> > > > <https://github.com/apache/spark/pull/1505#issuecomment-49671550>. I
> > > > assumed that sbt/sbt test covers all the tests required before
> > > > submitting a patch, and it appears that it doesn’t.
> > > >
> > > >
> > > > On Mon, Jul 21, 2014 at 6:42 PM, Nicholas Chammas <
> > > > nicholas.cham...@gmail.com> wrote:
> > > >
> > > > > Contributing to Spark
> > > > > <https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark>
> > > > > needs a line or two about building and testing PySpark. A call out of
> > > > > run-tests, for example, would be helpful for new contributors to
> > > > > PySpark.
> > > > >
> > > > > Nick
> > > > >
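For reference, the commands discussed in the thread above, as a rough sketch. Paths assume you are at the root of a Spark checkout; the script names reflect Spark at the time of this thread and may have moved in later versions:

```shell
# Scala/Java unit tests only -- does not exercise PySpark
sbt/sbt test

# Full pre-submission suite: style checks, Scala/Java tests, and PySpark tests
./dev/run-tests

# PySpark tests alone (the run-tests script mentioned in the thread)
python/run-tests
```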
