Re: [pyspark] What is the best way to run a minimum unit testing related to our developing module?

2015-07-01 Thread Reynold Xin
Run

./python/run-tests --help

and you will see. :)

On Wed, Jul 1, 2015 at 9:10 PM, Yu Ishikawa yuu.ishikawa+sp...@gmail.com
wrote:

 Hi all,

 When I develop PySpark modules, such as adding a spark.ml API in Python,
 I'd like to run only the minimal set of unit tests related to the module
 I'm developing, again and again.
 In the previous version this was easy: I could comment out unrelated
 modules in the ./python/run-tests script. What is the best way to run
 only the unit tests for the module under development in the current
 version?
 It would also be nice to be able to select test targets with the script,
 as with Scala's sbt.

 Thanks,
 Yu



 -
 -- Yu Ishikawa
 --
 View this message in context:
 http://apache-spark-developers-list.1001551.n3.nabble.com/pyspark-What-is-the-best-way-to-run-a-minimum-unit-testing-related-to-our-developing-module-tp12987.html
 Sent from the Apache Spark Developers List mailing list archive at
 Nabble.com.

 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org




Re: [pyspark] What is the best way to run a minimum unit testing related to our developing module?

2015-07-01 Thread Yu Ishikawa
Thanks! --Yu







[pyspark] What is the best way to run a minimum unit testing related to our developing module?

2015-07-01 Thread Yu Ishikawa
Hi all,

When I develop PySpark modules, such as adding a spark.ml API in Python,
I'd like to run only the minimal set of unit tests related to the module
I'm developing, again and again.
In the previous version this was easy: I could comment out unrelated
modules in the ./python/run-tests script. What is the best way to run
only the unit tests for the module under development in the current
version?
It would also be nice to be able to select test targets with the script,
as with Scala's sbt.

Thanks,
Yu
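
[Editor's note] As Reynold's reply suggests, the selection flags offered by
./python/run-tests are best discovered via --help, since they vary across
Spark versions. Independently of that script, a single test module or
TestCase can always be exercised directly with Python's standard unittest
loader. Below is a minimal, generic sketch of that selection idea; the
TestCase name is hypothetical, standing in for a real module's tests:

```python
import unittest

# Hypothetical stand-in for a real test case (e.g. one of the
# spark.ml Python tests); the class and test names are illustrative only.
class VectorOpsTest(unittest.TestCase):
    def test_concat(self):
        self.assertEqual([1, 2] + [3], [1, 2, 3])

# Load and run just this one TestCase rather than a whole test suite --
# the same selection idea that run-tests applies at the module level.
suite = unittest.TestLoader().loadTestsFromTestCase(VectorOpsTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all passed:", result.wasSuccessful())
```

The same selection can be done from the command line with
`python -m unittest path.to.TestCase`, which keeps the edit-test loop
short without touching the run-tests script at all.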


