Hi all,

When I develop PySpark modules, such as adding a spark.ml API in Python, I'd
like to run a minimal set of unit tests related to the module under
development, again and again.
In previous versions that was easy: I could comment out the unrelated modules
in the ./python/run-tests script. What is the best way to run a minimal set
of unit tests for the modules we are developing under the current version?
Of course, it would be nice to be able to select test targets from the
script, the way Scala's sbt does.
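
For what it's worth, plain unittest can already target a single module,
class, or method by name, which is one way to narrow the run while iterating.
A minimal self-contained sketch (the test case below is hypothetical, just
standing in for a real spark.ml test; this is a workaround, not the official
run-tests mechanism):

```python
import unittest

# Hypothetical stand-in for a real spark.ml test case.
class VectorTests(unittest.TestCase):
    def test_concat(self):
        self.assertEqual([1, 2] + [3], [1, 2, 3])

# Load and run only this one test case, instead of the full suite.
# unittest can select individual modules, classes, or methods the same way.
suite = unittest.TestLoader().loadTestsFromTestCase(VectorTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same selection works from the command line, e.g.
`python -m unittest some.test.module` to run just one module.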

Thanks,
Yu



-----
-- Yu Ishikawa
--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/pyspark-What-is-the-best-way-to-run-a-minimum-unit-testing-related-to-our-developing-module-tp12987.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
