Re: Unit Test for Spark Streaming

2014-08-08 Thread JiajiaJing
Hi TD, I tried some different setups in Maven these days, and now I can at least get some output when running mvn test. However, it seems that ScalaTest cannot find the test cases specified in the test suite. Here is the output I get:
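One common cause of this symptom (an assumption on my part, not confirmed in the thread) is that Maven's Surefire plugin only discovers JUnit-style tests, so ScalaTest suites are silently skipped unless the scalatest-maven-plugin is bound to the test phase. A typical pom.xml fragment looks like this (version number is illustrative):

```xml
<!-- Run ScalaTest suites during "mvn test"; Surefire alone will not find them -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>1.0</version>
  <configuration>
    <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
    <junitxml>.</junitxml>
    <filereports>TestSuite.txt</filereports>
  </configuration>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, a test section listing each suite should appear in the mvn test output.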

Re: Unit Test for Spark Streaming

2014-08-06 Thread JiajiaJing
Thank you TD, I have worked around that problem and now the test compiles. However, I don't actually see that test running: when I do mvn test, it just says BUILD SUCCESS, without any TEST section on stdout. Are we supposed to use mvn test to run the test? Are there any other methods that can be

Re: Unit Test for Spark Streaming

2014-08-06 Thread Tathagata Das
Does it not show the name of the test suite on stdout, showing that it has passed? Can you try writing a small unit test, in the same way as your Kafka unit test, and with print statements on stdout ... to see whether it works? I believe it is some configuration issue in Maven, which is hard
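TD's suggestion can be sketched as a trivial ScalaTest suite (a minimal sketch; the suite and message names are illustrative). If mvn test is wired up correctly, both the println output and the suite name should appear on stdout:

```scala
import org.scalatest.FunSuite

// A deliberately trivial suite: its only job is to prove that
// ScalaTest suites are actually being discovered and run by Maven.
class SanityCheckSuite extends FunSuite {
  test("scalatest is actually being run") {
    println(">>> SanityCheckSuite is running")
    assert(1 + 1 === 2)
  }
}
```

If this suite never shows up in the output, the problem is in the build configuration rather than in the real test code.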

Re: Unit Test for Spark Streaming

2014-08-05 Thread Tathagata Das
That function simply deletes a directory recursively; you can use alternative libraries. See this discussion: http://stackoverflow.com/questions/779519/delete-files-recursively-in-java On Tue, Aug 5, 2014 at 5:02 PM, JiajiaJing jj.jing0...@gmail.com wrote: Hi TD, I encountered a problem
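The approach TD describes can be sketched with only the JDK, so no extra library is needed (a minimal sketch; deleteRecursively is an illustrative name, not Spark's internal helper):

```scala
import java.io.File

// Recursively deletes a directory tree. Returns true if the final
// delete of the root succeeded.
def deleteRecursively(f: File): Boolean = {
  if (f.isDirectory) {
    // Option guards against listFiles returning null on an I/O error
    Option(f.listFiles).getOrElse(Array.empty[File]).foreach(deleteRecursively)
  }
  f.delete()
}
```

This is handy in streaming tests for cleaning up temporary checkpoint directories after each run.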

Re: Unit Test for Spark Streaming

2014-08-04 Thread Tathagata Das
Appropriately timed question! Here is the PR that adds a real unit test for Kafka stream in Spark Streaming. Maybe this will help! https://github.com/apache/spark/pull/1751/files On Mon, Aug 4, 2014 at 6:30 PM, JiajiaJing jj.jing0...@gmail.com wrote: Hello Spark Users, I have a spark
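Beyond the Kafka-specific PR, a lightweight way to unit test a DStream transformation is to feed batches in through queueStream and collect the output, with no external broker involved. This is a sketch under the assumption that spark-streaming is on the classpath; the names and timings are illustrative:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import scala.collection.mutable

// Local streaming context with a short batch interval for testing
val conf = new SparkConf().setMaster("local[2]").setAppName("StreamingSanity")
val ssc = new StreamingContext(conf, Seconds(1))

// Input batches are plain RDDs pushed into a queue
val input = mutable.Queue(ssc.sparkContext.makeRDD(Seq(1, 2, 3)))
val results = mutable.ArrayBuffer.empty[Int]

// The transformation under test: double every element
ssc.queueStream(input).map(_ * 2).foreachRDD { rdd =>
  results ++= rdd.collect()
}

ssc.start()
Thread.sleep(3000) // crude: let at least one batch run; a real test should poll
ssc.stop(stopSparkContext = true)
// results should now hold the doubled values from the queued batch
```

A real test would replace the sleep with a polling loop or a latch, but the shape matches what the PR's Kafka test does: inject known input, run a few batches, then assert on the collected output.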

Re: Unit Test for Spark Streaming

2014-08-04 Thread JiajiaJing
This helps a lot!! Thank you very much! Jiajia