Re: Unit Test for Spark Streaming

2014-08-08 Thread JiajiaJing
Hi TD,

I tried some different Maven setups these days, and now I at least get some
output when running "mvn test". However, it seems that ScalaTest cannot find
the test cases specified in the test suite.
Here is the output I get: 

 

Could you please give me some details on how you set up ScalaTest on your
machine? I believe there must be some other setup issue on my machine, but I
cannot figure out what it is.
And here are the plugins and dependencies related to ScalaTest in my pom.xml:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.7</version>
  <configuration>
    <skipTests>true</skipTests>
  </configuration>
</plugin>

<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>1.0</version>
  <configuration>
    <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
    <junitxml>.</junitxml>
    <filereports>${project.build.directory}/SparkTestSuite.txt</filereports>
    <tagsToInclude>ATag</tagsToInclude>
    <systemProperties>
      <java.awt.headless>true</java.awt.headless>
    </systemProperties>
    <environmentVariables>
      <SPARK_HOME>${session.executionRootDirectory}</SPARK_HOME>
      <SPARK_TESTING>1</SPARK_TESTING>
    </environmentVariables>
  </configuration>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>

<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.8.1</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest_2.10</artifactId>
  <version>2.2.1</version>
  <scope>test</scope>
</dependency>

Thank you very much!

Best Regards,

Jiajia






Re: Unit Test for Spark Streaming

2014-08-06 Thread Tathagata Das
Does it not show the name of the test suite on stdout, showing that it has
passed? Can you try writing a small "test" unit test, in the same way as
your Kafka unit test, and with print statements on stdout, to see
whether it works? I believe it is some configuration issue in Maven, which
is hard for me to guess.
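
For what it's worth, a throwaway suite along these lines (just a sketch,
assuming ScalaTest's FunSuite; the suite name and message are made up) should
show up in the plugin's output and print to stdout if the wiring is right:

    import org.scalatest.FunSuite

    class SanityCheckSuite extends FunSuite {
      test("scalatest-maven-plugin picks up this suite") {
        // This println should appear on stdout during `mvn test`.
        println("SanityCheckSuite is running")
        assert(1 + 1 === 2)
      }
    }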

TD


On Wed, Aug 6, 2014 at 12:53 PM, JiajiaJing  wrote:

> Thank you TD,
>
> I have worked around that problem and now the test compiles.
> However, I don't actually see that test running. As when I do "mvn test",
> it
> just says "BUILD SUCCESS", without any TEST section on stdout.
> Are we suppose to use "mvn test" to run the test? Are there any other
> methods can be used to run this test?
>
>
>
>
>


Re: Unit Test for Spark Streaming

2014-08-06 Thread JiajiaJing
Thank you TD,

I have worked around that problem and now the test compiles.
However, I don't actually see that test running: when I run "mvn test", it
just says "BUILD SUCCESS", without any TEST section on stdout.
Are we supposed to use "mvn test" to run the test? Are there any other
methods that can be used to run it?








Re: Unit Test for Spark Streaming

2014-08-05 Thread Tathagata Das
That function simply deletes a directory recursively. You can use an
alternative library for that; see this discussion:
http://stackoverflow.com/questions/779519/delete-files-recursively-in-java
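
For example, a plain stand-in for it could look like this (just a sketch;
brokerConf.logDirs is the collection from the snippet in your error message):

    import java.io.File

    // Minimal replacement for the package-private
    // org.apache.spark.util.Utils.deleteRecursively.
    def deleteRecursively(file: File): Unit = {
      if (file.isDirectory) {
        Option(file.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
      }
      file.delete()
    }

    brokerConf.logDirs.foreach { f => deleteRecursively(new File(f)) }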


On Tue, Aug 5, 2014 at 5:02 PM, JiajiaJing  wrote:
> Hi TD,
>
> I encountered a problem when trying to run the KafkaStreamSuite.scala unit
> test.
> I added "scalatest-maven-plugin" to my pom.xml, then ran "mvn test", and got
> the follow error message:
>
> error: object Utils in package util cannot be accessed in package
> org.apache.spark.util
> [INFO] brokerConf.logDirs.foreach { f =>
> Utils.deleteRecursively(new File(f)) }
> [INFO]^
>
> I checked that Utils.scala does exists under
> "spark/core/src/main/scala/org/apache/spark/util/", so I have no idea about
> why this access error.
> Could you please help me with this?
>
> Thank you very much!
>
> Best Regards,
>
> Jiajia
>
>
>



Re: Unit Test for Spark Streaming

2014-08-05 Thread JiajiaJing
Hi TD,

I encountered a problem when trying to run the KafkaStreamSuite.scala unit
test.
I added "scalatest-maven-plugin" to my pom.xml, then ran "mvn test", and got
the follow error message: 

error: object Utils in package util cannot be accessed in package
org.apache.spark.util
[INFO] brokerConf.logDirs.foreach { f =>
Utils.deleteRecursively(new File(f)) }
[INFO]^

I checked that Utils.scala does exist under
"spark/core/src/main/scala/org/apache/spark/util/", so I have no idea why
this access error occurs.
Could you please help me with this?

Thank you very much!

Best Regards,

Jiajia






Re: Unit Test for Spark Streaming

2014-08-04 Thread JiajiaJing
This helps a lot!!
Thank you very much!

Jiajia






Re: Unit Test for Spark Streaming

2014-08-04 Thread Tathagata Das
Appropriately timed question! Here is the PR that adds a real unit
test for the Kafka stream in Spark Streaming. Maybe this will help!

https://github.com/apache/spark/pull/1751/files
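
Roughly, the shape of such a test is: start a local ZooKeeper and Kafka
broker, publish a few known messages to a topic, read them back through a
Kafka input stream, and assert on what arrives. A very rough sketch of that
shape (not the PR's actual code; it assumes a broker is already reachable on
localhost and uses made-up topic/group names):

    import java.util.concurrent.atomic.AtomicLong

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils
    import org.scalatest.FunSuite

    class KafkaSmokeSuite extends FunSuite {
      test("messages published to a local topic are received by the stream") {
        val conf = new SparkConf().setMaster("local[2]").setAppName("KafkaSmokeSuite")
        val ssc = new StreamingContext(conf, Seconds(1))
        val received = new AtomicLong(0)

        // Receiver-based Kafka stream pointed at a locally running ZooKeeper/Kafka.
        val stream = KafkaUtils.createStream(ssc, "localhost:2181", "test-group", Map("test-topic" -> 1))
        stream.foreachRDD { rdd => received.addAndGet(rdd.count()) }

        ssc.start()
        // ... publish known messages to "test-topic" here, then give the stream time to pick them up ...
        Thread.sleep(5000)
        ssc.stop()

        assert(received.get() > 0)
      }
    }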

On Mon, Aug 4, 2014 at 6:30 PM, JiajiaJing  wrote:
> Hello Spark Users,
>
> I have a Spark Streaming program that streams data from Kafka topics and
> writes the output as Parquet files on HDFS.
> Now I want to write a unit test for this program to make sure the output
> data is correct (i.e., not missing any data from Kafka).
> However, I have no idea how to do this, especially how to mock a Kafka
> topic.
> Can someone help me with this?
>
> Thank you very much!
>
> Best Regards,
>
> Jiajia
>
>
>