Re: Scala: Perform Unit Testing in Spark

2016-04-06 Thread Shishir Anshuman
I placed the *tests* jars in the *lib* folder; now it's working.
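
For the record, sbt treats jars under *lib* as unmanaged dependencies, so
copying the locally built test jars there puts classes like SparkFunSuite
and MLlibTestSparkContext on the classpath for *sbt test*. Roughly this
layout (the versions are whatever was built locally):

    my-project/
      simple.sbt
      lib/
        spark-core_2.10-1.6.1-tests.jar
        spark-mllib_2.10-1.6.1-tests.jar
      src/test/scala/ALSSuite.scala

The managed alternative, pulling the same jars from Maven with a "tests"
classifier, is shown further down the thread.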

On Wed, Apr 6, 2016 at 7:34 PM, Lars Albertsson  wrote:

> Hi,
>
> I wrote a longish mail on Spark testing strategy last month, which you
> may find useful:
> http://mail-archives.apache.org/mod_mbox/spark-user/201603.mbox/browser
>
> Let me know if you have follow up questions or want assistance.
>
> Regards,
>
>
> Lars Albertsson
> Data engineering consultant
> www.mapflat.com
> +46 70 7687109
>
>
> On Fri, Apr 1, 2016 at 10:31 PM, Shishir Anshuman
>  wrote:
> > Hello,
> >
> > I have code written in Scala using MLlib, and I want to unit test it,
> > but I can't decide between JUnit 4 and ScalaTest.
> > I am new to Spark. Please guide me on how to proceed with the testing.
> >
> > Thank you.
>


Re: Scala: Perform Unit Testing in Spark

2016-04-06 Thread Lars Albertsson
Hi,

I wrote a longish mail on Spark testing strategy last month, which you
may find useful:
http://mail-archives.apache.org/mod_mbox/spark-user/201603.mbox/browser

Let me know if you have follow up questions or want assistance.

Regards,


Lars Albertsson
Data engineering consultant
www.mapflat.com
+46 70 7687109


On Fri, Apr 1, 2016 at 10:31 PM, Shishir Anshuman
 wrote:
> Hello,
>
> I have code written in Scala using MLlib, and I want to unit test it,
> but I can't decide between JUnit 4 and ScalaTest.
> I am new to Spark. Please guide me on how to proceed with the testing.
>
> Thank you.




Re: Scala: Perform Unit Testing in Spark

2016-04-02 Thread Ted Yu
I think you should specify dependencies in this way:

*"org.apache.spark" % "spark-core_2.10" % "1.6.0"* % "tests"

Please refer to http://www.scalatest.org/user_guide/using_scalatest_with_sbt
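
Putting it together, a minimal *simple.sbt* would look roughly like this
(a sketch, not tested against your project; the name and versions are
illustrative, and the scalatest line follows the guide linked above):

    name := "spark-mllib-testing"

    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.10"  % "1.6.0",
      "org.apache.spark" % "spark-mllib_2.10" % "1.6.0",
      // the "tests" classifier pulls in the jars that carry SparkFunSuite
      // and MLlibTestSparkContext
      "org.apache.spark" % "spark-core_2.10"  % "1.6.0" % "test" classifier "tests",
      "org.apache.spark" % "spark-mllib_2.10" % "1.6.0" % "test" classifier "tests",
      "org.scalatest"   %% "scalatest"        % "2.2.6" % "test"
    )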

On Fri, Apr 1, 2016 at 3:33 PM, Shishir Anshuman 
wrote:

> When I added *"org.apache.spark" % "spark-core_2.10" % "1.6.0",  *it
> should include spark-core_2.10-1.6.1-tests.jar.
> Why do I need to use the jar file explicitly?
>
> And how do I use the jars for compiling with *sbt* and running the tests
> on spark?
>
>
> On Sat, Apr 2, 2016 at 3:46 AM, Ted Yu  wrote:
>
>> You need to include the following jars:
>>
>> jar tvf ./core/target/spark-core_2.10-1.6.1-tests.jar | grep SparkFunSuite
>>   1787 Thu Mar 03 09:06:14 PST 2016
>> org/apache/spark/SparkFunSuite$$anonfun$withFixture$1.class
>>   1780 Thu Mar 03 09:06:14 PST 2016
>> org/apache/spark/SparkFunSuite$$anonfun$withFixture$2.class
>>   3982 Thu Mar 03 09:06:14 PST 2016 org/apache/spark/SparkFunSuite.class
>>
>> jar tvf ./mllib/target/spark-mllib_2.10-1.6.1-tests.jar | grep
>> MLlibTestSparkContext
>>   1447 Thu Mar 03 09:53:54 PST 2016
>> org/apache/spark/mllib/util/MLlibTestSparkContext.class
>>   1704 Thu Mar 03 09:53:54 PST 2016
>> org/apache/spark/mllib/util/MLlibTestSparkContext$class.class
>>
>> On Fri, Apr 1, 2016 at 3:07 PM, Shishir Anshuman <
>> shishiranshu...@gmail.com> wrote:
>>
>>> I got the file ALSSuite.scala and am trying to run it. I have copied the
>>> file under *src/test/scala* in my project folder. When I run *sbt test*,
>>> I get errors; I have attached a screenshot of them. Before *sbt test*,
>>> I build the package with *sbt package*.
>>>
>>> Dependencies of *simple.sbt*:
>>>
>>>     libraryDependencies ++= Seq(
>>>       "org.apache.spark" % "spark-core_2.10" % "1.6.0",
>>>       "org.apache.spark" % "spark-mllib_2.10" % "1.6.0"
>>>     )
>>>
>>> On Sat, Apr 2, 2016 at 2:21 AM, Ted Yu  wrote:
>>>
 Assuming your code is written in Scala, I would suggest using
 ScalaTest.

 Please take a look at the XXSuite.scala files under mllib/

 On Fri, Apr 1, 2016 at 1:31 PM, Shishir Anshuman <
 shishiranshu...@gmail.com> wrote:

> Hello,
>
> I have code written in Scala using MLlib, and I want to unit test it,
> but I can't decide between JUnit 4 and ScalaTest.
> I am new to Spark. Please guide me on how to proceed with the testing.
>
> Thank you.
>


>>>
>>
>


Re: Scala: Perform Unit Testing in Spark

2016-04-01 Thread Shishir Anshuman
When I added *"org.apache.spark" % "spark-core_2.10" % "1.6.0",  *it should
include spark-core_2.10-1.6.1-tests.jar.
Why do I need to use the jar file explicitly?

And how do I use the jars for compiling with *sbt* and running the tests on
spark?


On Sat, Apr 2, 2016 at 3:46 AM, Ted Yu  wrote:

> You need to include the following jars:
>
> jar tvf ./core/target/spark-core_2.10-1.6.1-tests.jar | grep SparkFunSuite
>   1787 Thu Mar 03 09:06:14 PST 2016
> org/apache/spark/SparkFunSuite$$anonfun$withFixture$1.class
>   1780 Thu Mar 03 09:06:14 PST 2016
> org/apache/spark/SparkFunSuite$$anonfun$withFixture$2.class
>   3982 Thu Mar 03 09:06:14 PST 2016 org/apache/spark/SparkFunSuite.class
>
> jar tvf ./mllib/target/spark-mllib_2.10-1.6.1-tests.jar | grep
> MLlibTestSparkContext
>   1447 Thu Mar 03 09:53:54 PST 2016
> org/apache/spark/mllib/util/MLlibTestSparkContext.class
>   1704 Thu Mar 03 09:53:54 PST 2016
> org/apache/spark/mllib/util/MLlibTestSparkContext$class.class
>
> On Fri, Apr 1, 2016 at 3:07 PM, Shishir Anshuman <
> shishiranshu...@gmail.com> wrote:
>
>> I got the file ALSSuite.scala and am trying to run it. I have copied the
>> file under *src/test/scala* in my project folder. When I run *sbt test*,
>> I get errors; I have attached a screenshot of them. Before *sbt test*,
>> I build the package with *sbt package*.
>>
>> Dependencies of *simple.sbt*:
>>
>>     libraryDependencies ++= Seq(
>>       "org.apache.spark" % "spark-core_2.10" % "1.6.0",
>>       "org.apache.spark" % "spark-mllib_2.10" % "1.6.0"
>>     )
>>
>> On Sat, Apr 2, 2016 at 2:21 AM, Ted Yu  wrote:
>>
>>> Assuming your code is written in Scala, I would suggest using ScalaTest.
>>>
>>> Please take a look at the XXSuite.scala files under mllib/
>>>
>>> On Fri, Apr 1, 2016 at 1:31 PM, Shishir Anshuman <
>>> shishiranshu...@gmail.com> wrote:
>>>
 Hello,

 I have code written in Scala using MLlib, and I want to unit test it,
 but I can't decide between JUnit 4 and ScalaTest.
 I am new to Spark. Please guide me on how to proceed with the testing.

 Thank you.

>>>
>>>
>>
>


Re: Scala: Perform Unit Testing in Spark

2016-04-01 Thread Ted Yu
You need to include the following jars:

jar tvf ./core/target/spark-core_2.10-1.6.1-tests.jar | grep SparkFunSuite
  1787 Thu Mar 03 09:06:14 PST 2016
org/apache/spark/SparkFunSuite$$anonfun$withFixture$1.class
  1780 Thu Mar 03 09:06:14 PST 2016
org/apache/spark/SparkFunSuite$$anonfun$withFixture$2.class
  3982 Thu Mar 03 09:06:14 PST 2016 org/apache/spark/SparkFunSuite.class

jar tvf ./mllib/target/spark-mllib_2.10-1.6.1-tests.jar | grep
MLlibTestSparkContext
  1447 Thu Mar 03 09:53:54 PST 2016
org/apache/spark/mllib/util/MLlibTestSparkContext.class
  1704 Thu Mar 03 09:53:54 PST 2016
org/apache/spark/mllib/util/MLlibTestSparkContext$class.class
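
Once those jars are on the test classpath, a suite can reuse Spark's own
fixture. A rough sketch (names are illustrative; it assumes
MLlibTestSparkContext is accessible from your test's package in your
build -- if it is not, keep the suite in the same package as the original,
as ALSSuite does):

    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.mllib.regression.LabeledPoint
    import org.apache.spark.mllib.util.MLlibTestSparkContext
    import org.scalatest.FunSuite

    // MLlibTestSparkContext starts a local SparkContext before the suite
    // runs and stops it afterwards, exposing it as `sc`.
    class MyModelSuite extends FunSuite with MLlibTestSparkContext {
      test("training data round-trips through an RDD") {
        val data = Seq(
          LabeledPoint(1.0, Vectors.dense(1.0, 0.0)),
          LabeledPoint(0.0, Vectors.dense(0.0, 1.0)))
        assert(sc.parallelize(data).count() === 2)
      }
    }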

On Fri, Apr 1, 2016 at 3:07 PM, Shishir Anshuman 
wrote:

> I got the file ALSSuite.scala and am trying to run it. I have copied the
> file under *src/test/scala* in my project folder. When I run *sbt test*, I
> get errors; I have attached a screenshot of them. Before *sbt test*, I
> build the package with *sbt package*.
>
> Dependencies of *simple.sbt*:
>
>     libraryDependencies ++= Seq(
>       "org.apache.spark" % "spark-core_2.10" % "1.6.0",
>       "org.apache.spark" % "spark-mllib_2.10" % "1.6.0"
>     )
>
> On Sat, Apr 2, 2016 at 2:21 AM, Ted Yu  wrote:
>
>> Assuming your code is written in Scala, I would suggest using ScalaTest.
>>
>> Please take a look at the XXSuite.scala files under mllib/
>>
>> On Fri, Apr 1, 2016 at 1:31 PM, Shishir Anshuman <
>> shishiranshu...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> I have code written in Scala using MLlib, and I want to unit test it,
>>> but I can't decide between JUnit 4 and ScalaTest.
>>> I am new to Spark. Please guide me on how to proceed with the testing.
>>>
>>> Thank you.
>>>
>>
>>
>


Re: Scala: Perform Unit Testing in Spark

2016-04-01 Thread Holden Karau
You can also look at spark-testing-base, which works with both ScalaTest and
JUnit, and see if it fits your use case.
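
A sketch of what a spark-testing-base suite looks like (assuming the
artifact matching your Spark version, e.g. "com.holdenkarau" %%
"spark-testing-base" % "1.6.0_0.3.3" % "test" -- check the project README
for the exact coordinates):

    import com.holdenkarau.spark.testing.SharedSparkContext
    import org.scalatest.FunSuite

    // SharedSparkContext manages a local SparkContext for the whole suite
    // and exposes it as `sc`.
    class WordCountSuite extends FunSuite with SharedSparkContext {
      test("word count on a tiny corpus") {
        val counts = sc.parallelize(Seq("a b", "b c"))
          .flatMap(_.split(" "))
          .map((_, 1))
          .reduceByKey(_ + _)
          .collectAsMap()
        assert(counts("b") === 2)
      }
    }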

On Friday, April 1, 2016, Ted Yu  wrote:

> Assuming your code is written in Scala, I would suggest using ScalaTest.
>
> Please take a look at the XXSuite.scala files under mllib/
>
> On Fri, Apr 1, 2016 at 1:31 PM, Shishir Anshuman <
> shishiranshu...@gmail.com
> > wrote:
>
>> Hello,
>>
>> I have code written in Scala using MLlib, and I want to unit test it,
>> but I can't decide between JUnit 4 and ScalaTest.
>> I am new to Spark. Please guide me on how to proceed with the testing.
>>
>> Thank you.
>>
>
>

-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau


Re: Scala: Perform Unit Testing in Spark

2016-04-01 Thread Ted Yu
Assuming your code is written in Scala, I would suggest using ScalaTest.

Please take a look at the XXSuite.scala files under mllib/
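
For a first test, a self-contained suite in the same style is enough to
start with. A minimal sketch (names are illustrative and it is not taken
from the Spark sources; it manages its own local SparkContext):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.clustering.KMeans
    import org.apache.spark.mllib.linalg.Vectors
    import org.scalatest.{BeforeAndAfterAll, FunSuite}

    class KMeansSmokeSuite extends FunSuite with BeforeAndAfterAll {
      @transient var sc: SparkContext = _

      override def beforeAll(): Unit = {
        // local[2] keeps the test fast and self-contained
        sc = new SparkContext(
          new SparkConf().setMaster("local[2]").setAppName("KMeansSmokeSuite"))
      }

      override def afterAll(): Unit = {
        if (sc != null) sc.stop()
      }

      test("KMeans separates two obvious clusters") {
        val points = sc.parallelize(Seq(
          Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
          Vectors.dense(9.0, 9.0), Vectors.dense(9.1, 9.1)))
        val model = KMeans.train(points, k = 2, maxIterations = 10)
        assert(model.clusterCenters.length === 2)
      }
    }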

On Fri, Apr 1, 2016 at 1:31 PM, Shishir Anshuman 
wrote:

> Hello,
>
> I have code written in Scala using MLlib, and I want to unit test it, but
> I can't decide between JUnit 4 and ScalaTest.
> I am new to Spark. Please guide me on how to proceed with the testing.
>
> Thank you.
>


Scala: Perform Unit Testing in Spark

2016-04-01 Thread Shishir Anshuman
Hello,

I have code written in Scala using MLlib, and I want to unit test it, but I
can't decide between JUnit 4 and ScalaTest.
I am new to Spark. Please guide me on how to proceed with the testing.

Thank you.