When I added *"org.apache.spark" % "spark-core_2.10" % "1.6.0",  *it should
include spark-core_2.10-1.6.1-tests.jar.
Why do I need to use the jar file explicitly?

And how do I use these jars when compiling with *sbt* and running the tests on
Spark?
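
From what I can tell (an assumption on my part, not something I have verified),
the tests jar is published under a separate "tests" classifier, so it has to be
requested explicitly, roughly like this:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.0",
  "org.apache.spark" % "spark-core_2.10" % "1.6.0" % "test" classifier "tests",
  "org.apache.spark" % "spark-mllib_2.10" % "1.6.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.6.0" % "test" classifier "tests"
)

Is that the right approach?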


On Sat, Apr 2, 2016 at 3:46 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> You need to include the following jars:
>
> jar tvf ./core/target/spark-core_2.10-1.6.1-tests.jar | grep SparkFunSuite
>   1787 Thu Mar 03 09:06:14 PST 2016 org/apache/spark/SparkFunSuite$$anonfun$withFixture$1.class
>   1780 Thu Mar 03 09:06:14 PST 2016 org/apache/spark/SparkFunSuite$$anonfun$withFixture$2.class
>   3982 Thu Mar 03 09:06:14 PST 2016 org/apache/spark/SparkFunSuite.class
>
> jar tvf ./mllib/target/spark-mllib_2.10-1.6.1-tests.jar | grep MLlibTestSparkContext
>   1447 Thu Mar 03 09:53:54 PST 2016 org/apache/spark/mllib/util/MLlibTestSparkContext.class
>   1704 Thu Mar 03 09:53:54 PST 2016 org/apache/spark/mllib/util/MLlibTestSparkContext$class.class
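>
> To make the usage concrete, here is a minimal sketch of a suite built on
> those classes (ExampleSuite and its package are made-up names; note that
> SparkFunSuite is declared private[spark], so the suite must live somewhere
> under the org.apache.spark package, and ScalaTest must be on the test
> classpath):
>
> package org.apache.spark.example
>
> import org.apache.spark.SparkFunSuite
> import org.apache.spark.mllib.util.MLlibTestSparkContext
>
> // SparkFunSuite extends ScalaTest's FunSuite; MLlibTestSparkContext
> // creates a shared SparkContext (sc) in beforeAll and stops it in afterAll.
> class ExampleSuite extends SparkFunSuite with MLlibTestSparkContext {
>   test("shared SparkContext is usable") {
>     assert(sc.parallelize(1 to 10).count() === 10)
>   }
> }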
>
> On Fri, Apr 1, 2016 at 3:07 PM, Shishir Anshuman <shishiranshu...@gmail.com> wrote:
>
>> I got the file ALSSuite.scala and am trying to run it. I have copied the
>> file under *src/test/scala* in my project folder. When I run *sbt test*,
>> I get errors. I have attached a screenshot of the errors. Before *sbt
>> test*, I build the package with *sbt package*.
>>
>> Dependencies of *simple.sbt*:
>>
>>> libraryDependencies ++= Seq(
>>>   "org.apache.spark" % "spark-core_2.10" % "1.6.0",
>>>   "org.apache.spark" % "spark-mllib_2.10" % "1.6.0"
>>> )
>>
>> On Sat, Apr 2, 2016 at 2:21 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Assuming your code is written in Scala, I would suggest using ScalaTest.
>>>
>>> Please take a look at the XXSuite.scala files (e.g. ALSSuite.scala) under mllib/.
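>>>
>>> For instance, a bare-bones suite might look like this (a sketch only;
>>> MyMathSuite is a made-up name, and it assumes "org.scalatest" %%
>>> "scalatest" is on the test classpath):
>>>
>>> import org.scalatest.FunSuite
>>>
>>> // The simplest ScalaTest style: each test is a named block.
>>> class MyMathSuite extends FunSuite {
>>>   test("addition works") {
>>>     assert(1 + 1 === 2)
>>>   }
>>> }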
>>>
>>> On Fri, Apr 1, 2016 at 1:31 PM, Shishir Anshuman <shishiranshu...@gmail.com> wrote:
>>>
>>>> Hello,
>>>>
>>>> I have code written in Scala using MLlib, and I want to unit test it.
>>>> I can't decide between JUnit 4 and ScalaTest.
>>>> I am new to Spark. Please guide me on how to proceed with the testing.
>>>>
>>>> Thank you.
>>>>
>>>
>>>
>>
>
