Hi,

I just called:

> test

or

> run

Thorsten


On 10.09.2014 at 13:38, arthur.hk.c...@gmail.com wrote:
Hi,

What is your SBT command and the parameters?

Arthur


On 10 Sep, 2014, at 6:46 pm, Thorsten Bergler <sp...@tbonline.de> wrote:

Hello,

I am writing a Spark app that is already working so far.
Now I have started to write some unit tests as well, but I am running into
dependency problems and cannot find a solution right now. Perhaps someone
could help me.

I build my Spark project with SBT, and it seems to be configured correctly:
compiling, assembling, and running the built jar with spark-submit all work
fine.
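
For reference, the workflow that works looks roughly like this (the assembly
jar name is just how it happens to come out on my machine with sbt-assembly,
so treat it as an example):

sbt compile
sbt assembly
spark-submit --class main.scala.PartialDuplicateScanner target/scala-2.10/Testprojekt-assembly-1.0.jar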

Now I have started on the unit tests, which I located under /src/test/scala.
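
The suite is essentially just a ScalaTest suite that creates a local
SparkContext when it is instantiated; simplified, it looks roughly like this
(not the exact file):

package test.scala

import org.scalatest.FunSuite
import org.apache.spark.{SparkConf, SparkContext}

class SetSuite extends FunSuite {

  // The SparkContext is created when the suite is instantiated,
  // which is where the error below is thrown.
  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("SetSuite"))

  test("counting a small RDD") {
    assert(sc.parallelize(1 to 10).count() === 10)
  }
}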

When I call "test" in sbt, I get the following:

14/09/10 12:22:06 INFO storage.BlockManagerMaster: Registered BlockManager
14/09/10 12:22:06 INFO spark.HttpServer: Starting HTTP Server
[trace] Stack trace suppressed: run last test:test for the full output.
[error] Could not run test test.scala.SetSuite: java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
[info] Run completed in 626 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[error] Error during tests:
[error]     test.scala.SetSuite
[error] (test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 3 s, completed 10.09.2014 12:22:06

last test:test gives me the following:

last test:test
[debug] Running TaskDef(test.scala.SetSuite, org.scalatest.tools.Framework$$anon$1@6e5626c8, false, [SuiteSelector])
java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
    at org.apache.spark.HttpServer.start(HttpServer.scala:54)
    at org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
    at org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
    at org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
    at org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
    at org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
    at test.scala.SetSuite.<init>(SparkTest.scala:16)

I also just noticed that sbt run is not working either:

14/09/10 12:44:46 INFO spark.HttpServer: Starting HTTP Server
[error] (run-main-2) java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
    at org.apache.spark.HttpServer.start(HttpServer.scala:54)
    at org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
    at org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
    at org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
    at org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
    at org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
    at main.scala.PartialDuplicateScanner$.main(PartialDuplicateScanner.scala:29)
    at main.scala.PartialDuplicateScanner.main(PartialDuplicateScanner.scala)

Here is my Testprojekt.sbt file:

name := "Testprojekt"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= {
  Seq(
    "org.apache.lucene" % "lucene-core" % "4.9.0",
    "org.apache.lucene" % "lucene-analyzers-common" % "4.9.0",
    "org.apache.lucene" % "lucene-queryparser" % "4.9.0",
    ("org.apache.spark" %% "spark-core" % "1.0.2").
        exclude("org.mortbay.jetty", "servlet-api").
        exclude("commons-beanutils", "commons-beanutils-core").
        exclude("commons-collections", "commons-collections").
        exclude("commons-collections", "commons-collections").
        exclude("com.esotericsoftware.minlog", "minlog").
        exclude("org.eclipse.jetty.orbit", "javax.mail.glassfish").
        exclude("org.eclipse.jetty.orbit", "javax.transaction").
        exclude("org.eclipse.jetty.orbit", "javax.servlet")
  )
}

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
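
One thing I am wondering about: the missing class
javax/servlet/http/HttpServletResponse is exactly what the
exclude("org.eclipse.jetty.orbit", "javax.servlet") above removes, so perhaps
I have to put a servlet API back on the classpath myself, e.g. something like
(untested guess):

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"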






