Hello,

When I remove that line and execute "sbt run", I end up with the following output:


14/09/15 10:11:35 INFO ui.SparkUI: Stopped Spark web UI at http://base:4040
[...]
14/09/15 10:11:05 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
14/09/15 10:11:15 INFO client.AppClient$ClientActor: Connecting to master spark://base:7077...

It seems that the sbt build doesn't use my existing Spark installation, because my original Spark web UI runs at http://base:8080, while the UI above is on port 4040. Is sbt starting another Spark instance?
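
For reference, my app creates its SparkContext roughly like this (a simplified sketch; the master URL is the one from the log above, and the app name is just a placeholder):

    // simplified sketch of the context setup in my app
    val conf = new org.apache.spark.SparkConf()
      .setAppName("Testprojekt")          // placeholder name
      .setMaster("spark://base:7077")     // master URL from the log above
    val sc = new org.apache.spark.SparkContext(conf)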

Best regards
Thorsten


On 14.09.2014 at 18:56, Dean Wampler wrote:
Sorry, I meant any *other* SBT files.

However, what happens if you remove the line:

      exclude("org.eclipse.jetty.orbit", "javax.servlet")


dean


Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Sun, Sep 14, 2014 at 11:53 AM, Dean Wampler <deanwamp...@gmail.com> wrote:

    Can you post your whole SBT build file(s)?

    Dean Wampler, Ph.D.
    Author: Programming Scala, 2nd Edition <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
    Typesafe <http://typesafe.com>
    @deanwampler <http://twitter.com/deanwampler>
    http://polyglotprogramming.com

    On Wed, Sep 10, 2014 at 6:48 AM, Thorsten Bergler <sp...@tbonline.de> wrote:

        Hi,

        I just called:

        > test

        or

        > run

        Thorsten


        On 10.09.2014 at 13:38, arthur.hk.c...@gmail.com wrote:

            Hi,

            What SBT command and parameters are you using?

            Arthur


            On 10 Sep, 2014, at 6:46 pm, Thorsten Bergler <sp...@tbonline.de> wrote:

                Hello,

                I am writing a Spark app, which is working so far.
                Now I have started to write some unit tests as well,
                but I am running into dependency problems and cannot
                find a solution. Perhaps someone can help me.

                I build my Spark project with SBT, and it seems to be
                configured correctly, because compiling, assembling,
                and running the built jar with spark-submit all work fine.

                Now I have started on the unit tests, which are located
                under /src/test/scala.

                When I call "test" in sbt, I get the following:

                14/09/10 12:22:06 INFO storage.BlockManagerMaster: Registered BlockManager
                14/09/10 12:22:06 INFO spark.HttpServer: Starting HTTP Server
                [trace] Stack trace suppressed: run last test:test for the full output.
                [error] Could not run test test.scala.SetSuite: java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
                [info] Run completed in 626 milliseconds.
                [info] Total number of tests run: 0
                [info] Suites: completed 0, aborted 0
                [info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
                [info] All tests passed.
                [error] Error during tests:
                [error]     test.scala.SetSuite
                [error] (test:test) sbt.TestsFailedException: Tests unsuccessful
                [error] Total time: 3 s, completed 10.09.2014 12:22:06

                last test:test gives me the following:

                    last test:test

                [debug] Running TaskDef(test.scala.SetSuite, org.scalatest.tools.Framework$$anon$1@6e5626c8, false, [SuiteSelector])
                java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
                    at org.apache.spark.HttpServer.start(HttpServer.scala:54)
                    at org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
                    at org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
                    at org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
                    at org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
                    at org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
                    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
                    at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
                    at test.scala.SetSuite.<init>(SparkTest.scala:16)
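
                For context, the failing suite is essentially this (simplified sketch; the SparkContext is created at SparkTest.scala:16, matching the trace above):

                    package test.scala

                    import org.scalatest.FunSuite
                    import org.apache.spark.{SparkConf, SparkContext}

                    class SetSuite extends FunSuite {
                      // creating the context in the constructor is what
                      // triggers the NoClassDefFoundError during "test"
                      val sc = new SparkContext(
                        new SparkConf().setAppName("test").setMaster("local"))

                      test("simple RDD count") {
                        assert(sc.parallelize(1 to 3).count() === 3)
                      }
                    }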

                I have also just noticed that "sbt run" is not working
                either:

                14/09/10 12:44:46 INFO spark.HttpServer: Starting HTTP Server
                [error] (run-main-2) java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
                java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
                    at org.apache.spark.HttpServer.start(HttpServer.scala:54)
                    at org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
                    at org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
                    at org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
                    at org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
                    at org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
                    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
                    at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
                    at main.scala.PartialDuplicateScanner$.main(PartialDuplicateScanner.scala:29)
                    at main.scala.PartialDuplicateScanner.main(PartialDuplicateScanner.scala)

                Here is my Testprojekt.sbt file:

                name := "Testprojekt"

                version := "1.0"

                scalaVersion := "2.10.4"

                libraryDependencies ++= {
                  Seq(
                    "org.apache.lucene" % "lucene-core" % "4.9.0",
                    "org.apache.lucene" % "lucene-analyzers-common" %
                "4.9.0",
                    "org.apache.lucene" % "lucene-queryparser" % "4.9.0",
                    ("org.apache.spark" %% "spark-core" % "1.0.2").
                        exclude("org.mortbay.jetty", "servlet-api").
                        exclude("commons-beanutils",
                "commons-beanutils-core").
                        exclude("commons-collections",
                "commons-collections").
                        exclude("commons-collections",
                "commons-collections").
                        exclude("com.esotericsoftware.minlog", "minlog").
                        exclude("org.eclipse.jetty.orbit",
                "javax.mail.glassfish").
                        exclude("org.eclipse.jetty.orbit",
                "javax.transaction").
                        exclude("org.eclipse.jetty.orbit",
                "javax.servlet")
                  )
                }

                resolvers += "Akka Repository" at
                "http://repo.akka.io/releases/";







                


            






