Have you tried adding this line?
  "javax.servlet" % "javax.servlet-api" % "3.0.1" % "provided"

This made the problem go away for me. It also works without the "provided"
scope.
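
For reference, here's a minimal sketch of what the relevant part of my
build.sbt ends up looking like (the spark-core line is just for context;
use whatever versions your project already targets):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",
  // Supplies javax.servlet.http.HttpServletResponse, which Spark's
  // embedded HTTP server needs at runtime.
  "javax.servlet" % "javax.servlet-api" % "3.0.1" % "provided"
)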

On Wed, Jan 14, 2015 at 5:09 AM, Night Wolf <nightwolf...@gmail.com> wrote:

> Thanks for the tips!
>
> Yeah, it's a working SBT project. I.e. if I do an sbt run, it picks up
> Test1 as a main class and runs it for me without error. It's only in
> IntelliJ. I opened the project afresh by choosing the build.sbt file. I
> re-tested by deleting .idea and just choosing the project folder, and I
> get the same result. Not using gen-idea in sbt.
>
>
>
> On Wed, Jan 14, 2015 at 8:52 AM, Jay Vyas <jayunit100.apa...@gmail.com>
> wrote:
>
>> I find importing a working SBT project into IntelliJ is the way to
>> go.....
>>
>> How did you load the project into intellij?
>>
>> On Jan 13, 2015, at 4:45 PM, Enno Shioji <eshi...@gmail.com> wrote:
>>
>> Had the same issue. I can't remember what the root cause was, but this works:
>>
>> libraryDependencies ++= {
>>   val sparkVersion = "1.2.0"
>>   Seq(
>>     "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion % "provided",
>>     "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion % "provided",
>>     "javax.servlet" % "javax.servlet-api" % "3.0.1" % "provided"
>>   )
>> }
>>
>> In order to run classes from the "main" source in IntelliJ, you must
>> invoke them from a source under "test", as IntelliJ won't put the
>> "provided"-scope libraries on the classpath when running code in "main"
>> (but it will for sources under "test").
>>
>> With this config you can run "sbt assembly" to get the fat jar without
>> the Spark jars.
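>>
>> (This assumes the sbt-assembly plugin is already wired in; if not, the
>> usual setup is a one-liner in project/plugins.sbt. The version below is
>> just an example; use whatever is current for you.)
>>
>> // project/plugins.sbt
>> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.12.0")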
>>
>>
>> On Tue, Jan 13, 2015 at 12:16 PM, Night Wolf <nightwolf...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I'm trying to load up an SBT project in IntelliJ 14 (Windows) running
>>> JDK 1.7 and SBT 0.13.5, and I seem to be getting errors with the project.
>>>
>>> The build.sbt file is super simple:
>>>
>>> name := "scala-spark-test1"
>>> version := "1.0"
>>>
>>> scalaVersion := "2.10.4"
>>>
>>> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
>>>
>>>
>>> Then I have a super simple test class:
>>>
>>> package test
>>>
>>> import org.apache.spark.{SparkContext, SparkConf}
>>>
>>> case class Blah(s: Int, d: String)
>>>
>>> object Test1  {
>>>   def main(args: Array[String]): Unit = {
>>>     val sparkconf = new SparkConf().setMaster("local[4]").setAppName("test-spark")
>>>     val sc = new SparkContext(sparkconf)
>>>
>>>     val rdd = sc.parallelize(Seq(
>>>       Blah(1,"dsdsd"),
>>>       Blah(2,"daaa"),
>>>       Blah(3,"dhghghgh")
>>>     ))
>>>
>>>     rdd.collect().foreach(println)
>>>
>>>   }
>>> }
>>>
>>>
>>> When I try to run the Test1 object in IntelliJ, I get the following error:
>>>
>>> Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
>>> at org.apache.spark.HttpServer.org$apache$spark$HttpServer$$doStart(HttpServer.scala:73)
>>> at org.apache.spark.HttpServer$$anonfun$1.apply(HttpServer.scala:60)
>>> at org.apache.spark.HttpServer$$anonfun$1.apply(HttpServer.scala:60)
>>> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1676)
>>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>>> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1667)
>>> at org.apache.spark.HttpServer.start(HttpServer.scala:60)
>>> at org.apache.spark.HttpFileServer.initialize(HttpFileServer.scala:45)
>>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:304)
>>> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
>>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
>>> at test.Test1$.main(Test1.scala:10)
>>> at test.Test1.main(Test1.scala)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
>>> Caused by: java.lang.ClassNotFoundException: javax.servlet.http.HttpServletResponse
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>> ... 18 more
>>>
>>>
>>> For whatever reason it seems that IntelliJ isn't pulling in these deps;
>>> doing an sbt run works fine. Looking at the project structure, it seems
>>> that 7 libs don't get marked as a dependency for my module, even though
>>> they are on the dep tree: http://pastebin.com/REkQh5ux
>>>
>>> <image.png>
>>>
>>> Is this something to do with scoping or shading in Spark and its
>>> associated libs? Has anyone else seen this issue?
>>>
>>> Cheers,
>>> NW
>>>
>>
>>
>
