Try cleaning your Maven (.m2) and Ivy caches.
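
If that doesn't help: the NoClassDefFoundError for
javax/servlet/http/HttpServletResponse usually means no servlet API jar
ends up on the runtime classpath. As a rough, untested sketch (the exact
artifact coordinates are my guess and may need adjusting), you could try
declaring it explicitly in simple.sbt:

// Hypothetical workaround: add the servlet API explicitly so that Spark's
// HttpServer can load javax.servlet.http.HttpServletResponse at runtime.
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"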


> On May 23, 2014, at 12:03 AM, Shrikar archak <shrika...@gmail.com> wrote:
> 
> Yes, I did an sbt publish-local. OK, I will try with Spark 0.9.1.
> 
> Thanks,
> Shrikar
> 
> 
>> On Thu, May 22, 2014 at 8:53 PM, Tathagata Das <tathagata.das1...@gmail.com> 
>> wrote:
>> How are you getting Spark 1.0.0-SNAPSHOT through Maven? Did you publish 
>> Spark locally, which allowed you to use it as a dependency?
>> 
>> This is weird indeed. SBT should take care of all of Spark's 
>> dependencies.
>> 
>> In any case, you can try the last released Spark 0.9.1 and see if the 
>> problem persists.
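>> 
>> For example, pinning the dependencies in simple.sbt to the released
>> artifacts (a sketch, assuming the Scala 2.10 builds on Maven Central)
>> would look something like:
>> 
>> // Use the released 0.9.1 artifacts instead of a locally published snapshot.
>> libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "0.9.1",
>>                             "org.apache.spark" %% "spark-streaming" % "0.9.1")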
>> 
>> 
>>> On Thu, May 22, 2014 at 3:59 PM, Shrikar archak <shrika...@gmail.com> wrote:
>>> I am running it with sbt run. I am running it locally.
>>> 
>>> Thanks,
>>> Shrikar
>>> 
>>> 
>>>> On Thu, May 22, 2014 at 3:53 PM, Tathagata Das 
>>>> <tathagata.das1...@gmail.com> wrote:
>>>> How are you launching the application? sbt run? spark-submit? Local
>>>> mode or a Spark standalone cluster? Are you packaging all your code into
>>>> a jar?
>>>> It looks like you have Spark classes in your execution environment but
>>>> are missing some of Spark's dependencies.
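>>>> 
>>>> One quick way to check (assuming sbt 0.13) is to print the runtime
>>>> classpath from the sbt prompt and look for a servlet-api jar:
>>>> 
>>>> show runtime:fullClasspath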
>>>> 
>>>> TD
>>>> 
>>>> 
>>>> 
>>>> On Thu, May 22, 2014 at 2:27 PM, Shrikar archak <shrika...@gmail.com> 
>>>> wrote:
>>>> > Hi All,
>>>> >
>>>> > I am trying to run the network word count example as a separate standalone job
>>>> > and running into some issues.
>>>> >
>>>> > Environment:
>>>> > 1) Mac OS X Mavericks
>>>> > 2) Latest Spark repo from GitHub.
>>>> >
>>>> >
>>>> > I have a structure like this
>>>> >
>>>> > Shrikars-MacBook-Pro:SimpleJob shrikar$ find .
>>>> > .
>>>> > ./simple.sbt
>>>> > ./src
>>>> > ./src/main
>>>> > ./src/main/scala
>>>> > ./src/main/scala/NetworkWordCount.scala
>>>> > ./src/main/scala/SimpleApp.scala.bk
>>>> >
>>>> >
>>>> > simple.sbt
>>>> > name := "Simple Project"
>>>> >
>>>> > version := "1.0"
>>>> >
>>>> > scalaVersion := "2.10.3"
>>>> >
>>>> > libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" %
>>>> > "1.0.0-SNAPSHOT",
>>>> >                             "org.apache.spark" %% "spark-streaming" %
>>>> > "1.0.0-SNAPSHOT")
>>>> >
>>>> > resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>>>> >
>>>> >
>>>> > I am able to run the SimpleApp which is mentioned in the docs, but when I
>>>> > try to run the NetworkWordCount app I get an error like this. Am I missing
>>>> > something?
>>>> >
>>>> > [info] Running com.shrikar.sparkapps.NetworkWordCount
>>>> > 14/05/22 14:26:47 INFO spark.SecurityManager: Changing view acls to: 
>>>> > shrikar
>>>> > 14/05/22 14:26:47 INFO spark.SecurityManager: SecurityManager:
>>>> > authentication disabled; ui acls disabled; users with view permissions:
>>>> > Set(shrikar)
>>>> > 14/05/22 14:26:48 INFO slf4j.Slf4jLogger: Slf4jLogger started
>>>> > 14/05/22 14:26:48 INFO Remoting: Starting remoting
>>>> > 14/05/22 14:26:48 INFO Remoting: Remoting started; listening on addresses
>>>> > :[akka.tcp://spark@192.168.10.88:49963]
>>>> > 14/05/22 14:26:48 INFO Remoting: Remoting now listens on addresses:
>>>> > [akka.tcp://spark@192.168.10.88:49963]
>>>> > 14/05/22 14:26:48 INFO spark.SparkEnv: Registering MapOutputTracker
>>>> > 14/05/22 14:26:48 INFO spark.SparkEnv: Registering BlockManagerMaster
>>>> > 14/05/22 14:26:48 INFO storage.DiskBlockManager: Created local directory 
>>>> > at
>>>> > /var/folders/r2/mbj08pb55n5d_9p8588xk5b00000gn/T/spark-local-20140522142648-0a14
>>>> > 14/05/22 14:26:48 INFO storage.MemoryStore: MemoryStore started with
>>>> > capacity 911.6 MB.
>>>> > 14/05/22 14:26:48 INFO network.ConnectionManager: Bound socket to port 
>>>> > 49964
>>>> > with id = ConnectionManagerId(192.168.10.88,49964)
>>>> > 14/05/22 14:26:48 INFO storage.BlockManagerMaster: Trying to register
>>>> > BlockManager
>>>> > 14/05/22 14:26:48 INFO storage.BlockManagerInfo: Registering block 
>>>> > manager
>>>> > 192.168.10.88:49964 with 911.6 MB RAM
>>>> > 14/05/22 14:26:48 INFO storage.BlockManagerMaster: Registered 
>>>> > BlockManager
>>>> > 14/05/22 14:26:48 INFO spark.HttpServer: Starting HTTP Server
>>>> > [error] (run-main) java.lang.NoClassDefFoundError:
>>>> > javax/servlet/http/HttpServletResponse
>>>> > java.lang.NoClassDefFoundError: javax/servlet/http/HttpServletResponse
>>>> > at org.apache.spark.HttpServer.start(HttpServer.scala:54)
>>>> > at
>>>> > org.apache.spark.broadcast.HttpBroadcast$.createServer(HttpBroadcast.scala:156)
>>>> > at
>>>> > org.apache.spark.broadcast.HttpBroadcast$.initialize(HttpBroadcast.scala:127)
>>>> > at
>>>> > org.apache.spark.broadcast.HttpBroadcastFactory.initialize(HttpBroadcastFactory.scala:31)
>>>> > at
>>>> > org.apache.spark.broadcast.BroadcastManager.initialize(BroadcastManager.scala:48)
>>>> > at
>>>> > org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:35)
>>>> > at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
>>>> > at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>>>> > at
>>>> > org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:549)
>>>> > at
>>>> > org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:561)
>>>> > at
>>>> > org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
>>>> > at 
>>>> > com.shrikar.sparkapps.NetworkWordCount$.main(NetworkWordCount.scala:39)
>>>> > at com.shrikar.sparkapps.NetworkWordCount.main(NetworkWordCount.scala)
>>>> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> > at
>>>> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>> > at
>>>> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>> > at java.lang.reflect.Method.invoke(Method.java:597)
>>>> >
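>>>> > For reference, NetworkWordCount.scala is essentially the standard
>>>> > streaming example (sketched here from memory, so minor details may
>>>> > differ from my actual file):
>>>> >
>>>> > package com.shrikar.sparkapps
>>>> >
>>>> > import org.apache.spark.streaming.{Seconds, StreamingContext}
>>>> > import org.apache.spark.streaming.StreamingContext._
>>>> >
>>>> > object NetworkWordCount {
>>>> >   def main(args: Array[String]) {
>>>> >     // Local streaming context with a 1-second batch interval,
>>>> >     // reading lines of text from a socket on localhost:9999.
>>>> >     val ssc = new StreamingContext("local[2]", "NetworkWordCount", Seconds(1))
>>>> >     val lines = ssc.socketTextStream("localhost", 9999)
>>>> >     val words = lines.flatMap(_.split(" "))
>>>> >     val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
>>>> >     wordCounts.print()
>>>> >     ssc.start()
>>>> >     ssc.awaitTermination()
>>>> >   }
>>>> > }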
>>>> >
>>>> > Thanks,
>>>> > Shrikar
>>>> >
> 
