Hi,

Before you try to run it inside another environment like an IDE, could you build Spark using mvn or sbt, and only once that succeeds, run SparkPi using spark-submit or run-example? With that working, you can then set up a complete environment inside your beloved IDE (and I'm very glad to hear it's IDEA :))
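For reference, a minimal sketch of that sequence from the root of a Spark checkout (flags and profiles may need adjusting for your setup):

```shell
# Build Spark with the bundled Maven wrapper, skipping tests.
./build/mvn -DskipTests clean package

# ...or build with the bundled sbt instead:
./build/sbt package

# Once the build succeeds, run the SparkPi example via the helper script,
# which wraps spark-submit; the trailing argument is the number of partitions.
./bin/run-example SparkPi 10
```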
Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

On Thu, Jun 16, 2016 at 1:37 AM, Krishna Kalyan <krishnakaly...@gmail.com> wrote:
> Hello,
> I am facing problems when I try to run SparkPi.scala.
> I took the following steps:
> a) git pull https://github.com/apache/spark
> b) Import the project into IntelliJ as a Maven project
> c) Run 'SparkPi'
>
> Error below:
> Information:16/06/16 01:34 - Compilation completed with 10 errors and 5 warnings in 5s 843ms
> Warning:scalac: Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
> Warning:scalac: Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
> Warning:scalac: Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
> Warning:scalac: Class org.jboss.netty.channel.group.ChannelGroup not found - continuing with a stub.
> Warning:scalac: Class com.google.common.collect.ImmutableMap not found - continuing with a stub.
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala
> Error:(45, 66) not found: type SparkFlumeProtocol
>     val transactionTimeout: Int, val backOffInterval: Int) extends SparkFlumeProtocol with Logging {
> Error:(70, 39) not found: type EventBatch
>     override def getEventBatch(n: Int): EventBatch = {
> Error:(85, 13) not found: type EventBatch
>     new EventBatch("Spark sink has been stopped!", "", java.util.Collections.emptyList())
>
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala
> Error:(80, 22) not found: type EventBatch
>     def getEventBatch: EventBatch = {
> Error:(48, 37) not found: type EventBatch
> Error:(48, 54) not found: type EventBatch
>     @volatile private var eventBatch: EventBatch = new EventBatch("Unknown Error", "",
> Error:(115, 41) not found: type SparkSinkEvent
>     val events = new util.ArrayList[SparkSinkEvent](maxBatchSize)
> Error:(146, 28) not found: type EventBatch
>     eventBatch = new EventBatch("", seqNum, events)
>
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala
> Error:(25, 27) not found: type EventBatch
>     def isErrorBatch(batch: EventBatch): Boolean = {
>
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala
> Error:(86, 51) not found: type SparkFlumeProtocol
>     val responder = new SpecificResponder(classOf[SparkFlumeProtocol], handler.get)
>
> Thanks,
> Krishna

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
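For what it's worth, the `SparkFlumeProtocol`, `EventBatch`, and `SparkSinkEvent` types in those errors are not hand-written Scala: they are generated from Avro definitions in external/flume-sink during the Maven build, so a fresh IDEA import won't see them until the generated sources exist. One way to produce them, a sketch assuming a Maven-based import of the checkout, is to run the generate-sources phase for that module and then re-sync the project:

```shell
# Run only the source-generation phase for the flume-sink module (and the
# modules it depends on); this invokes the Avro code generation that emits
# SparkFlumeProtocol, EventBatch, etc. under the module's target/ directory.
./build/mvn -pl external/flume-sink -am generate-sources

# Then, in IDEA, re-import the Maven project (or use the Maven tool window's
# "Generate Sources and Update Folders") so target/generated-sources is
# picked up as a source root.
```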