Hi all,

I am going through the official tutorial on building a standalone Scala project in the Cloudera virtual machine.

I am using Spark 1.5.0 and Scala 2.10.4, and I changed the parameters in the sparkpi.sbt file as follows:

    name := "SparkPi Project"
    version := "1.0"
    scalaVersion := "2.10.4"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"

After running `sbt package` I get the error below:

    [info] Updating {file:/home/cloudera/sampleData/spark/SparkPi/}sparkpi...
    [info] Resolving org.fusesource.jansi#jansi;1.4 ...
    [info] Done updating.
    [info] Compiling 1 Scala source to /home/cloudera/sampleData/spark/SparkPi/target/scala-2.10/classes...
    [error] /home/cloudera/sampleData/spark/SparkPi/src/main/scala/SparkPi.scala:2: object apache is not a member of package org
    [error] import org.apache.spark._
    [error]        ^
    [error] /home/cloudera/sampleData/spark/SparkPi/src/main/scala/SparkPi.scala:8: not found: type SparkConf
    [error]     val conf = new SparkConf().setAppName("Spark Pi")
    [error]                    ^
    [error] /home/cloudera/sampleData/spark/SparkPi/src/main/scala/SparkPi.scala:9: not found: type SparkContext
    [error]     val spark = new SparkContext(conf)
    [error]                     ^
    [error] three errors found
    [error] (compile:compileIncremental) Compilation failed
    [error] Total time: 7 s, completed Jan 8, 2016 2:40:57 PM

I tried changing the sbt file based on suggestions I found on Google, but I still get the same error. I modified my sbt file like this:

    name := "SparkPi Project"
    version := "1.5.0"
    scalaVersion := "2.10"
    libraryDependencies += "org.apache.spark" % System.getenv.get("SPARK_MODULE") % System.getenv.get("SPARK_VERSION")
    resolvers ++= Seq(
      "Spark Release Repository" at System.getenv.get("SPARK_RELEASE_REPOSITORY"),
      "Akka Repository" at "http://repo.akka.io/releases/",
      "Spray Repository" at "http://repo.spray.cc/")

Still the same error. Any help is much appreciated.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Standalone-Scala-Project-sbt-package-erroring-out-tp25924.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
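For reference, my understanding is that a minimal build definition matching the tutorial's setup should look like the following sketch (assuming Spark 1.5.0 artifacts published for Scala 2.10, i.e. spark-core_2.10); I am including it in case I have overlooked something:

```scala
// sparkpi.sbt -- minimal build definition for the tutorial's SparkPi project.
name := "SparkPi Project"

version := "1.0"

// scalaVersion should be a full version such as "2.10.4"; a bare "2.10"
// will fail to resolve the Scala library artifacts.
scalaVersion := "2.10.4"

// The %% operator appends the Scala binary suffix, so this resolves
// to the spark-core_2.10 artifact at version 1.5.0.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"
```

The Maven Central resolver that sbt uses by default should be enough to fetch spark-core, so extra resolvers should not be needed for this dependency.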