Hi All,

I just downloaded the Scala IDE for Eclipse. After I created a Spark project and clicked Run, there was an error on this line of code:

import org.apache.spark.SparkContext

"object apache is not a member of package org". I guess I need to import the Spark dependency into the Scala IDE.

Project -> Properties -> Java Build Path -> Add External Jars
Add the /spark-1.0.0-bin-hadoop2/lib/spark-assembly-1.0.0-hadoop2.2.0.jar

Cheers
K/
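
For reference, here is a minimal sketch of a Spark program that should compile once the assembly jar is on the build path. The object name is a placeholder, and the local[*] master is an assumption so the job runs inside the IDE without a cluster:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkTest {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process, using all available cores
    val conf = new SparkConf().setAppName("SparkTest").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // A tiny word-count to confirm the classpath is set up correctly
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

If this compiles and prints the pair counts, the jar was picked up correctly.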
On Sun, Jun 8, 2014 at 8:06 AM, Carter gyz...@hotmail.com wrote:
> Hi All,
> I just downloaded the Scala IDE for Eclipse. After I created a Spark
> project and ...
This will make the compilation pass, but you may not be able to run it correctly. I used Maven, adding these two jars (I use Hadoop 1), and Maven pulled in their dependent jars (a lot of them) for me:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
</dependency>
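
Filling that out into a complete pom.xml entry, it would typically look like the fragment below. The version number is an assumption on my part, chosen to match the spark-assembly-1.0.0 jar mentioned earlier; adjust it to whatever Spark release you actually use:

```xml
<!-- NOTE: version 1.0.0 is an assumption, matching the spark-assembly jar above -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
</dependency>
```

The _2.10 suffix in the artifactId is the Scala binary version, so it must match the Scala version your IDE project is built against.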
Thanks a lot Krishna, this works for me.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-compile-a-Spark-project-in-Scala-IDE-for-Eclipse-tp7197p7223.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Thanks for your reply Wei, will try this.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-compile-a-Spark-project-in-Scala-IDE-for-Eclipse-tp7197p7224.html