Spark 1.3.0 is not officially out yet, so I don't think sbt will download
the Hadoop dependencies for your Spark build by itself. You could try
adding the Hadoop dependencies manually (hadoop-core, hadoop-common,
hadoop-client).
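Something along these lines in your build.sbt might work (a sketch only; the Hadoop version 2.4.0 is an assumption, so use whichever version your Spark was built against):

```scala
// build.sbt (sketch) -- declare Spark plus an explicit Hadoop client dependency.
// The Hadoop version below is illustrative; match it to the Hadoop your
// custom Spark 1.3 build was compiled against.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"    % "1.3.0",
  "org.apache.hadoop" %  "hadoop-client" % "2.4.0"
)
```

Note that hadoop-client transitively pulls in hadoop-common, so you may not need to list every artifact separately.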
Thanks
Best Regards
On Wed, Mar 11, 2015 at 9:07 PM, Patcharee wrote:
Hi,
I have built Spark version 1.3 and tried to use it in my Spark Scala
application. When I tried to compile and build the application with SBT, I
got this error:
bad symbolic reference. A signature in SparkContext.class refers to term
conf in value org.apache.hadoop which is not available
It