Hi,

I am developing an application based on Spark 1.6. My library dependencies are as follows:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0"
)

By default this pulls in Hadoop 2.2.0 as a transitive dependency, which is not the version I want. How can I change the Hadoop version when depending on Spark? Do I need to recompile the Spark source code against the Hadoop version I want?
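For example, I was wondering whether an override like the following would work in sbt. This is just a sketch: the exclude-and-pin approach is my guess, and Hadoop 2.6.0 below stands in for whatever version I actually need.

libraryDependencies ++= Seq(
  // Drop the transitive hadoop-client that spark-core pulls in by default
  ("org.apache.spark" %% "spark-core" % "1.6.0")
    .exclude("org.apache.hadoop", "hadoop-client"),
  // Pin an explicit hadoop-client instead (2.6.0 is only an example version)
  "org.apache.hadoop" % "hadoop-client" % "2.6.0"
)

Would something like this be enough, or is a custom Spark build required?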


Regards.
Best wishes.
