Hi, 

I’m looking at the online docs for building Spark 1.4.1 … 

http://spark.apache.org/docs/latest/building-spark.html 

I’m interested in building Spark for Scala 2.11 (the latest Scala) and also with 
Hive and JDBC support. 

The docs say:
“
To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11 
property:
dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
“ 
So… 
Is there a reason I shouldn’t build against hadoop-2.6? 
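
For context, the variant I was going to try is just the docs’ example with the hadoop-2.4 profile swapped out. I’m assuming the -Phadoop-2.6 profile and the hadoop.version property from the version table on that same page, so correct me if that’s wrong: 

mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Dscala-2.11 -DskipTests clean package 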

If I want to add Thrift and Hive support, is it possible? 
Looking at the Scala 2.11 build, it looks like Hive support is already being built? 
(Judging by the stdout messages…)
Should the docs be updated? Am I missing something? 
(Dean W. can confirm, I am completely brain dead. ;-) 
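
Concretely, the command I was planning to try for the Hive/Thrift piece is below. The -Phive and -Phive-thriftserver profiles come from the Hive and JDBC support section of that same page (shown there for the default Scala version); whether they actually work together with -Dscala-2.11 is exactly what I’m unsure about, so treat this as a guess: 

dev/change-version-to-2.11.sh 
mvn -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package 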

Thx

-Mike
PS. Yes, I can probably just download a prebuilt package, but I’m a glutton for 
punishment. ;-) 
