Yeah, it's an interesting question what the better default is for the single set of artifacts published to Maven. I think there's an argument for Hadoop 2, and perhaps Hive, for the 2.10 build too. Pros and cons are discussed more at
https://issues.apache.org/jira/browse/SPARK-5134
https://github.com/apache/spark/pull/3917

On Sun, Mar 8, 2015 at 7:42 PM, Matei Zaharia <matei.zaha...@gmail.com> wrote:
> +1
>
> Tested it on Mac OS X.
>
> One small issue I noticed is that the Scala 2.11 build is using Hadoop 1
> without Hive, which is kind of weird because people will more likely want
> Hadoop 2 with Hive. So it would be good to publish a build for that
> configuration instead. We can do it if we do a new RC, or it might be that
> binary builds may not need to be voted on (I forgot the details there).
>
> Matei

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
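For reference, the Hadoop 2 + Hive configuration discussed above would be produced by a build invocation along these lines. This is only a sketch: the script names and the exact profile names (`hadoop-2.4`, `hive`, `hive-thriftserver`) are assumed from Spark's build documentation of this era and should be checked against the release branch.

```shell
# Sketch: build the Scala 2.11 binary distribution with Hadoop 2 and Hive,
# instead of the Hadoop 1 / no-Hive combination noted in the thread.
# Script and profile names are assumptions; verify against the build docs.
./dev/change-version-to-2.11.sh
./make-distribution.sh --tgz -Dscala-2.11 -Phadoop-2.4 -Phive -Phive-thriftserver
```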