The docs on using sbt are here: https://github.com/apache/spark/blob/master/docs/building-spark.md#building-with-sbt
They'll be published with 1.2.0 presumably.

On Mon, Nov 17, 2014 at 2:49 PM Michael Armbrust <mich...@databricks.com> wrote:

> > * I moved from sbt to maven in June specifically due to Andrew Or's
> > describing mvn as the default build tool. Developers should keep in mind
> > that jenkins uses mvn so we need to run mvn before submitting PR's - even
> > if sbt were used for day to day dev work
>
> To be clear, I think that the PR builder actually uses sbt
> <https://github.com/apache/spark/blob/master/dev/run-tests#L198> currently,
> but there are master builds that make sure maven doesn't break (amongst
> other things).
>
> > * In addition, as Sean has alluded to, IntelliJ seems to comprehend
> > the maven builds a bit more readily than sbt
>
> Yeah, this is a very good point. I have used `sbt/sbt gen-idea` in the
> past, but I'm currently using the maven integration of IntelliJ since it
> seems more stable.
>
> > * But for command line and day to day dev purposes: sbt sounds great to
> > use. Those sound bites you provided about exposing built-in test databases
> > for hive and for displaying available testcases are sweet. Any
> > easy/convenient way to see "more of" those kinds of facilities available
> > through sbt?
>
> The Spark SQL developer readme
> <https://github.com/apache/spark/tree/master/sql> has a little bit of this,
> but we really should have some documentation on using SBT as well.
>
> > Integrating with those systems is generally easier if you are also working
> > with Spark in Maven. (And I wouldn't classify all of those Maven-built
> > systems as "legacy", Michael :)
>
> Also a good point, though I've seen some pretty clever uses of sbt's
> external project references to link spark into other projects. I'll
> certainly admit I have a bias towards new shiny things in general though,
> so my definition of legacy is probably skewed :)