Can Dependencies Be Resolved on Spark Cluster?

2015-06-29 Thread SLiZn Liu
Hey Spark Users, I'm writing a demo with Spark and HBase. What I've done is package a **fat jar**: declare the dependencies in `build.sbt`, then use `sbt assembly` to bundle **all dependencies** into one big jar. The remaining work is to copy the fat jar to the Spark master node and launch the job with `spark-submit`.
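For context, a minimal sketch of the fat-jar setup described above, assuming the sbt-assembly plugin; the project name and all version numbers are illustrative placeholders:

```
// project/plugins.sbt -- pulls in the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
```

```
// build.sbt -- declare dependencies, keep Spark itself out of the jar
name := "spark-hbase-demo"
scalaVersion := "2.10.4"

// Spark is already installed on the cluster, so mark it "provided"
// to keep it (and its transitive dependencies) out of the fat jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0" % "provided"
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.1.1"
```

Running `sbt assembly` then emits a single bundled jar under `target/scala-2.10/`, which is what gets copied to the master node for `spark-submit`.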

Re: Can Dependencies Be Resolved on Spark Cluster?

2015-06-29 Thread Burak Yavuz
You can pass `--packages your:comma-separated:maven-dependencies` to `spark-submit` if you have Spark 1.3 or greater. Best regards, Burak
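For instance, with the HBase artifact mentioned later in this thread, a submit line might look like the following; the application class and jar names are hypothetical:

```
spark-submit \
  --packages org.apache.hbase:hbase:1.1.1 \
  --class com.example.HBaseDemo \
  demo.jar
```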

Re: Can Dependencies Be Resolved on Spark Cluster?

2015-06-29 Thread SLiZn Liu
Hi Burak, Is the `--packages` flag only available for Maven coordinates, with no sbt support?

Re: Can Dependencies Be Resolved on Spark Cluster?

2015-06-30 Thread Burak Yavuz
Hi, Take the dependencies you have in your build.sbt file (hopefully there aren't too many of them, even if they each pull in a lot of transitive dependencies), for example:

```
libraryDependencies += "org.apache.hbase" % "hbase" % "1.1.1"
libraryDependencies += "junit" % "junit" % "x"
resolvers += "Some other repo" at "http://some.other.repo"
```
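In other words, each sbt coordinate `"group" % "artifact" % "version"` maps directly to a Maven coordinate `group:artifact:version` for `--packages`, and any extra resolver goes to `--repositories`. A sketch under that reading, reusing the placeholder repository URL and versions from above:

```
spark-submit \
  --packages org.apache.hbase:hbase:1.1.1,junit:junit:x \
  --repositories http://some.other.repo \
  ...
```

Spark resolves these coordinates through Ivy on the driver, so the jars no longer need to be baked into an assembly.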

Re: Can Dependencies Be Resolved on Spark Cluster?

2015-07-01 Thread SLiZn Liu
Thanks for the enlightening solution!