This looks promising. I'm trying to use spark-ec2 to launch a cluster with
Spark 1.5.0-SNAPSHOT and failing.
Where should we ask questions and report problems?
A couple of questions I have already after looking through the project:
- Where does the configuration file /spark-deployer.conf/ go?
Please ask questions at the gitter channel for now.
https://gitter.im/pishen/spark-deployer
- spark-deployer.conf should be placed in your project's root directory
(beside build.sbt)
- To use the nightly builds, you can replace the value of spark-tgz-url
in spark-deployer.conf with the URL of the tgz you want
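To make the two answers above concrete, here is a minimal sketch. Only the file name, its location beside build.sbt, and the spark-tgz-url key are confirmed in this thread; the URL shown is a placeholder, and any other keys the plugin needs are not listed here:

```
my-project/
├── build.sbt
└── spark-deployer.conf    # placed in the project root, beside build.sbt

# in spark-deployer.conf, point spark-tgz-url at the build you want,
# e.g. a nightly tgz (placeholder URL below):
spark-tgz-url = "<URL of the Spark tgz to deploy>"
```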
Thank you for the suggestion. Actually, this project has already been on
spark-packages for 1~2 months.
Then I think what I need is some promotion :P
2015-08-25 23:51 GMT+08:00 saurfang [via Apache Spark Developers List]
ml-node+s1001551n1380...@n3.nabble.com:
This is very cool. I also have an sbt
You can add it to the spark packages, I guess: http://spark-packages.org/
Thanks
Best Regards
On Fri, Aug 14, 2015 at 1:45 PM, pishen tsai pishe...@gmail.com wrote:
Hello,
I have written an sbt plugin called spark-deployer, which is able to
deploy a standalone Spark cluster on AWS EC2 and submit jobs to it.
https://github.com/pishen/spark-deployer
Compared to the current spark-ec2 script, this design may have several
benefits (features):
1. All the code are
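Since the plugin drives everything from sbt, the deploy-and-submit workflow described above might look roughly like the session below. The task names here are hypothetical, chosen for illustration only; the project's README has the real ones:

```
# inside the sbt shell of a project that enables the plugin
# (task names below are assumptions, not confirmed by this thread)
sbt> sparkCreateCluster     # launch master + workers on EC2
sbt> sparkSubmitJob         # assemble the project and submit it to the cluster
sbt> sparkDestroyCluster    # tear the cluster down when done
```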