Re: How to run customized Spark on EC2?

2015-04-30 Thread Akhil Das
This is how I used to do it:

- Log in to the EC2 cluster (master)
- Make your changes to Spark and build it
- Stop the old Spark installation (sbin/stop-all.sh)
- Copy the old installation's conf/* into the modified version's conf/
- Rsync the modified version to all slaves
- Run sbin/start-all.sh from the modified version
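The steps above could be sketched roughly as the script below. The paths, the slave list, and the DRY_RUN wrapper are all assumptions (based on the default spark-ec2 layout, where Spark lives under /root/spark and slaves are listed in conf/slaves); adjust them for your cluster before running for real:

```shell
#!/usr/bin/env bash
# Sketch of the redeploy steps, not an official procedure. All paths and
# the slave list are assumptions; DRY_RUN=1 (the default here) only prints
# the plan instead of executing it.
set -euo pipefail

OLD_SPARK="${OLD_SPARK:-/root/spark}"           # existing installation (assumed path)
NEW_SPARK="${NEW_SPARK:-/root/spark-modified}"  # your rebuilt tree (assumed path)
SLAVES="${SLAVES:-slave1 slave2}"               # on a real cluster: $(cat "$OLD_SPARK/conf/slaves")

run() {  # print the command when DRY_RUN=1, otherwise execute it
  if [ "${DRY_RUN:-1}" = 1 ]; then echo "+ $*"; else "$@"; fi
}

run "$OLD_SPARK/sbin/stop-all.sh"                 # 1. stop the old installation
run cp -r "$OLD_SPARK/conf/." "$NEW_SPARK/conf/"  # 2. carry the old conf/* over
for slave in $SLAVES; do                          # 3. rsync the modified tree to each slave
  run rsync -az --delete "$NEW_SPARK/" "root@$slave:$NEW_SPARK/"
done
run "$NEW_SPARK/sbin/start-all.sh"                # 4. start from the modified version
```

Run it once as-is to review the printed plan, then re-run with DRY_RUN=0.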

You can also simply replace the assembly jar (on the master and on each worker) with
the newly built jar, provided the versions are all the same.
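That "same version" check can be sketched by comparing the version embedded in the assembly jar's file name before swapping; the jar names below are hypothetical (a Spark 1.3.1 / Hadoop 2.4.0 build), and the actual copy is left commented out:

```shell
#!/usr/bin/env bash
# Sketch only: jar names, paths, and hosts are assumptions, not the
# thread author's exact setup.
set -euo pipefail

new_jar="spark-assembly-1.3.1-hadoop2.4.0.jar"   # from your local build (assumed name)
old_jar="spark-assembly-1.3.1-hadoop2.4.0.jar"   # currently on the cluster (assumed name)

# Extract the Spark version from a jar name like spark-assembly-<version>-<rest>.jar
version() { echo "$1" | sed -E 's/spark-assembly-([0-9.]+)-.*/\1/'; }

if [ "$(version "$new_jar")" = "$(version "$old_jar")" ]; then
  echo "versions match ($(version "$new_jar")); safe to swap the jar"
  # scp "assembly/target/scala-2.10/$new_jar" "root@master:/root/spark/lib/"
  # ...and the same scp for every worker listed in conf/slaves
else
  echo "version mismatch; do a full rsync instead" >&2
fi
```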


Thanks
Best Regards

On Tue, Apr 28, 2015 at 10:59 PM, Bo Fu b...@uchicago.edu wrote:

 Hi experts,

 I have an issue. I added some timestamps in Spark source code and built it
 using:

 mvn package -DskipTests

 I checked the new version on my own computer and it works. However, when I
 ran Spark on EC2, the EC2 machines still ran the original version.

 Does anyone know how to deploy the changed Spark source code to EC2?
 Thanks a lot


 Bo Fu
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org




How to run customized Spark on EC2?

2015-04-28 Thread Bo Fu
Hi experts,

I have an issue. I added some timestamps in Spark source code and built it 
using:

mvn package -DskipTests

I checked the new version on my own computer and it works. However, when I ran
Spark on EC2, the EC2 machines still ran the original version.

Does anyone know how to deploy the changed Spark source code to EC2?
Thanks a lot


Bo Fu