can't submit my application on standalone spark cluster

2014-08-06 Thread Andres Gomez Ferrer
Hi all, my name is Andres and I'm starting to use Apache Spark. I tried to submit my spark.jar to my cluster using this: spark-submit --class net.redborder.spark.RedBorderApplication --master spark://pablo02:7077 redborder-spark-selfcontained.jar But when I did, my worker died .. and my
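When a worker dies right after a submit, the usual first step is to read the worker's log on the affected node to find the stack trace. A minimal sketch, assuming a default Spark layout (the SPARK_HOME path and the log file name pattern are assumptions; adjust them to your install):

    # On the worker node (e.g. pablo02), tail the most recent worker log
    # to see the error that caused the process to exit.
    cd $SPARK_HOME/logs
    tail -n 100 spark-*-org.apache.spark.deploy.worker.Worker-*.out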

Re: can't submit my application on standalone spark cluster

2014-08-06 Thread Akhil Das
Looks like a netty conflict there; most likely you have multiple versions of the netty jars (e.g. netty-3.6.6.Final.jar, netty-3.2.2.Final.jar, netty-all-4.0.13.Final.jar). You only require 3.6.6, I believe, so a quick fix would be to remove the rest of them. Thanks Best Regards On Wed, Aug 6, 2014
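A hedged sketch of that clean-up, assuming the duplicate jars sit in the application's lib directory or alongside the Spark libs (the directories searched and the exact files removed below are assumptions; list first, then delete only the extra versions):

    # List every netty jar that could end up on the classpath.
    find $SPARK_HOME/lib ./lib -name 'netty*.jar' 2>/dev/null
    # Keep netty-3.6.6.Final.jar and remove the conflicting versions, for example:
    # rm ./lib/netty-3.2.2.Final.jar ./lib/netty-all-4.0.13.Final.jar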

Re: can't submit my application on standalone spark cluster

2014-08-06 Thread Andrew Or
Hi Andres, If you're using the EC2 scripts to start your standalone cluster, you can use ~/spark-ec2/copy-dir --delete ~/spark to sync your jars across the cluster. Note that you will need to restart the Master and the Workers afterwards through sbin/stop-all.sh and sbin/start-all.sh. If you're
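Put together, the sync-and-restart sequence described above looks roughly like this, run on the master node; the paths follow the usual spark-ec2 layout and are assumptions if your install lives elsewhere:

    # Push the updated ~/spark directory (including jars) to every slave.
    ~/spark-ec2/copy-dir --delete ~/spark
    # Then restart the Master and Workers so they pick up the new jars.
    ~/spark/sbin/stop-all.sh
    ~/spark/sbin/start-all.sh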