Hi all,
I want to use two nodes for a test, one as the master and the other as a worker.
Can I submit the example application included in the Spark source code
tarball on the master so that it runs on the worker?
What should I do?
BR,
Theo
This is how the Spark cluster looks:
Your driver program (the example application) can be run on the master (or
anywhere that has access to the master / cluster manager), and the workers
will execute the tasks.
Thanks
Best Regards
On Fri, Oct 10, 2014 at 2:47 PM, Theodore Si sjyz...@gmail.com wrote:
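For context, a two-node standalone cluster like the one described can be brought up roughly as follows. This is only a sketch, assuming the standalone cluster manager (not YARN or Mesos), with "master-host" as a placeholder hostname and the Spark tarball unpacked in the same location on both machines:

```shell
# On the master node: start the standalone master.
# It logs a cluster URL of the form spark://master-host:7077
# and serves a web UI on port 8080.
./sbin/start-master.sh

# On the worker node: start a worker and register it with the master
# by passing the master's cluster URL.
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://master-host:7077
```

Once the worker shows up in the master's web UI, applications submitted against spark://master-host:7077 will run their tasks on the worker.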
But I cannot do this by using
./bin/run-example SparkPi 10
right?
On Fri, Oct 10, 2014 at 6:04 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
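For what it's worth, in Spark 1.x the run-example script appears to honor a MASTER environment variable (defaulting to local[*] when unset), so pointing it at the standalone master may work; "master-host" is a placeholder hostname:

```shell
# Run the bundled SparkPi example against a standalone master
# instead of local mode (assuming MASTER is respected by run-example).
MASTER=spark://master-host:7077 ./bin/run-example SparkPi 10
```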
Should I pack the example into a jar file and submit it on the master?
On Fri, Oct 10, 2014 at 9:32 PM, Theodore Si sjyz...@gmail.com wrote:
But I cannot do this by using
./bin/run-example SparkPi 10
right?
On Fri, Oct 10, 2014 at 6:04 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Yes, you can run it with --master=spark://your-spark-uri:7077, I believe.
Thanks
Best Regards
On Fri, Oct 10, 2014 at 7:03 PM, Theodore Si sjyz...@gmail.com wrote:
Should I pack the example into a jar file and submit it on the master?
On Fri, Oct 10, 2014 at 9:32 PM, Theodore Si sjyz...@gmail.com
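As a concrete sketch of that suggestion: the examples are already packaged in a jar shipped inside the tarball, so there is no need to repack them; they can be submitted explicitly with spark-submit. The jar file name below is illustrative — check the lib/ directory of your build for the exact name, and replace "master-host" with your master's hostname:

```shell
# Submit the bundled SparkPi example to a standalone master.
# The tasks will be scheduled onto the registered worker(s).
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://master-host:7077 \
  lib/spark-examples-1.1.0-hadoop2.4.0.jar 10
```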
spark-submit --class "Classname" --master yarn-cluster jarfile (with complete path)
This should work.
On Fri, Oct 10, 2014 at 8:36 AM, Akhil Das ak...@sigmoidanalytics.com
wrote:
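Spelled out with placeholder values (the class name, jar path, and application argument below are illustrative), that yarn-cluster submission might look like the following; it assumes HADOOP_CONF_DIR points at a working YARN configuration:

```shell
# Submit the bundled SparkPi example in yarn-cluster mode:
# the driver itself runs inside a YARN container, not on the submitting machine.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  /path/to/spark-examples.jar 10
```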