Hi Madhvi,
If I only install Spark on one node and use spark-submit to run an
application, which are the worker nodes? And where are the executors?
Thanks,
Xiaohe
On Thu, Apr 30, 2015 at 12:52 PM, madhvi madhvi.gu...@orkash.com wrote:
Hi,
You have to specify the worker nodes of the Spark cluster when you
configure the cluster.
Thanks
Madhvi
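For a standalone cluster, specifying the worker nodes usually means listing them in conf/slaves on the master; a minimal sketch (the hostnames below are hypothetical placeholders, and the commands assume you run them from the Spark installation directory on the master):

```shell
# List each worker host in conf/slaves, one hostname per line
# (worker1/worker2 are placeholder names for your machines)
echo "worker1.example.com" >  conf/slaves
echo "worker2.example.com" >> conf/slaves

# Launch the master, then a worker on every host listed above
# (requires passwordless SSH from the master to each worker)
sbin/start-all.sh
```

This applies to Spark's standalone mode; in YARN mode, YARN manages the worker machines and no such list is needed.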
On Thursday 30 April 2015 01:30 PM, xiaohe lan wrote:
You don't need to install Spark. Just download or build a Spark package
that matches your YARN version, and ensure that HADOOP_CONF_DIR or
YARN_CONF_DIR points to the directory containing the (client-side)
configuration files for the Hadoop cluster.
See instructions here:
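Concretely, a submission in yarn-cluster mode might look like the sketch below (the conf path is a placeholder for your cluster's Hadoop client configuration, and the SparkPi example jar ships with the Spark 1.x distribution):

```shell
# Point Spark at the Hadoop client-side configuration
# (/etc/hadoop/conf is a placeholder path)
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit in yarn-cluster mode: the driver and executors run in
# YARN containers, so Spark only needs to be installed on the
# machine you submit from.
bin/spark-submit \
  --master yarn-cluster \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar 10
```

In yarn-client mode (`--master yarn-client`), the driver instead runs in the submitting process, which is handy for interactive use.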
Hi,
Follow the instructions to install on the following link:
http://mbonaci.github.io/mbo-spark/
You don't need to install Spark on every node. Just install it on one
node, or install it on a remote system, and make a Spark cluster from it.
Thanks
Madhvi
On Thursday 30 April 2015 09:31 AM, xiaohe lan wrote:
Hi experts,
I see Spark on YARN has yarn-client and yarn-cluster modes. I also have a
5-node Hadoop cluster (Hadoop 2.4). How do I install Spark if I want to
try the Spark-on-YARN mode?
Do I need to install Spark on each node of the Hadoop cluster?
Thanks,
Xiaohe