Nan Zhu, it's the latter, I want to distribute the tasks to the cluster
[machines available].

If I set SPARK_MASTER_IP on the other machines and list the slave IPs in
conf/slaves on the master node, will the code I run in the interactive shell
on the master get distributed across the multiple machines?
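
For reference, here is a minimal sketch of the setup I have in mind (the
hostnames are placeholders, and I am assuming a Spark 0.9-style standalone
layout):

    # conf/slaves on the master node: one worker hostname per line
    worker1
    worker2

    # conf/spark-env.sh on each node
    export SPARK_MASTER_IP=master-host

    # run on the master node: starts the master plus a worker on every host in conf/slaves
    ./sbin/start-all.sh

    # launch the interactive shell against the standalone master URL
    MASTER=spark://master-host:7077 ./bin/spark-shell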





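A quick way to check that the work is really being distributed (assuming the
setup sketched above): run a small job from the shell and look at the master
web UI at http://master-host:8080 and the application UI on port 4040, which
show which host ran each task. For example:

    // 10 partitions, so the count spreads 10 tasks over the available workers
    sc.parallelize(1 to 1000000, 10).map(_ * 2).count()
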
On Wed, Mar 26, 2014 at 6:32 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:

>  what do you mean by run across the cluster?
>
> you want to start the spark-shell across the cluster or you want to
> distribute tasks to multiple machines?
>
> if it's the former, yes, as long as you indicate the right master URL
>
> if it's the latter, also yes; you can observe the distributed tasks in the
> Spark UI
>
> --
> Nan Zhu
>
> On Wednesday, March 26, 2014 at 8:54 AM, Sai Prasanna wrote:
>
> Is it possible to run across a cluster using the Spark interactive shell?
>
> To be more explicit, is the procedure similar to running a standalone
> master-slave Spark cluster?
>
> I want to execute my code in the interactive shell on the master node,
> and it should run across the cluster [say 5 nodes]. Is the procedure
> similar?
>
>
>
>
>
> --
> *Sai Prasanna. AN*
> *II M.Tech (CS), SSSIHL*
>
>
> *Entire water in the ocean can never sink a ship, Unless it gets inside.
> All the pressures of life can never hurt you, Unless you let them in.*
>
>
>


-- 
*Sai Prasanna. AN*
*II M.Tech (CS), SSSIHL*


*Entire water in the ocean can never sink a ship, Unless it gets inside. All
the pressures of life can never hurt you, Unless you let them in.*
