There is a file $SPARK_HOME/conf/spark-env.sh that comes preconfigured with
the MASTER variable, so if you start pyspark or spark-shell from the EC2
login machine you will connect to the Spark master automatically; no
--master flag is needed.
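A quick illustration of what that looks like (the hostname below is just an
example; yours will differ):

    # The ec2 scripts write something like this into
    # $SPARK_HOME/conf/spark-env.sh (hostname is hypothetical):
    export MASTER=spark://ec2-54-0-0-1.compute-1.amazonaws.com:7077

    # With MASTER set, you can start the shell with no flag at all:
    $SPARK_HOME/bin/pyspark

    # which is equivalent to passing the URL explicitly:
    $SPARK_HOME/bin/pyspark --master spark://ec2-54-0-0-1.compute-1.amazonaws.com:7077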


On 29 Jan 2015, at 01:11, Mohit Singh <mohit1...@gmail.com> wrote:

> Hi,
>   Probably a naive question, but I am creating a Spark cluster on EC2 using 
> the ec2 scripts there.
> Is there a master param I need to set, e.g.
> ./bin/pyspark --master [ ] ?
> I don't yet fully understand the EC2 concepts, so I just wanted to confirm.
> Thanks
> 
> -- 
> Mohit
> 
> "When you want success as badly as you want the air, then you will get it. 
> There is no other secret of success."
> -Socrates

