Awesome, thanks a TON. It works!
There is a clash in the UI port initially, but it looks like it creates a second UI
port at 4041 for the second user wanting to use the spark-shell:

14/10/01 17:34:38 INFO JettyUtils: Failed to create UI at port, 4040. Trying again.
14/10/01 17:34:38 INFO JettyUtils: Error was: Failure(java.net.BindException: Address already in use)
14/10/01 17:34:38 INFO SparkUI: Started SparkUI at http://hadoop02:4041
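(That retry is expected: the Spark web UI probes the next port when the default 4040 is taken. If the bind-error noise is a concern, the documented `spark.ui.port` property can pin each session's UI to a known free port instead. A sketch only; the port value is illustrative and the exact way to pass properties varies by Spark version:)

```shell
# Hypothetical: pin the second user's UI to a known free port (4050 here)
# so it never collides with the first shell's default 4040.
SPARK_JAVA_OPTS="-Dspark.ui.port=4050" spark-shell
```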
sanjay
From: Matei Zaharia <matei.zaha...@gmail.com>
To: Sanjay Subramanian <sanjaysubraman...@yahoo.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Sent: Wednesday, October 1, 2014 5:19 PM
Subject: Re: Multiple spark shell sessions
You need to set --total-executor-cores to limit how many total cores it grabs
on the cluster. --executor-cores only limits each individual executor, and Spark
will still try to launch many of them.
Matei


On Oct 1, 2014, at 4:29 PM, Sanjay Subramanian 
<sanjaysubraman...@yahoo.com.INVALID> wrote:

hey guys

I am using spark 1.0.0+cdh5.1.0+41.
When two users try to run "spark-shell", the first user's spark-shell shows as
active in the 18080 web UI, but the second user's shows WAITING; that shell
prints a bunch of errors but does reach the spark-shell prompt, and "sc.master"
seems to point to the correct master.

I tried controlling the number of cores with the "spark-shell" flag
--executor-cores 8
but that does not work.

thanks

sanjay