Re: How to set Spark executor memory?

2015-03-22 Thread Xi Shen
OK, I actually got the answer days ago from StackOverflow, but I did not
check it :(

When running in local mode, to set the executor memory:

- when using spark-submit, use --driver-memory
- when running as a Java application, e.g. executing from an IDE, set the
  -Xmx JVM option
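
For example, a minimal sketch (the class and jar names here are just placeholders):

  spark-submit --master local[4] --driver-memory 4g \
    --class com.example.MyApp my-app.jar

or, when launching from the IDE, add a VM option such as -Xmx4g to the run
configuration.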


Thanks,
David



Re: How to set Spark executor memory?

2015-03-21 Thread Xi Shen
Hi Sean,

It's getting strange now. If I run from the IDE, my executor memory is always
set to 6.7GB, no matter what value I set in code. I have checked my
environment variables, and there is no value of 6.7 or 12.5 anywhere.

Any idea?

Thanks,
David


Re: How to set Spark executor memory?

2015-03-21 Thread Sean Owen
If you are running from your IDE, then I don't know what you are
running or in what mode. The discussion here concerns using standard
mechanisms like spark-submit to configure executor memory. Please try
these first instead of trying to directly invoke Spark, which will
require more understanding of how the props are set.




Re: How to set Spark executor memory?

2015-03-21 Thread Xi Shen
In the log, I saw

  MemoryStore: MemoryStore started with capacity 6.7GB

But I still cannot find where to set this storage capacity.



Re: How to set Spark executor memory?

2015-03-21 Thread Xi Shen
Yeah, I think it is harder to troubleshoot these property issues in an IDE.
But the reason I stick to the IDE is that if I use spark-submit, the native
BLAS library cannot be loaded. Maybe I should open another thread to discuss
that.

Thanks,
David



Re: How to set Spark executor memory?

2015-03-21 Thread Ted Yu
bq. the BLAS native cannot be loaded

Have you tried specifying the --driver-library-path option?
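
For example, something along these lines (the library path and class/jar names
are placeholders for wherever your native BLAS libraries actually live):

  spark-submit --driver-library-path /path/to/native/blas \
    --class com.example.MyApp my-app.jar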

Cheers



RE: How to set Spark executor memory?

2015-03-16 Thread jishnu.prathap
Hi Xi Shen,

You could set spark.executor.memory in the code itself:
new SparkConf().set("spark.executor.memory", "2g")
Or you can pass --conf spark.executor.memory=2g while submitting the jar.
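
For illustration, a minimal sketch of the in-code version (the app name and
master are placeholders):

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setAppName("MyApp")                  // placeholder
    .setMaster("local[*]")                // placeholder; or pass --master to spark-submit
    .set("spark.executor.memory", "2g")
  val sc = new SparkContext(conf)

Note that in local mode the executor runs inside the driver JVM, so, as
discussed elsewhere in this thread, the driver memory (--driver-memory or
-Xmx) is what actually takes effect there.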

Regards
Jishnu Prathap



Re: How to set Spark executor memory?

2015-03-16 Thread Sean Owen
There are a number of small misunderstandings here.

In the first instance, the executor memory is not actually being set
to 2g and the default of 512m is being used. If you are writing code
to launch an app, then you are duplicating what spark-submit does and
are not using spark-submit. If you do use spark-submit, your in-code
configuration happens too late.

The memory you see in the UI is not total executor memory. It is
memory available for caching. The default formula is actually 0.6 *
0.9 * total, not 0.6 * total.

This is not a function of your machine's total memory, but of the
configured executor memory.

If this value is 6.7GB, it implies that you somehow configured the
executors to use 12.4GB of memory. Double-check for typos and maybe
confirm what figure you are quoting here.
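
As a rough sanity check of those figures (assuming the Spark 1.x defaults
spark.storage.memoryFraction = 0.6 and spark.storage.safetyFraction = 0.9):

  0.6 * 0.9 = 0.54
  0.54 * 512MB  ≈ 276MB   (the ballpark of the 265.4MB reported earlier; the
                           exact figure depends on the JVM's reported max heap)
  0.54 * 12.4GB ≈ 6.7GB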

In the last instance -- you are looking at driver memory, not executor
memory. The 1g you are trying to configure affects executors.




Re: How to set Spark executor memory?

2015-03-16 Thread Akhil Das
How are you setting it? And how are you submitting the job?

Thanks
Best Regards

On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen davidshe...@gmail.com wrote:

 Hi,

 I have set spark.executor.memory to 2048m, and in the UI Environment
 page, I can see this value has been set correctly. But in the Executors
 page, I saw there is only 1 executor and its memory is 265.4MB. Very strange
 value. Why not 256MB, or just the value I set?

 What am I missing here?


 Thanks,
 David




Re: How to set Spark executor memory?

2015-03-16 Thread Xi Shen
I set it in code, not by configuration. I submit my jar file in local mode. I
am working in my development environment.




Re: How to set Spark executor memory?

2015-03-16 Thread Akhil Das
By default spark.executor.memory is set to 512m. I'm assuming that since you
are submitting the job using spark-submit, it is not able to override the
value because you are running in local mode. Can you try it without using
spark-submit, as a standalone project?

Thanks
Best Regards




Re: How to set Spark executor memory?

2015-03-16 Thread Xi Shen
Hi Akhil,

Yes, you are right. If I run the program from the IDE as a normal Java
program, the executor's memory is increased... but not to 2048m; it is set to
6.7GB... Looks like there is some formula that calculates this value.


Thanks,
David






Re: How to set Spark executor memory?

2015-03-16 Thread Akhil Das
How much memory do you have on your machine? I think the default value is
0.6 of spark.executor.memory, as you can see here:
http://spark.apache.org/docs/1.2.1/configuration.html#execution-behavior.
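
That 0.6 is spark.storage.memoryFraction in Spark 1.x; purely as an
illustration, it can be tuned like any other setting:

  new SparkConf()
    .set("spark.executor.memory", "2g")
    .set("spark.storage.memoryFraction", "0.4")  // fraction of heap used for Spark's cache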

Thanks
Best Regards





Re: How to set Spark executor memory?

2015-03-16 Thread Xi Shen
I set spark.executor.memory to 2048m. If the executor storage memory is
0.6 of executor memory, it should be 2g * 0.6 = 1.2g.

My machine has 56GB memory, and 0.6 of that should be 33.6G...I hate math xD





Re: How to set Spark executor memory?

2015-03-16 Thread Akhil Das
Strange, even I'm having it while running in local mode.


I set it as .set("spark.executor.memory", "1g")

Thanks
Best Regards
