Yeah, I think it is harder to troubleshoot properties issues in an IDE.
But the reason I stick to the IDE is that if I use spark-submit, the
native BLAS libraries cannot be loaded. Maybe I should open another
thread to discuss that.
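
(For reference, a guess I haven't verified: netlib-java loads native BLAS
from java.library.path, so passing that path through spark-submit might
work; the library path below is just a placeholder.)

  spark-submit --driver-java-options "-Djava.library.path=/opt/OpenBLAS/lib" ...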

Thanks,
David

On Sun, 22 Mar 2015 10:38 Xi Shen <davidshe...@gmail.com> wrote:

> In the log, I saw
>
>   MemoryStore: MemoryStore started with capacity 6.7GB
>
> But I still cannot find where to set this storage capacity.
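>
> (A hedged note on where that number likely comes from, assuming Spark 1.x
> semantics: in local mode the MemoryStore capacity is derived from the
> driver JVM's heap, not from spark.executor.memory, roughly as below.)
>
>   // Sketch of the Spark 1.x sizing, using the 1.x default fractions:
>   val maxMem  = Runtime.getRuntime.maxMemory   // JVM max heap at runtime
>   val storage = (maxMem * 0.6 * 0.9).toLong    // memoryFraction * safetyFraction
>
> (6.7GB is about 0.54 of a ~12.4GB heap, which would match an IDE-launched
> JVM defaulting -Xmx to roughly a quarter of physical memory.)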
>
> On Sat, 21 Mar 2015 20:30 Xi Shen <davidshe...@gmail.com> wrote:
>
>> Hi Sean,
>>
>> It's getting strange now. If I run from the IDE, my executor memory is
>> always set to 6.7G, no matter what value I set in code. I have checked my
>> environment variables, and there's no value of 6.7 or 12.5.
>>
>> Any idea?
>>
>> Thanks,
>> David
>>
>> On Tue, 17 Mar 2015 00:35 Jishnu Prathap <jishnu.prat...@wipro.com> wrote:
>>
>>>  Hi Xi Shen,
>>>
>>> You could set spark.executor.memory in the code itself:
>>>
>>>   new SparkConf().set("spark.executor.memory", "2g")
>>>
>>> Or you can pass --executor-memory 2g (or --conf
>>> spark.executor.memory=2g) to spark-submit when submitting the jar.
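>>>
>>> A minimal sketch putting both together (assuming Spark 1.x and a
>>> non-local master; in local mode the executor shares the driver JVM, so
>>> this setting has no effect there):
>>>
>>>   import org.apache.spark.{SparkConf, SparkContext}
>>>
>>>   // Set the per-executor heap before the context is created.
>>>   val conf = new SparkConf()
>>>     .setAppName("ExecutorMemoryExample")
>>>     .set("spark.executor.memory", "2g")
>>>   val sc = new SparkContext(conf)
>>>
>>> or, equivalently, on the command line (class and jar names are
>>> placeholders):
>>>
>>>   spark-submit --executor-memory 2g --class your.Main your-app.jar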
>>>
>>>
>>>
>>> Regards
>>>
>>> Jishnu Prathap
>>>
>>>
>>>
>>> *From:* Akhil Das [mailto:ak...@sigmoidanalytics.com]
>>> *Sent:* Monday, March 16, 2015 2:06 PM
>>> *To:* Xi Shen
>>> *Cc:* user@spark.apache.org
>>> *Subject:* Re: How to set Spark executor memory?
>>>
>>>
>>>
>>> By default spark.executor.memory is set to 512m. I'm assuming that since
>>> you are submitting the job with spark-submit in local mode, it is not able
>>> to override the value. Can you try running it without spark-submit, as a
>>> standalone project?
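>>>
>>> A hedged sketch of such a standalone check (assuming Spark 1.x and local
>>> mode, where the executor lives inside the driver JVM, so the JVM heap,
>>> not spark.executor.memory, bounds the MemoryStore):
>>>
>>>   import org.apache.spark.{SparkConf, SparkContext}
>>>
>>>   val conf = new SparkConf()
>>>     .setMaster("local[*]")
>>>     .setAppName("MemCheck")
>>>     .set("spark.executor.memory", "2g")  // ignored in local mode
>>>   val sc = new SparkContext(conf)
>>>   // This value, not spark.executor.memory, is what sizes the MemoryStore:
>>>   println(s"JVM max heap: ${Runtime.getRuntime.maxMemory / (1024 * 1024)} MB")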
>>>
>>>
>>>   Thanks
>>>
>>> Best Regards
>>>
>>>
>>>
>>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>>
>>> I set it in code, not via configuration. I submit my jar file with a local
>>> master. I am working in my development environment.
>>>
>>>
>>>
>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com> wrote:
>>>
>>> How are you setting it? And how are you submitting the job?
>>>
>>>
>>>   Thanks
>>>
>>> Best Regards
>>>
>>>
>>>
>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>>
>>> Hi,
>>>
>>>
>>>
>>> I have set spark.executor.memory to 2048m, and on the UI "Environment"
>>> page I can see the value has been set correctly. But on the "Executors"
>>> page I see there is only 1 executor, and its memory is 265.4MB. A very
>>> strange value. Why not 256MB, or just what I set?
>>>
>>>
>>>
>>> What am I missing here?
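>>>
>>> (A hedged aside on the arithmetic, assuming Spark 1.x defaults: the
>>> MemoryStore is sized as JVM max heap * 0.6 (spark.storage.memoryFraction)
>>> * 0.9 (spark.storage.safetyFraction). An -Xmx of 512m shows up as roughly
>>> 491MB via Runtime.maxMemory once a survivor space is deducted, and
>>> 491MB * 0.54 is about 265.4MB, matching the Executors page.)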
>>>
>>>
>>>
>>>
>>>
>>> Thanks,
>>>
>>> David
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
