There are a number of small misunderstandings here.

In the first instance, the executor memory is not actually being set
to 2g; the default of 512m is being used. If you are writing code to
launch the app yourself, then you are duplicating what spark-submit
does, and you don't use spark-submit at all. If you use spark-submit,
configuration set in code happens "too late": by then the JVM running
your driver (and, in local mode, your executor) has already started
with its heap fixed.
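
To illustrate (a rough sketch, with a made-up app name and master URL):
with spark-submit you would pass the memory on the command line, e.g.
"spark-submit --executor-memory 2g ...", whereas a self-launched app sets
it on the SparkConf before the SparkContext is created:

    // Sketch of a self-launched app (no spark-submit); names are placeholders.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("MyApp")
      .setMaster("spark://master:7077")     // hypothetical standalone master
      .set("spark.executor.memory", "2g")   // set before the context exists
    val sc = new SparkContext(conf)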

The memory you see in the UI is not total executor memory; it is the
memory available for caching. The default formula is actually
0.6 * 0.9 * total, not 0.6 * total.

This is not a function of your machine's total memory, but of the
configured executor memory.
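
As a rough worked example against the 265.4MB quoted below:

    0.6 * 0.9 * 512MB = ~276MB

which is close to what the UI reported; the remaining gap is because the
JVM exposes a little less than -Xmx as its usable max heap.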

If this value is 6.7GB, it implies that you somehow configured the
executors to use about 12.4GB of memory. Double-check for typos and
maybe confirm which figure you are quoting here.
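
(Working backwards from the same formula: 6.7GB / (0.6 * 0.9) = ~12.4GB.)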

In the last instance, you are looking at driver memory, not executor
memory. The 1g you are trying to configure only affects executors,
and in local mode the executor runs inside the driver JVM, so that
setting does not change the heap you are seeing.
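
A minimal sketch of the local-mode situation (names are placeholders): the
executor shares the driver's JVM, whose heap was fixed when that JVM was
launched, e.g. by the IDE's -Xmx setting or spark-submit's --driver-memory
flag, so spark.executor.memory set in code cannot change it:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("LocalApp")                // placeholder name
      .setMaster("local[*]")
      .set("spark.executor.memory", "1g")    // no effect on a local "executor"
    val sc = new SparkContext(conf)

    // The storage memory shown in the UI is derived from the JVM's max heap:
    println(Runtime.getRuntime.maxMemory)    // reflects -Xmx, not the 1g above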

On Mon, Mar 16, 2015 at 9:21 AM, Akhil Das <ak...@sigmoidanalytics.com> wrote:
> Strange, even I'm having it while running in local mode.
>
>
>
> I set it as .set("spark.executor.memory", "1g")
>
> Thanks
> Best Regards
>
> On Mon, Mar 16, 2015 at 2:43 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>
>> I set "spark.executor.memory" to "2048m". If the executor storage memory
>> is 0.6 of executor memory, it should be 2g * 0.6 = 1.2g.
>>
>> My machine has 56GB memory, and 0.6 of that should be 33.6G...I hate math
>> xD
>>
>>
>> On Mon, Mar 16, 2015 at 7:59 PM Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>>
>>> How much memory do you have on your machine? I think the default value is
>>> 0.6 of spark.executor.memory, as you can see from here.
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Mon, Mar 16, 2015 at 2:26 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>>>
>>>> Hi Akhil,
>>>>
>>>> Yes, you are right. If I run the program from the IDE as a normal Java
>>>> program, the executor's memory is increased...but not to 2048m, it is set
>>>> to 6.7GB...Looks like there's some formula to calculate this value.
>>>>
>>>>
>>>> Thanks,
>>>> David
>>>>
>>>>
>>>> On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
>>>> wrote:
>>>>>
>>>>> By default spark.executor.memory is set to 512m. I'm assuming that since
>>>>> you are submitting the job using spark-submit, it is not able to override
>>>>> the value because you are running in local mode. Can you try it without
>>>>> using spark-submit, as a standalone project?
>>>>>
>>>>> Thanks
>>>>> Best Regards
>>>>>
>>>>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>>>>>
>>>>>> I set it in code, not by configuration. I submit my jar file to local.
>>>>>> I am working in my developer environment.
>>>>>>
>>>>>>
>>>>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com>
>>>>>> wrote:
>>>>>>>
>>>>>>> How are you setting it? and how are you submitting the job?
>>>>>>>
>>>>>>> Thanks
>>>>>>> Best Regards
>>>>>>>
>>>>>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <davidshe...@gmail.com>
>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I have set spark.executor.memory to 2048m, and in the UI
>>>>>>>> "Environment" page I can see this value has been set correctly. But in
>>>>>>>> the "Executors" page, I see there's only 1 executor and its memory is
>>>>>>>> 265.4MB. Very strange value. Why not 256MB, or just what I set?
>>>>>>>>
>>>>>>>> What am I missing here?
>>>>>>>>
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> David
>>>>>>>>
>>>>>>>
>>>>>
>>>
>
