Strange, even I'm seeing the same thing while running in local mode.

[image: Inline image 1]

I set it as .set("spark.executor.memory", "1g")
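
For what it's worth, here is a quick sketch of how to check what the JVM actually got; the snippet assumes a spark-shell session (where sc already exists). In local mode the executor runs inside the driver JVM, so the heap is governed by -Xmx / spark.driver.memory rather than spark.executor.memory:

  println(sc.getConf.get("spark.executor.memory", "<not set>"))   // what the Environment page reports
  println(Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB")   // the heap the local-mode executor actually lives in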

Thanks
Best Regards

On Mon, Mar 16, 2015 at 2:43 PM, Xi Shen <davidshe...@gmail.com> wrote:

> I set "spark.executor.memory" to "2048m". If the executor storage memory
> is 0.6 of executor memory, it should be 2g * 0.6 = 1.2g.
>
> My machine has 56GB memory, and 0.6 of that should be 33.6G...I hate math
> xD
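>
> For reference, a rough sketch of where the Executors-page number seems to come
> from in Spark 1.x, assuming the defaults spark.storage.memoryFraction = 0.6 and
> spark.storage.safetyFraction = 0.9 (treat the exact constants as an assumption
> for the version you are on):
>
>   // approximate the "Storage Memory" shown on the Executors page
>   val memoryFraction = 0.6                    // spark.storage.memoryFraction (default)
>   val safetyFraction = 0.9                    // spark.storage.safetyFraction (default)
>   val maxHeap = Runtime.getRuntime.maxMemory  // the JVM heap (-Xmx), not spark.executor.memory in local mode
>   val storage = (maxHeap * memoryFraction * safetyFraction).toLong
>   println(storage / (1024.0 * 1024.0) + " MB")
>
> With the default 512m heap this comes out to roughly 265 MB, which would explain
> the 265.4MB in the original mail; with a real 2g heap it would be around 1.1 GB.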
>
>
> On Mon, Mar 16, 2015 at 7:59 PM Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> How much memory do you have on your machine? I think the default value is 0.6
>> of spark.executor.memory, as you can see here
>> <http://spark.apache.org/docs/1.2.1/configuration.html#execution-behavior>.
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Mar 16, 2015 at 2:26 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>
>>> Hi Akhil,
>>>
>>> Yes, you are right. If I run the program from the IDE as a normal Java
>>> program, the executor's memory is increased... but not to 2048m; it is set
>>> to 6.7GB... Looks like there's some formula to calculate this value.
>>>
>>>
>>> Thanks,
>>> David
>>>
>>>
>>> On Mon, Mar 16, 2015 at 7:36 PM Akhil Das <ak...@sigmoidanalytics.com>
>>> wrote:
>>>
>>>> By default spark.executor.memory is set to 512m. I'm assuming that since you
>>>> are submitting the job with spark-submit in local mode, it is not able to
>>>> override that value. Can you try it without spark-submit, as a standalone
>>>> project?
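>>>>
>>>> A minimal sketch of that standalone route (object and app names are just
>>>> illustrative); run it straight from the IDE or sbt, and note that in local
>>>> mode the heap you give the JVM (e.g. -Xmx2g in the run configuration) is
>>>> what the executor actually gets:
>>>>
>>>>   import org.apache.spark.{SparkConf, SparkContext}
>>>>
>>>>   object StandaloneMemoryTest {
>>>>     def main(args: Array[String]): Unit = {
>>>>       val conf = new SparkConf()
>>>>         .setAppName("standalone-memory-test")
>>>>         .setMaster("local[*]")
>>>>         .set("spark.executor.memory", "2048m")  // shows up on the Environment page
>>>>       val sc = new SparkContext(conf)
>>>>       println(sc.parallelize(1 to 1000000).count())
>>>>       Thread.sleep(60000)  // keep http://localhost:4040 up for a minute to inspect the Executors tab
>>>>       sc.stop()
>>>>     }
>>>>   }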
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <davidshe...@gmail.com> wrote:
>>>>
>>>>> I set it in code, not through configuration. I submit my jar file in local
>>>>> mode; I am working in my development environment.
>>>>>
>>>>> On Mon, 16 Mar 2015 18:28 Akhil Das <ak...@sigmoidanalytics.com>
>>>>> wrote:
>>>>>
>>>>>> How are you setting it? And how are you submitting the job?
>>>>>>
>>>>>> Thanks
>>>>>> Best Regards
>>>>>>
>>>>>> On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <davidshe...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> I have set spark.executor.memory to 2048m, and on the UI "Environment"
>>>>>>> page I can see this value has been set correctly. But on the "Executors"
>>>>>>> page, I see there is only 1 executor and its memory is 265.4MB. A very
>>>>>>> strange value. Why not 256MB, or simply what I set?
>>>>>>>
>>>>>>> What am I missing here?
>>>>>>>
>>>>>>>
>>>>>>> Thanks,
>>>>>>> David
>>>>>>>
>>>>>>>
>>>>>>
>>>>
>>
