You can add it in conf/spark-defaults.conf:

 spark.executor.extraJavaOptions  -XX:+PrintGCDetails
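
For raising the executor heap itself (the -Xmx1024m visible in the log
below), the usual knob on YARN is spark.executor.memory rather than a raw
-Xmx flag; Spark will normally reject a max-heap setting placed in
spark.executor.extraJavaOptions. A minimal spark-defaults.conf sketch,
where the 4g/2g values are placeholder assumptions to tune for your
cluster, not recommendations:

 # executor heap: this becomes the executor JVM -Xmx
 spark.executor.memory   4g
 # driver heap (the driver runs inside the cluster in cluster mode)
 spark.driver.memory     2g
 spark.executor.extraJavaOptions  -XX:+PrintGCDetails

The same can be set per job on the command line, e.g.
spark-submit --executor-memory 4g --driver-memory 2g ...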

FYI

On Fri, Oct 9, 2015 at 3:07 AM, Ramkumar V <ramkumar.c...@gmail.com> wrote:

> How do I increase the Xmx of the workers?
>
> *Thanks*,
> <https://in.linkedin.com/in/ramkumarcs31>
>
>
> On Mon, Oct 5, 2015 at 3:48 PM, Ramkumar V <ramkumar.c...@gmail.com>
> wrote:
>
>> No, I didn't try to increase Xmx.
>>
>> *Thanks*,
>> <https://in.linkedin.com/in/ramkumarcs31>
>>
>>
>> On Mon, Oct 5, 2015 at 1:36 PM, Jean-Baptiste Onofré <j...@nanthrax.net>
>> wrote:
>>
>>> Hi Ramkumar,
>>>
>>> Did you try to increase the Xmx of the workers?
>>>
>>> Regards
>>> JB
>>>
>>> On 10/05/2015 08:56 AM, Ramkumar V wrote:
>>>
>>>> Hi,
>>>>
>>>> When I submit a Java Spark job in cluster mode, I'm getting the
>>>> following exception.
>>>>
>>>> *LOG TRACE :*
>>>>
>>>> INFO yarn.ExecutorRunnable: Setting up executor with commands:
>>>> List({{JAVA_HOME}}/bin/java, -server, -XX:OnOutOfMemoryError='kill %p',
>>>> -Xms1024m, -Xmx1024m, -Djava.io.tmpdir={{PWD}}/tmp, '-Dspark.ui.port=0',
>>>> '-Dspark.driver.port=48309', -Dspark.yarn.app.container.log.dir=<LOG_DIR>,
>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend, --driver-url,
>>>> akka.tcp://sparkDriver@ip:port/user/CoarseGrainedScheduler, --executor-id,
>>>> 2, --hostname, hostname, --cores, 1, --app-id,
>>>> application_1441965028669_9009, --user-class-path, file:$PWD/__app__.jar,
>>>> --user-class-path, file:$PWD/json-20090211.jar, 1>, <LOG_DIR>/stdout,
>>>> 2>, <LOG_DIR>/stderr).
>>>>
>>>> I have a cluster of 11 machines (9 with 64 GB of memory and 2 with 32
>>>> GB of memory). My input data is 128 GB in size.
>>>>
>>>> How can I solve this exception? Does it depend on the driver.memory
>>>> and executor.memory settings?
>>>>
>>>>
>>>> *Thanks*,
>>>> <https://in.linkedin.com/in/ramkumarcs31>
>>>>
>>>>
>>> --
>>> Jean-Baptiste Onofré
>>> jbono...@apache.org
>>> http://blog.nanthrax.net
>>> Talend - http://www.talend.com
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>
