...as lots of "Action" supporting it. So parallelism is dynamic, from job to
job, or even from stage to stage.

Yong
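A minimal local sketch (not from the thread, with made-up data sizes and
partition counts) of what Yong describes: the number of tasks a stage launches
follows the partitioning of the data at that point, so parallelism can change
from stage to stage within a single job.

    import org.apache.spark.{SparkConf, SparkContext}

    object StageParallelismSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("stage-parallelism-sketch").setMaster("local[4]"))

        // Stage before the shuffle: 100 partitions -> roughly 100 tasks.
        val pairs = sc.parallelize(1 to 1000000, numSlices = 100).map(i => (i % 7, 1))

        // Stage after the shuffle: 7 partitions -> roughly 7 tasks.
        val counts = pairs.reduceByKey(_ + _, 7)

        counts.count() // the "Action" that actually triggers the job
        sc.stop()
      }
    }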
--
Date: Wed, 27 May 2015 15:48:57 +0800
Subject: Re: How does spark manage the memory of executor with multiple tasks
From: ccn...@gmail.com
To: evo.efti...@isecc.com
CC: ar...@sigmoidanalytics.com; user@spark.apache.org

Can anyone answer my question? I am curious to know if there's multi...
Original message
From: Arush Kharbanda
Date: 2015/05/26 10:55 (GMT+00:00)
To: canan chen
Cc: Evo Eftimov, user@spark.apache.org
Subject: Re: How does spark manage the memory of executor with multiple tasks

Hi Evo,

Worker is the JVM and an executor runs on the JVM. And after Spark 1.4 you...
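A small sketch of the executor/JVM relationship being discussed here, assuming
a local[2] master purely for illustration (in local mode the "executor" is the
driver JVM): every task runs as a thread inside one JVM, so asking the runtime
for its heap from inside a task reports that shared executor heap, not a
per-task amount.

    import org.apache.spark.{SparkConf, SparkContext}

    object ExecutorHeapProbe {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("executor-heap-probe").setMaster("local[2]"))

        // Each task reports the max heap of the JVM it runs in; both tasks
        // see the same value because they share one executor JVM.
        val heapMb = sc.parallelize(1 to 2, 2)
          .map(_ => Runtime.getRuntime.maxMemory() / (1024L * 1024L))
          .collect()

        println(heapMb.mkString(", "))
        sc.stop()
      }
    }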
...additional RAM, e.g. for new Object instances, as there is available in the
Executor aka JVM Heap

*From:* canan chen [mailto:ccn...@gmail.com]
*Sent:* Tuesday, May 26, 2015 9:30 AM
*To:* Evo Eftimov
*Cc:* user@spark.apache.org
*Subject:* Re: How does spark manage the memory of executor with multiple tasks

Yes, I know that one task represents a JVM thread. This is what confused me.
Usually users want to specify the memory at the task level, so how can I do
that if a task is...
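For what it's worth, a minimal configuration sketch (my own illustration, with
made-up values): Spark has no per-task memory setting. Since tasks are threads
inside the executor JVM, you size the executor heap and the number of
concurrent tasks instead, and the per-task share is only an implicit average.

    import org.apache.spark.SparkConf

    object PerTaskMemorySketch {
      def main(args: Array[String]): Unit = {
        // Illustrative values only; both keys are standard Spark settings.
        val conf = new SparkConf()
          .setAppName("per-task-memory-sketch")
          .set("spark.executor.memory", "1g") // heap shared by all task threads in the executor
          .set("spark.executor.cores", "4")   // at most 4 tasks run concurrently per executor

        // Roughly 1g / 4 cores is available per concurrent task on average,
        // but nothing enforces a hard per-task limit inside the shared heap.
        println(conf.toDebugString)
      }
    }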
...provided clear dictionary/explanations linking these "inventions" with
standard concepts familiar to every Java, Scala etc developer

*From:* canan chen [mailto:ccn...@gmail.com]
*Sent:* Tuesday, May 26, 2015 9:02 AM
*To:* user@spark.apache.org
*Subject:* How does spark manage the memory of executor with multiple tasks

Since Spark can run multiple tasks in one executor, I am curious to know how
Spark manages memory across these tasks. Say one executor takes 1GB of memory;
if this executor can run 10 tasks simultaneously, then each task can consume
100MB on average. Do I understand it correctly? It do...
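The back-of-the-envelope arithmetic in the question, written out as a sketch
of the averaging assumption only (in practice the tasks compete for the shared
heap rather than receiving fixed slices); the numbers come from the question,
not from a real cluster.

    object AverageShareSketch {
      def main(args: Array[String]): Unit = {
        val executorMemoryMb = 1024 // one executor sized at 1GB
        val concurrentTasks  = 10   // 10 tasks running in it at once
        val avgPerTaskMb     = executorMemoryMb / concurrentTasks
        println(s"Average share per task: about $avgPerTaskMb MB")
      }
    }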