> I am using Spark Streaming 2.1.0 and can confirm
> spark.dynamicAllocation.enabled is enough.
>
> Best Regards
> Richard
>
> From: Sourav Mazumder <sourav.mazumde...@gmail.com>
> Date: Sunday, December 3, 2017 at 12:31 PM
> To: user <user@spark.apache.org>
> Subject: Dynamic Resource allocation in Spark Streaming
Hi,
I see the following JIRA is resolved in Spark 2.0:
https://issues.apache.org/jira/browse/SPARK-12133, which is supposed to
support Dynamic Resource Allocation in Spark Streaming.
I also see the JIRA https://issues.apache.org/jira/browse/SPARK-22008, which
is about fixing the number of executors ...
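For reference, SPARK-12133 added a streaming-specific allocation manager
with its own settings, separate from the core spark.dynamicAllocation.*
ones (which Richard's reply above found sufficient on 2.1.0). Below is a
minimal, untested sketch of enabling it; the
spark.streaming.dynamicAllocation.* property names come from the
SPARK-12133 change and are not in the main configuration docs, so verify
them against your Spark version (the class name and jar are placeholders):

  spark-submit \
    --class com.example.StreamingApp \
    --conf spark.streaming.dynamicAllocation.enabled=true \
    --conf spark.streaming.dynamicAllocation.minExecutors=2 \
    --conf spark.streaming.dynamicAllocation.maxExecutors=10 \
    streaming-app.jar

As far as I recall, this manager refuses to start when the core
spark.dynamicAllocation.enabled flag is also true, so the two mechanisms
should not be combined.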
From: Matthias Niehoff <matthias.nieh...@codecentric.de>
Sent: 10/26/2015 4:00 PM
To: user@spark.apache.org
Subject: Dynamic Resource Allocation with Spark Streaming (Standalone Cluster, Spark 1.5.1)
Hello everybody,
I have a few (~15) Spark Streaming jobs which have load peaks as well as
long times with a low load. So I thought the new Dynamic Resource
Allocation for Standalone Clusters might be helpful (SPARK-4751).
I have a test "cluster" with 1 worker consisting of 4 executors with 2
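For Spark 1.5.x on standalone, the SPARK-4751 route needs the external
shuffle service running on each worker in addition to the allocation
flags. An untested sketch, with the master URL and jar as placeholders
(spark.shuffle.service.enabled must also be set on the workers themselves
before they are started):

  spark-submit \
    --master spark://master:7077 \
    --conf spark.dynamicAllocation.enabled=true \
    --conf spark.shuffle.service.enabled=true \
    --conf spark.dynamicAllocation.minExecutors=1 \
    --conf spark.dynamicAllocation.maxExecutors=4 \
    --conf spark.dynamicAllocation.executorIdleTimeout=60s \
    streaming-app.jar

As the reply below points out, in 1.5.1 this mechanism did not yet cover
streaming applications.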
> Dynamic resource allocation
> is not yet supported in streaming apps.
>
> Thanks,
> Silvio
>
> Sent from my Lumia 930
> --
> From: Matthias Niehoff <matthias.nieh...@codecentric.de>
> Sent: 10/26/2015 4:00 PM
> To: user@spark.apache.org
Well, in Spark you can get the information you need from the driver UI
running on port 4040: click on the active job, then on its stages, and
inside a stage you will find the tasks and the address of the machine on
which each task is being executed. You can also check the CPU load on that
machine.
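The same information is exposed by the REST API behind that UI, which is
easier to script against; an untested sketch where <driver-host> and
<app-id> are placeholders (the /api/v1 endpoints exist from Spark 1.4 on):

  curl http://<driver-host>:4040/api/v1/applications
  curl http://<driver-host>:4040/api/v1/applications/<app-id>/stages
  curl http://<driver-host>:4040/api/v1/applications/<app-id>/executors

The executors endpoint includes per-executor task counts and, depending on
the version, memory and duration totals, which covers most of what the
stage pages show.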
I am not very clear about resource allocation (CPU/core/thread-level
allocation) in relation to parallelism when setting the number of cores in
Spark standalone mode.
Any guidelines for that?
--
Thanks & Regards,
Anshu Shukla
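In case it helps as a starting point: on a standalone cluster the usual
knobs are the total core cap and the per-executor core count, and each
core runs one task at a time by default (spark.task.cpus=1), so the total
core count bounds how many tasks run concurrently. An untested sketch with
placeholder values:

  # --total-executor-cores caps the cores used across the whole app (spark.cores.max);
  # --executor-cores sets the cores given to each executor (spark.executor.cores).
  spark-submit \
    --master spark://master:7077 \
    --total-executor-cores 8 \
    --executor-cores 2 \
    your-app.jar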