I didn't notice your message and asked the same question in the thread titled
"Spark job resource allocation best practices".

Adding a specific case to your example:
1 - There are 12 cores available in the cluster
2 - I start app B with all cores - gets 12
3 - I start app A - it needs just 2 cores (which, as you said, is all it
would get even if all 12 were available), but it gets nothing
4 - Until I stop app B, app A is stuck waiting, instead of app B releasing 2
cores and dropping to 10.
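As a side note, one way to avoid app B grabbing every core on standalone is to
cap its cores up front with spark.cores.max, since the standalone master will
not rebalance them afterwards. A minimal sketch, where the app name, master URL
and the 10-core cap are just illustrative assumptions:

import org.apache.spark.{SparkConf, SparkContext}

// Cap app B at 10 cores so 2 stay free for app A.
// "app-b", the master URL and the cap value are illustrative only.
val conf = new SparkConf()
  .setAppName("app-b")
  .setMaster("spark://master-host:7077")
  .set("spark.cores.max", "10") // hard cap; standalone won't grow or shrink it later

val sc = new SparkContext(conf)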

*Romi Kuntsman*, *Big Data Engineer*
 http://www.totango.com

On Mon, Nov 3, 2014 at 3:17 PM, RodrigoB <rodrigo.boav...@aspect.com> wrote:

> Hi all,
>
> I can't seem to find a clear answer on the documentation.
>
> Does the standalone cluster support dynamic assignment of the number of
> allocated cores to an application once another app stops?
> I'm aware that we can have core sharing between active applications if we
> use Mesos, depending on the number of parallel tasks, but I believe my
> question is slightly simpler.
>
> For example:
> 1 - There are 12 cores available in the cluster
> 2 - I start app A with 2 cores - gets 2
> 3 - I start app B - gets remaining 10
> 4 - If I stop app A, app B *does not* get the 2 cores that become
> available.
>
> Should I expect Mesos to have this scenario working?
>
> Also, the same question applies when we add more cores to a cluster.
> Let's say I ideally want 12 cores for my app, although only 10 are
> available. As I add more workers, their cores should get assigned to my app
> dynamically. I haven't tested this in a while, but I think the app will not
> even start and will complain about not having enough resources...
>
> I would very much appreciate any knowledge sharing on this!
>
> Thanks,
> Rod
>
