[ https://issues.apache.org/jira/browse/SPARK-13433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15157802#comment-15157802 ]

lichenglin commented on SPARK-13433:
------------------------------------

I know about the property 'spark.driver.cores'.

What I want to limit is the total core count across all drivers, not the cores of a single driver.
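
For example, the most I can do today is cap each driver individually at submit time, something like this (just a sketch; the master URL, jar path and class name are placeholders):

  spark-submit \
    --master spark://master:7077 \
    --deploy-mode cluster \
    --conf spark.driver.cores=1 \
    --conf spark.driver.memory=512m \
    --class com.example.MyJob \
    /path/to/my-job.jar

That only bounds a single driver; nothing limits the sum over all running drivers.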

So what should I do if the drivers have used up all the cores?

Then there are no cores left for the applications, so the drivers will never finish and free their 
resources. Is that correct?

> The standalone server should limit the count of cores and memory for 
> running Drivers
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-13433
>                 URL: https://issues.apache.org/jira/browse/SPARK-13433
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 1.6.0
>            Reporter: lichenglin
>
> I have a 16-core cluster.
> A running driver uses at least 1 core, maybe more.
> When I submit a lot of jobs to the standalone server in cluster mode,
> all the cores may be used by running drivers,
> and then there are no cores left to run the applications.
> The server is stuck.
> So I think we should limit the resources (cores and memory) for running drivers.
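> For example, with each driver reserving 1 core, submitting 16 jobs in cluster mode means the 16 drivers alone take 16 x 1 = 16 cores, leaving 0 cores for executors, so no job can make progress.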


