meiyoula created SPARK-8366:
-------------------------------
Summary: When a task fails and a new one is appended, the
ExecutorAllocationManager can't sense the new tasks
Key: SPARK-8366
URL: https://issues.apache.org/jira/browse/SPARK-8366
Project: Spark
Issue Type: Bug
Components: Spark Core
Reporter: meiyoula
I use the *dynamic executor allocation* feature. When one executor is killed,
all the tasks running on it fail. When the failed tasks are resubmitted as new
tasks, no new executor is added to run them.
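
A minimal reproduction sketch (not from the original report; the configuration keys are standard Spark dynamic-allocation settings, while the job and the kill step are hypothetical placeholders). The idea is to run a long job with dynamic allocation enabled, kill an executor mid-run, and observe that no replacement executor is requested for the resubmitted tasks:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object DynamicAllocationRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SPARK-8366-repro")
      // Enable dynamic executor allocation (requires the external shuffle service).
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "1")
      .set("spark.dynamicAllocation.maxExecutors", "10")
    val sc = new SparkContext(conf)

    // Run a long job; while it is running, kill one executor process
    // externally (e.g. `kill` on the worker host). The tasks on that
    // executor fail and are resubmitted, but the ExecutorAllocationManager
    // reportedly does not count the resubmitted tasks as pending, so no
    // replacement executor is requested.
    sc.parallelize(1 to 1000, 100)
      .map { i => Thread.sleep(1000); i }
      .count()

    sc.stop()
  }
}
{code}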