[ https://issues.apache.org/jira/browse/SPARK-13902?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15257721#comment-15257721 ]

Takuya Ueshin commented on SPARK-13902:
---------------------------------------

I found that the example above does not reproduce the issue.

We need some more conditions to reproduce it:

- We need at least one more stage before RDD A for the shuffle dependency to 
be duplicated.
- We need to reach the duplicated shuffle dependency earlier than its 
descendants, like \[B\]\-\[C\]\-\[D\] or \[C\]\-\[D\]\-\[E\] in my example, so 
that the shuffle ids are collected in the wrong order.
- We need at least one more stage after RDD C as a result stage, because the 
method {{getAncestorShuffleDependencies}} is used only while finding ancestors 
of a {{ShuffleMapStage}}, not while finding ancestors of a {{ResultStage}}.

So a simpler example that reproduces the issue is as follows:

{noformat}
[A] <--(s_A)-- [B] <--(s_B)-- [C] <--(s_C)-- [D] <--(s_D)-- [E]
                  \                         /
                   <-----------------------
{noformat}

Here, each of RDDs B to E has a shuffle dependency on the previous RDD, and 
RDD D also has a one-to-one dependency on RDD B, i.e. RDD D has a shuffle 
dependency on RDD C and a one-to-one dependency on RDD B.
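
For reference, here is a rough sketch of this lineage using the public RDD 
API. This is only a sketch: {{zipPartitions}} is used to approximate the 
one-to-one dependency on RDD B (it adds one-to-one dependencies on both of its 
parents), the names {{rddA}} to {{rddE}} and the local master setting are 
mine, and I have not verified that this exact snippet triggers the 
duplication:

{noformat}
// Sketch only: approximates the lineage above with the public RDD API.
import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setMaster("local[2]").setAppName("SPARK-13902-sketch"))
val part = new HashPartitioner(2)

val rddA = sc.parallelize(1 to 10).map(i => (i, i))
val rddB = rddA.partitionBy(part)                                   // s_A
val rddC = rddB.map { case (k, v) => (k, v + 1) }.partitionBy(part) // s_B
val shuffledC =
  rddC.map { case (k, v) => (k, v + 1) }.partitionBy(part)          // s_C
// zipPartitions adds one-to-one dependencies on both parents, so the stage
// containing rddD depends on shuffle s_C and, through rddB, on shuffle s_A.
val rddD = shuffledC.zipPartitions(rddB) { (c, b) => c ++ b }
val rddE = rddD.partitionBy(part)                                   // s_D
rddE.count() // the result stage required by the third condition above
{noformat}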

But the behavior of {{getAncestorShuffleDependencies}} is slightly different 
from what [~kayousterhout] described.

It returns the shuffle ids in the order s_B, s_A, s_C, but while creating the 
stages for these ids, the stage for s_A is duplicated at 
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala#L289.

At first it creates a stage for s_B, but the parent stage for s_A is also 
created inside the {{newOrUsedShuffleStage}} method, and then another stage 
for s_A is created again here.
As a result, the stage for s_B refers to the _old_ stage for s_A as its 
parent, while {{DAGScheduler}} manages the _new_ stage for s_A.
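
For anyone following along, here is a simplified sketch of the relevant logic 
(condensed from {{getShuffleMapStage}} in {{DAGScheduler}} at the linked 
revision; not a verbatim copy):

{noformat}
// Simplified sketch of DAGScheduler.getShuffleMapStage (not verbatim).
shuffleToMapStage.get(shuffleDep.shuffleId) match {
  case Some(stage) => stage
  case None =>
    // The ancestors can arrive in non-topological order, e.g. s_B before s_A.
    getAncestorShuffleDependencies(shuffleDep.rdd).foreach { dep =>
      // Creating the stage for s_B here recursively creates and registers a
      // stage for its parent s_A inside newOrUsedShuffleStage. When the loop
      // then reaches s_A itself, this assignment creates a second stage for
      // s_A and overwrites the registered one, while the stage for s_B still
      // points at the old one.
      shuffleToMapStage(dep.shuffleId) = newOrUsedShuffleStage(dep, firstJobId)
    }
    val stage = newOrUsedShuffleStage(shuffleDep, firstJobId)
    shuffleToMapStage(shuffleDep.shuffleId) = stage
    stage
}
{noformat}

If {{getAncestorShuffleDependencies}} returned the ids in topological order 
(s_A before s_B), the stage for s_A would already be registered when 
{{newOrUsedShuffleStage}} builds the stage for s_B, so no duplicate would be 
created.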

> Make DAGScheduler.getAncestorShuffleDependencies() return in topological 
> order to ensure building ancestor stages first.
> ------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-13902
>                 URL: https://issues.apache.org/jira/browse/SPARK-13902
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler
>            Reporter: Takuya Ueshin
>
> {{DAGScheduler}} sometimes generates an incorrect stage graph.
> Some stages are generated for the same shuffleId twice or more, and the 
> child stages keep referring to the old ones, because the graph is not built 
> in the correct order.
> Here, we submit an RDD \[F\] having a lineage of RDDs as follows (please 
> view this in a {{monospaced}} font):
> {noformat}
>                               <--------------------
>                             /                       \
> [A] <--(1)-- [B] <--(2)-- [C] <--(3)-- [D] <--(4)-- [E] <--(5)-- [F]
>                \                       /
>                  <--------------------
> {noformat}
> Note: \[\] means an RDD, () means a shuffle dependency.
> {{DAGScheduler}} generates the following stages and their parents for each 
> shuffle:
> | shuffle | stage | parents |
> | (1) | ShuffleMapStage 2 | List() |
> | (2) | ShuffleMapStage 1 | List(ShuffleMapStage 0) |
> | (3) | ShuffleMapStage 3 | List(ShuffleMapStage 1) |
> | (4) | ShuffleMapStage 4 | List(ShuffleMapStage 2, ShuffleMapStage 3) |
> | (5) | ShuffleMapStage 5 | List(ShuffleMapStage 1, ShuffleMapStage 4) |
> | \- | ResultStage 6 | List(ShuffleMapStage 5) |
> The stage for shuffle id {{0}} should be {{ShuffleMapStage 0}}, but the stage 
> for shuffle id {{0}} is generated twice, as {{ShuffleMapStage 0}} and then 
> {{ShuffleMapStage 2}}; {{ShuffleMapStage 0}} is overwritten by 
> {{ShuffleMapStage 2}}, and the stage {{ShuffleMapStage 1}} keeps referring to 
> the _old_ stage {{ShuffleMapStage 0}}.


