[ https://issues.apache.org/jira/browse/SPARK-25982?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16782527#comment-16782527 ]

Ramandeep Singh commented on SPARK-25982:
-----------------------------------------

No, as I said, those operations within a stage are independent, and I 
explicitly await their completion before launching the next stage. The problem 
is that operations from the next stage start running before all futures have 
completed. 
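For reference, here is a minimal sketch of the pattern I am describing, assuming 
FAIR scheduling is enabled via spark.scheduler.mode; the DataFrames, output 
paths, and the StagedWrites object name are illustrative only, not the actual 
application code:

import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

import org.apache.spark.sql.{DataFrame, SparkSession}

object StagedWrites {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("staged-writes")
      .config("spark.scheduler.mode", "FAIR")
      .getOrCreate()

    import spark.implicits._

    // Hypothetical independent outputs belonging to "stage one".
    val stageOneWrites: Seq[(DataFrame, String)] = Seq(
      (Seq(1, 2, 3).toDF("a"), "/tmp/out/a"),
      (Seq(4, 5, 6).toDF("b"), "/tmp/out/b")
    )

    // Launch each write in its own future; each write() call is expected to
    // block its thread until the corresponding Spark job has finished.
    val futures = stageOneWrites.map { case (df, path) =>
      Future {
        df.write.mode("overwrite").parquet(path)
      }
    }

    // Explicitly await all stage-one futures before starting the next stage.
    Await.result(Future.sequence(futures), Duration.Inf)

    // Next stage: should only run after the writes above have completed,
    // but under FAIR scheduling its jobs are observed to start earlier.
    val nextStage = spark.read.parquet("/tmp/out/a")
    nextStage.count()

    spark.stop()
  }
}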

> Dataframe write is non blocking in fair scheduling mode
> -------------------------------------------------------
>
>                 Key: SPARK-25982
>                 URL: https://issues.apache.org/jira/browse/SPARK-25982
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.1
>            Reporter: Ramandeep Singh
>            Priority: Major
>
> Hi,
> I have noticed that the expected blocking behavior of the DataFrame write 
> operation does not hold in fair scheduling mode.
> Ideally, when a DataFrame write is occurring and a future is blocking on 
> AwaitResult, no other job should be started, but this is not the case. I have 
> noticed that other jobs are started while the partitions are being written.  
>  
> Regards,
> Ramandeep Singh
>  
>  



