[ https://issues.apache.org/jira/browse/SPARK-36560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-36560:
---------------------------------
    Priority: Minor  (was: Major)

> Deflake PySpark coverage report
> -------------------------------
>
>                 Key: SPARK-36560
>                 URL: https://issues.apache.org/jira/browse/SPARK-36560
>             Project: Spark
>          Issue Type: Improvement
>          Components: Project Infra, PySpark
>    Affects Versions: 3.2.0
>            Reporter: Hyukjin Kwon
>            Priority: Minor
>
> https://github.com/apache/spark/runs/3388727798?check_suite_focus=true
> https://github.com/apache/spark/runs/3392972609?check_suite_focus=true
> https://github.com/apache/spark/runs/3359880048?check_suite_focus=true
> https://github.com/apache/spark/runs/3338876122?check_suite_focus=true
> PySpark scheduled coverage jobs are flaky. We should deflake them.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)