[ https://issues.apache.org/jira/browse/SPARK-38492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Haejoon Lee updated SPARK-38492:
--------------------------------
    Description: 
Currently, PySpark test coverage is around 91% according to the codecov report: [https://app.codecov.io/gh/apache/spark].
Since about 9% of the code still lacks tests, I think it would be great to improve our test coverage.
Of course we might not target 100%, but we should cover as much as possible, up to the level that we can currently cover with CI.

  was:
Currently, PySpark test coverage is around 91% according to the codecov report ([https://app.codecov.io/gh/apache/spark]).
Since about 9% of the code still lacks tests, I think it would be great to improve our test coverage.
Of course we might not target 100%, but we should cover as much as possible, up to the level that we can currently cover with CI.


> Improve the test coverage for PySpark
> -------------------------------------
>
>                 Key: SPARK-38492
>                 URL: https://issues.apache.org/jira/browse/SPARK-38492
>             Project: Spark
>          Issue Type: Umbrella
>          Components: PySpark, Tests
>    Affects Versions: 3.3.0
>            Reporter: Haejoon Lee
>            Priority: Major
>
> Currently, PySpark test coverage is around 91% according to the codecov report:
> [https://app.codecov.io/gh/apache/spark]
> Since about 9% of the code still lacks tests, I think it would be great to improve our test coverage.
> Of course we might not target 100%, but we should cover as much as possible, up to the level that we can currently cover with CI.


--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
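For contributors picking up sub-tasks under this umbrella, below is a minimal sketch of the kind of self-contained unit test that could cover a currently untested code path. It follows the plain unittest style used across PySpark's test suite; the test class name, the app name, and the specific API exercised (pyspark.sql.functions.upper) are chosen only for illustration and are not taken from this ticket.

{code:python}
import unittest

from pyspark.sql import SparkSession
from pyspark.sql import functions as F


class UpperFunctionCoverageTest(unittest.TestCase):
    """Minimal sketch of a self-contained PySpark unit test (illustrative only)."""

    @classmethod
    def setUpClass(cls):
        # A small local session is enough for a coverage-oriented unit test.
        cls.spark = (
            SparkSession.builder
            .master("local[1]")
            .appName("coverage-sketch")
            .getOrCreate()
        )

    @classmethod
    def tearDownClass(cls):
        cls.spark.stop()

    def test_upper(self):
        # Exercise pyspark.sql.functions.upper on a two-row DataFrame
        # and assert on the collected result.
        df = self.spark.createDataFrame([("a",), ("b",)], ["value"])
        result = [row.value for row in df.select(F.upper("value").alias("value")).collect()]
        self.assertEqual(result, ["A", "B"])


if __name__ == "__main__":
    unittest.main()
{code}

Tests in this style run under the existing CI setup, so any new coverage they add is reflected in the codecov report linked above.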