[ https://issues.apache.org/jira/browse/SPARK-20689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Felix Cheung updated SPARK-20689:
---------------------------------
    Description:

When trying to address a build test failure in SPARK-20661, we discovered that some tables were unexpectedly being left behind, causing R tests to fail. While we changed the R tests to be more resilient, we investigated further to see what was creating those tables. It turns out a pyspark doctest is calling saveAsTable without ever dropping the tables it creates. Since we have separate python tests for bucketed tables, and the doctest does not verify results, there is really no need to run it.

  was:

When trying to address build test failure in SPARK-20661 we discovered some tables are unexpectedly left behind causing R tests to fail. While we changed the R tests to be more resilient, we investigated further to see what was creating those tables. It turns out pyspark doctest is calling saveAsTable without ever dropping them. Since we have separately python tests for bucketed table, and there is not testing of results, there is really no need to run the doctest

> python doctest leaking bucketed table
> -------------------------------------
>
>                 Key: SPARK-20689
>                 URL: https://issues.apache.org/jira/browse/SPARK-20689
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 2.3.0
>            Reporter: Felix Cheung
>            Assignee: Felix Cheung
>
> When trying to address build test failure in SPARK-20661 we discovered some
> tables are unexpectedly left behind causing R tests to fail. While we changed
> the R tests to be more resilient, we investigated further to see what was
> creating those tables.
> It turns out pyspark doctest is calling saveAsTable without ever dropping
> them.
> Since we have separate python tests for bucketed table, and there is
> not testing of results, there is really no need to run the doctest

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
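An alternative to removing the doctest would be to pair its saveAsTable call with a matching drop so the bucketed table cannot leak into later test suites. A minimal sketch of such a cleanup wrapper; the `temp_table` helper and the names in the usage comment are hypothetical, not Spark APIs, while `DataFrameWriter.bucketBy`/`sortBy`/`saveAsTable` and `SparkSession.sql` are real PySpark methods:

```python
from contextlib import contextmanager

@contextmanager
def temp_table(spark, name):
    """Yield a table name and drop that table on exit.

    Hypothetical helper (not part of Spark): wrapping a doctest's
    saveAsTable call in this block guarantees cleanup even if the
    doctest fails partway through.
    """
    try:
        yield name
    finally:
        # DROP TABLE IF EXISTS is safe even when saveAsTable never ran.
        spark.sql("DROP TABLE IF EXISTS " + name)

# Intended doctest usage (assumes a running SparkSession `spark` and a
# DataFrame `df` with a column "x"; shown as a comment only):
#
#     with temp_table(spark, "bucketed_table") as t:
#         df.write.bucketBy(2, "x").sortBy("x").saveAsTable(t)
```

The try/finally inside the context manager is what the original doctest lacked: the table is dropped whether or not the body succeeds.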