Github user tone-zhang commented on the issue:
https://github.com/apache/spark/pull/14894
Updated the PR to clean up "spark-warehouse" in DDLSuite.scala.
@srowen I have run the whole Spark UT suite with this PR, and I cannot see
"spark/spark-warehouse" after the tests.
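The cleanup discussed above amounts to recursively deleting any stray "spark-warehouse" directory once the tests are done. The following is a minimal standalone sketch of that pattern; `deleteRecursively` here is an illustrative stand-in for Spark's `Utils.deleteRecursively`, not the real implementation, and the file names are made up.

```scala
import java.io.File
import java.nio.file.Files

// Sketch of the cleanup pattern from the thread: after a test run, remove
// any stray "spark-warehouse" directory left in the working directory.
// deleteRecursively is a stand-in for Spark's Utils.deleteRecursively.
object WarehouseCleanup {
  def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) {
      Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    }
    f.delete()
  }

  def main(args: Array[String]): Unit = {
    val warehouse = new File("spark-warehouse")
    // Simulate a test leaving warehouse data behind (hypothetical names).
    new File(warehouse, "db1.db").mkdirs()
    Files.createFile(new File(warehouse, "db1.db/part-0000").toPath)

    try {
      println(warehouse.exists()) // true while the "test" runs
    } finally {
      // Teardown: make sure the directory does not survive the suite.
      deleteRecursively(warehouse)
    }
    println(warehouse.exists())   // false after cleanup
  }
}
```

Running the teardown in a `finally` block (or a ScalaTest `afterEach`) keeps the working directory clean even when a test fails.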
Github user tone-zhang commented on the issue:
https://github.com/apache/spark/pull/14894
@srowen I checked DDLSuite.scala; the case "Create Database using Default
Warehouse Path" tests the default "spark-warehouse" setting. In the
doc, the de
Github user tone-zhang commented on the issue:
https://github.com/apache/spark/pull/14894
In the Spark UT suite, temporary data files should be stored in spark/target/tmp,
not in spark/spark-warehouse. The suite cleans up the path
spark/target/tmp when the tests finish. It is the
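The convention described above, with temporary test data rooted under target/tmp and the whole root removed at the end of the run, can be sketched as follows. The helper names are illustrative, not Spark's actual test utilities.

```scala
import java.io.File
import java.nio.file.Files

// Sketch of the test-suite convention from the thread: temporary data goes
// under target/tmp (never the working directory), and the root is deleted
// once the tests finish. createTempDir/deleteRecursively are illustrative
// stand-ins, not Spark's actual Utils API.
object TempDirConvention {
  val tmpRoot = new File("target/tmp")

  def createTempDir(prefix: String): File = {
    tmpRoot.mkdirs()
    Files.createTempDirectory(tmpRoot.toPath, prefix).toFile
  }

  def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) {
      Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    }
    f.delete()
  }

  def main(args: Array[String]): Unit = {
    val dir = createTempDir("ddl-suite-")
    println(dir.exists())      // true while the tests run
    deleteRecursively(tmpRoot) // end-of-suite cleanup
    println(tmpRoot.exists())  // false afterwards
  }
}
```

Keeping all scratch data under one well-known root makes the end-of-suite cleanup a single recursive delete.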
Github user tone-zhang commented on a diff in the pull request:
https://github.com/apache/spark/pull/14894#discussion_r77448364
--- Diff:
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala ---
@@ -109,6 +109,8 @@ class HiveSparkSubmitSuite
Github user tone-zhang commented on the issue:
https://github.com/apache/spark/pull/14894
@srowen Thanks a lot for your help.
I have updated the code to use Utils.deleteRecursively; the Utils
method is very helpful. Thanks!
For the time being, I found the UT case is the
Github user tone-zhang commented on the issue:
https://github.com/apache/spark/pull/14894
@srowen Thanks for the comments.
I mentioned SPARK-8368 here just because the UT case is named "SPARK-8368:
includes jars passed in through --jars".
For the Utils method you mentio
GitHub user tone-zhang opened a pull request:
https://github.com/apache/spark/pull/14894
[SPARK-17330] [SPARK UT] Fix the failing Spark UT case (SPARK-8368)
## What changes were proposed in this pull request?
Check the database warehouse used in Spark UT, and remove the