[GitHub] spark issue #14894: [SPARK-17330] [SPARK UT] Clean up spark-warehouse in UT

2016-09-08 Thread tone-zhang
Github user tone-zhang commented on the issue: https://github.com/apache/spark/pull/14894 Updated the PR to clean up "spark-warehouse" in DDLSuite.scala. @srowen I have run the whole Spark UT suite with this PR, and I cannot see "spark/spark-warehouse" after the tests.
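A rough sketch of the cleanup pattern being discussed, assuming the existing org.apache.spark.util.Utils.deleteRecursively helper; the wrapper, its package, and its name are hypothetical and not the actual DDLSuite change:

```scala
// Sketch only, not the actual DDLSuite change. Utils is private[spark], so,
// like the real test suites, this has to live under the org.apache.spark
// package; the package and object names here are hypothetical.
package org.apache.spark.examplecleanup

import java.io.File

import org.apache.spark.util.Utils

object WarehouseCleanup {
  // Run a test body and always remove the default warehouse directory
  // ("spark-warehouse" under the working dir) afterwards.
  def withWarehouseCleanup(testBody: => Unit): Unit = {
    try {
      testBody
    } finally {
      val warehouseDir = new File("spark-warehouse")
      if (warehouseDir.exists()) {
        Utils.deleteRecursively(warehouseDir)
      }
    }
  }
}
```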

[GitHub] spark issue #14894: [SPARK-17330] [SPARK UT] Clean up spark-warehouse in UT

2016-09-07 Thread tone-zhang
Github user tone-zhang commented on the issue: https://github.com/apache/spark/pull/14894 @srowen I checked DDLSuite.scala; the case "Create Database using Default Warehouse Path" is used to test the default "spark-warehouse" setting. In the doc, the de
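For context, a minimal sketch of where that default comes from: in Spark 2.x the spark.sql.warehouse.dir setting defaults to a "spark-warehouse" directory under the working directory, which is why stray spark/spark-warehouse directories appear after tests. The object name below is illustrative:

```scala
// Sketch: print the default warehouse location that the DDLSuite case
// exercises. The exact default value depends on the Spark version; this
// only reads the config, it does not reproduce the test itself.
import org.apache.spark.sql.SparkSession

object DefaultWarehousePath {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("default-warehouse-path")
      .getOrCreate()
    try {
      // Typically prints something like file:/<working-dir>/spark-warehouse
      println(spark.conf.get("spark.sql.warehouse.dir"))
    } finally {
      spark.stop()
    }
  }
}
```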

[GitHub] spark issue #14894: [SPARK-17330] [SPARK UT] Clean up spark-warehouse in UT

2016-09-06 Thread tone-zhang
Github user tone-zhang commented on the issue: https://github.com/apache/spark/pull/14894 In Spark UT, temporary data files should be stored in spark/target/tmp, not in spark/spark-warehouse. The Spark UT suite cleans up the path spark/target/tmp when the tests finish. It is the
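A minimal sketch of that convention, assuming Spark's Utils.createTempDir helper (which roots directories at java.io.tmpdir, pointed at target/tmp by the Spark test build); the package, object, and file names are illustrative:

```scala
// Sketch of the layout described above: scratch data goes into a directory
// created by Utils.createTempDir, never into spark-warehouse. Utils is
// private[spark], so the package is chosen to sit under org.apache.spark.
package org.apache.spark.examplecleanup

import java.io.File

import org.apache.spark.util.Utils

object TempDataLayout {
  def main(args: Array[String]): Unit = {
    val scratchDir: File = Utils.createTempDir(namePrefix = "ut-scratch")
    try {
      val dataFile = new File(scratchDir, "data.txt")
      // ... write test fixtures into dataFile ...
      println(s"temporary test data lives under ${dataFile.getAbsolutePath}")
    } finally {
      // createTempDir registers a shutdown-hook delete, but cleaning up
      // eagerly keeps the temp area tidy between tests.
      Utils.deleteRecursively(scratchDir)
    }
  }
}
```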

[GitHub] spark pull request #14894: [SPARK-17330] [SPARK UT] Clean up spark-warehouse...

2016-09-04 Thread tone-zhang
Github user tone-zhang commented on a diff in the pull request: https://github.com/apache/spark/pull/14894#discussion_r77448364 --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala --- @@ -109,6 +109,8 @@ class HiveSparkSubmitSuite

[GitHub] spark issue #14894: [SPARK-17330] [SPARK UT] Clean up spark-warehouse in UT

2016-09-01 Thread tone-zhang
Github user tone-zhang commented on the issue: https://github.com/apache/spark/pull/14894 @srowen Thanks a lot for your help. I have updated the code to use Utils.deleteRecursively; the Utils method is very helpful. Thanks! For the time being, I found the UT case is the
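One way such cleanup could be wired into a suite's teardown; a sketch only, using ScalaTest's BeforeAndAfterAll together with the Utils.deleteRecursively call mentioned above. The suite and test names are hypothetical, not the ones touched by this PR:

```scala
// Sketch: delete the default warehouse directory in afterAll so that the
// test run leaves no spark-warehouse directory behind. Utils is
// private[spark], hence the package choice.
package org.apache.spark.examplecleanup

import java.io.File

import org.scalatest.{BeforeAndAfterAll, FunSuite}

import org.apache.spark.util.Utils

class WarehouseCleanupSuite extends FunSuite with BeforeAndAfterAll {

  private val warehouseDir = new File("spark-warehouse")

  test("example test that may create the default warehouse directory") {
    // ... run a query that touches the default warehouse location ...
    assert(true)
  }

  override def afterAll(): Unit = {
    try {
      // Delete whatever the tests left behind, if anything.
      if (warehouseDir.exists()) {
        Utils.deleteRecursively(warehouseDir)
      }
    } finally {
      super.afterAll()
    }
  }
}
```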

[GitHub] spark issue #14894: [SPARK-17330] [SPARK UT] Fix the failure Spark UT (SPARK...

2016-08-31 Thread tone-zhang
Github user tone-zhang commented on the issue: https://github.com/apache/spark/pull/14894 @srowen Thanks for the comments. I referenced SPARK-8368 here only because the UT case name is "SPARK-8368: includes jars passed in through --jars". For the Utils method you mentioned

[GitHub] spark pull request #14894: [SPARK-17330] [SPARK UT] Fix the failure Spark UT...

2016-08-31 Thread tone-zhang
GitHub user tone-zhang opened a pull request: https://github.com/apache/spark/pull/14894 [SPARK-17330] [SPARK UT] Fix the failure Spark UT (SPARK-8368) case

## What changes were proposed in this pull request?

Check the database warehouse used in Spark UT, and remove the