Erik Krogen created SPARK-33214:
-----------------------------------

             Summary: HiveExternalCatalogVersionsSuite shouldn't use or delete hard-coded /tmp directory
                 Key: SPARK-33214
                 URL: https://issues.apache.org/jira/browse/SPARK-33214
             Project: Spark
          Issue Type: Bug
          Components: SQL, Tests
    Affects Versions: 3.0.1
            Reporter: Erik Krogen


In SPARK-22356, the {{sparkTestingDir}} used by 
{{HiveExternalCatalogVersionsSuite}} became hard-coded to enable re-use of the 
downloaded Spark tarball between test executions:
{code}
  // For local test, you can set `sparkTestingDir` to a static value like `/tmp/test-spark`, to
  // avoid downloading Spark of different versions in each run.
  private val sparkTestingDir = new File("/tmp/test-spark")
{code}
However, this doesn't actually enable re-use, because the directory is deleted at the end of every run in {{afterAll()}}:
{code}
  override def afterAll(): Unit = {
    try {
      Utils.deleteRecursively(wareHousePath)
      Utils.deleteRecursively(tmpDataDir)
      Utils.deleteRecursively(sparkTestingDir)
    } finally {
      super.afterAll()
    }
  }
{code}

Hard-coding a path under {{/tmp}} is problematic, since on some systems this is not the proper place to store temporary files. Worse, because {{afterAll()}} deletes the directory unconditionally, the hard-coded path provides no caching benefit at all: we get the downsides of a fixed {{/tmp}} path without the intended re-use.
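One possible direction (a sketch only, not a committed design): make the caching opt-in via a system property, and fall back to a fresh per-run temporary directory otherwise. The property name {{spark.test.cache-dir}} below is hypothetical, and {{java.nio.file.Files}} stands in for Spark's {{Utils}} helpers to keep the example self-contained:

```scala
import java.io.File
import java.nio.file.Files

// Hypothetical opt-in cache location, e.g. -Dspark.test.cache-dir=/path/to/cache.
val cacheDirProp: Option[String] = sys.props.get("spark.test.cache-dir")

// Use the user-supplied cache dir if present; otherwise create a fresh
// per-run directory under the JVM's configured temp location (java.io.tmpdir),
// rather than a hard-coded /tmp path.
val sparkTestingDir: File = cacheDirProp
  .map(new File(_))
  .getOrElse(Files.createTempDirectory("test-spark").toFile)

// Only delete the directory in afterAll() when the suite created it itself;
// a user-supplied cache dir survives across runs, which is the whole point.
val deleteAfterRun: Boolean = cacheDirProp.isEmpty
```

With this shape, {{afterAll()}} would guard the {{Utils.deleteRecursively(sparkTestingDir)}} call on {{deleteAfterRun}}, so the downloaded tarballs are kept only when the user explicitly asked for a cache directory.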



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
