[ https://issues.apache.org/jira/browse/SPARK-8333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17056710#comment-17056710 ]
Saeed Hassanvand edited comment on SPARK-8333 at 3/11/20, 7:15 AM:
-------------------------------------------------------------------

Hi,

It seems that this bug still exists! I encountered this issue in JavaSparkContext, not HiveContext. I am using spark-submit to run the simple Spark example from the official quick-start guide ([https://spark.apache.org/docs/latest/quick-start.html]).

$HADOOP_HOME: C:\winutils\bin\winutils.exe
Spark version: spark-2.4.5-bin-hadoop2.7
Windows 10

Thanks.

> Spark failed to delete temp directory created by HiveContext
> ------------------------------------------------------------
>
>                 Key: SPARK-8333
>                 URL: https://issues.apache.org/jira/browse/SPARK-8333
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>        Environment: Windows7 64bit
>           Reporter: sheng
>           Priority: Minor
>             Labels: Hive, bulk-closed, metastore, sparksql
>        Attachments: test.tar
>
>
> Spark 1.4.0 failed to stop SparkContext.
> {code:title=LocalHiveTest.scala|borderStyle=solid}
> val sc = new SparkContext("local", "local-hive-test", new SparkConf())
> val hc = Utils.createHiveContext(sc)
> ...
> // execute some HiveQL statements
> sc.stop()
> {code}
> sc.stop() failed to execute; it threw the following exception:
> {quote}
> 15/06/13 03:19:06 INFO Utils: Shutdown hook called
> 15/06/13 03:19:06 INFO Utils: Deleting directory C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> 15/06/13 03:19:06 ERROR Utils: Exception while deleting Spark temp dir: C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> java.io.IOException: Failed to delete: C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> 	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:963)
> 	at org.apache.spark.util.Utils$$anonfun$1$$anonfun$apply$mcV$sp$5.apply(Utils.scala:204)
> 	at org.apache.spark.util.Utils$$anonfun$1$$anonfun$apply$mcV$sp$5.apply(Utils.scala:201)
> 	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
> 	at org.apache.spark.util.Utils$$anonfun$1.apply$mcV$sp(Utils.scala:201)
> 	at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2292)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
> 	at scala.util.Try$.apply(Try.scala:161)
> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2244)
> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> {quote}
> It seems this bug was introduced by SPARK-6907, which creates a local Hive metastore in a temp directory. The problem is that the local Hive metastore is not shut down correctly: at the end of the application, when SparkContext.stop() is called, Spark tries to delete a temp directory that is still in use by the local Hive metastore, and the deletion fails with an exception.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
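For context, here is a minimal Java sketch (not Spark's actual source) of the bottom-up recursive delete that a shutdown hook like Utils.deleteRecursively performs. On Windows, File.delete() returns false for any file another process still holds open — such as the Derby metastore files left open by the HiveContext — and that is what surfaces as the "Failed to delete" IOException in the log above. When nothing holds the files open, the same walk succeeds:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class DeleteRecursivelyDemo {
    // Sketch of a bottom-up recursive delete: remove children first, then
    // the directory itself. File.delete() returns false (rather than
    // throwing) when the OS refuses the deletion, e.g. because another
    // process still has the file open on Windows; we turn that into an
    // IOException, mirroring the error message seen in the stack trace.
    static void deleteRecursively(File file) throws IOException {
        if (file.isDirectory()) {
            File[] children = file.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
        }
        if (!file.delete() && file.exists()) {
            throw new IOException("Failed to delete: " + file);
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small temp tree resembling a metastore directory,
        // then remove it; this succeeds because no other process holds
        // the files open.
        File dir = Files.createTempDirectory("spark-temp-demo").toFile();
        File sub = new File(dir, "metastore_db");
        sub.mkdir();
        Files.write(new File(sub, "db.lck").toPath(), "lock".getBytes());
        deleteRecursively(dir);
        System.out.println(dir.exists() ? "still exists" : "deleted");
    }
}
```

This also illustrates why the fix belongs in shutdown ordering rather than in the delete itself: the walk is correct, but it can only succeed after the metastore has released its file handles.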