[ https://issues.apache.org/jira/browse/SPARK-12216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16432207#comment-16432207 ]

Kingsley Jones edited comment on SPARK-12216 at 1/17/19 4:56 AM:
-----------------------------------------------------------------

 
{code:java}
scala> val loader = Thread.currentThread.getContextClassLoader()
loader: ClassLoader = scala.tools.nsc.interpreter.IMain$TranslatingClassLoader@3a1a20f

scala> val parent1 = loader.getParent()
parent1: ClassLoader = scala.reflect.internal.util.ScalaClassLoader$URLClassLoader@66e6af49

scala> val parent2 = parent1.getParent()
parent2: ClassLoader = sun.misc.Launcher$AppClassLoader@5fcfe4b2

scala> val parent3 = parent2.getParent()
parent3: ClassLoader = sun.misc.Launcher$ExtClassLoader@5257226b

scala> val parent4 = parent3.getParent()
parent4: ClassLoader = null
{code}
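The step-by-step chain above can also be walked in a loop until the bootstrap loader (represented as null) is reached. A minimal Java sketch, for illustration only (the class name LoaderChain is my own):

```java
// Walk the context ClassLoader parent chain to its root,
// printing each loader, as done one step at a time above.
public class LoaderChain {
    public static void main(String[] args) {
        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        int depth = 0;
        while (loader != null) {
            System.out.println(depth + ": " + loader.getClass().getName());
            loader = loader.getParent();
            depth++;
        }
        // Loop exits when getParent() returns null: the bootstrap
        // loader, matching parent4 = null in the session above.
    }
}
```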
 

I experimented with using reflection to find the open ClassLoaders in the Scala session (shown above).

Typing <period><TAB> shows the methods exposed on each loader, but there is no close method:

 
{code:java}
scala> loader.
clearAssertionStatus   getResource           getResources   setClassAssertionStatus   setPackageAssertionStatus
getParent              getResourceAsStream   loadClass      setDefaultAssertionStatus

scala> parent1.
clearAssertionStatus   getResource           getResources   setClassAssertionStatus   setPackageAssertionStatus
getParent              getResourceAsStream   loadClass      setDefaultAssertionStatus
{code}
 

No close method shows up on any of these, so I could not try closing them before quitting the session.

This was just a quick hack to see whether reflection could be used to find the open ClassLoaders.

Perhaps it would be possible to walk this tree and close the loaders from within ShutdownHookManager?
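For what it may be worth, java.net.URLClassLoader has implemented Closeable since Java 7, even though no close() appears in the completions above (those list only the methods of the static type ClassLoader). A hypothetical sketch of what such a shutdown-time walk might look like, closing any URLClassLoader found in the chain (class and method names are my own, not Spark's):

```java
import java.io.IOException;
import java.net.URLClassLoader;

// Hypothetical cleanup sketch: walk a ClassLoader parent chain and
// close every URLClassLoader, which should release its open JAR file
// handles (the suspected cause of the undeletable temp dir on Windows).
public class CloseLoaders {
    public static void closeChain(ClassLoader loader) {
        while (loader != null) {
            if (loader instanceof URLClassLoader) {
                try {
                    // close() exists on URLClassLoader (Java 7+),
                    // just not on the base ClassLoader type.
                    ((URLClassLoader) loader).close();
                } catch (IOException e) {
                    System.err.println("Could not close " + loader + ": " + e);
                }
            }
            loader = loader.getParent();
        }
    }
}
```

Note that closing a loader that is still in use would break subsequent class loading, so a real shutdown hook would have to run this only after all user code has finished.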



> Spark failed to delete temp directory 
> --------------------------------------
>
>                 Key: SPARK-12216
>                 URL: https://issues.apache.org/jira/browse/SPARK-12216
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>         Environment: windows 7 64 bit
> Spark 1.52
> Java 1.8.0.65
> PATH includes:
> C:\Users\Stefan\spark-1.5.2-bin-hadoop2.6\bin
> C:\ProgramData\Oracle\Java\javapath
> C:\Users\Stefan\scala\bin
> SYSTEM variables set are:
> JAVA_HOME=C:\Program Files\Java\jre1.8.0_65
> HADOOP_HOME=C:\Users\Stefan\hadoop-2.6.0\bin
> (where the bin\winutils resides)
> both \tmp and \tmp\hive have permissions
> drwxrwxrwx as detected by winutils ls
>            Reporter: stefan
>            Priority: Minor
>
> The mailing list archives have no obvious solution to this:
> scala> :q
> Stopping spark context.
> 15/12/08 16:24:22 ERROR ShutdownHookManager: Exception while deleting Spark 
> temp dir: 
> C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
> java.io.IOException: Failed to delete: 
> C:\Users\Stefan\AppData\Local\Temp\spark-18f2a418-e02f-458b-8325-60642868fdff
>         at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:884)
>         at 
> org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:63)
>         at 
> org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:60)
>         at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
>         at 
> org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:60)
>         at 
> org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
>         at 
> org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
>         at scala.util.Try$.apply(Try.scala:161)
>         at 
> org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
>         at 
> org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
>         at 
> org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
