Hi,

I've written a short scala app to perform word counts on a text file and am
getting the following exception as the program completes (after it prints
out all of the word counts).

Exception in thread "delete Spark temp dir C:\Users\Josh\AppData\Local\Temp\spark-0fdd0b79-7329-4690-a093-0fdb0d21e32c" java.io.IOException: Failed to delete: C:\Users\Josh\AppData\Local\Temp\spark-0fdd0b79-7329-4690-a093-0fdb0d21e32c\word-count_2.10-1.0.jar
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:692)
        at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:686)
        at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:685)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:685)
        at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:281)

A few details: Spark version 1.1.0 built from source on Windows 8, Scala
version 2.11.4, JRE version 7 Update 71. Everything runs locally on a single
machine. The code runs fine in the Scala shell; the problem seems to be that
the temporary JAR file cannot be deleted after the application finishes under
spark-submit. I've checked the JAR's location and it is indeed left behind,
so it is not being deleted. My question is: why can't it be deleted, and is
there anything I can do differently to fix this?

I do not see this exception when I run the SimpleApp example taken from
here:
http://spark.apache.org/docs/latest/quick-start.html#standalone-applications

Here is the code for my WordCount app that is producing this exception:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    val filename = "data/rideshare.txt"
    val conf = new SparkConf().setAppName("Word Count").setMaster("local")
    val sc = new SparkContext(conf)
    val file = sc.textFile(filename, 2).cache()
    val wordCounts = file.flatMap(line => line.split(" "))
                         .map(word => (word, 1))
                         .reduceByKey((a, b) => a + b)
    wordCounts.collect().foreach(println)
    sc.stop()  // stop the context before exiting; may help release the temp JAR
  }
}
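
For context on where the exception comes from, Spark's Utils.deleteRecursively
boils down to roughly the following (a simplified sketch, not the actual Spark
source): it deletes children first, then the file or directory itself, and
throws the IOException in the trace above whenever File.delete() returns false.
On Windows, delete() returns false while any process (including the running JVM,
which has the application JAR on its classpath) still holds the file open, which
is consistent with the JAR being the file that survives.

```scala
import java.io.{File, IOException}

// Simplified sketch of a recursive delete in the style of Spark's
// Utils.deleteRecursively: remove children first, then the entry itself,
// and surface a failed File.delete() as an IOException.
object DeleteSketch {
  def deleteRecursively(file: File): Unit = {
    if (file.isDirectory) {
      // listFiles() can return null (e.g. on an I/O error), so guard it
      Option(file.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
    }
    if (!file.delete() && file.exists()) {
      throw new IOException("Failed to delete: " + file.getAbsolutePath)
    }
  }
}
```

On a file that nothing holds open, this succeeds quietly; on a locked file
(the Windows JAR case), delete() returns false and the IOException fires.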

Thanks,
Josh



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Deleting-temp-dir-Exception-tp18006.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org