Hi, I also encountered the same problem when running locally, but when I ran on
the cluster everything was fine. When I then ran locally again without the jars
parameter, the exception disappeared.

Best regards,
Chen jingci
--sent from phone, sorry for the typo

-----Original Message-----
From: "goi cto" <goi....@gmail.com>
Sent: 4/3/2014 17:55
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Problem with "delete spark temp dir" on spark 0.8.1

Exception in thread "delete Spark temp dir C:\Users\..." java.io.IOException:
failed to delete: C:\Users\...\simple-project-1.0.jar
 at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:495)
 at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:491)
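
This pattern is consistent with Windows file locking: the JVM keeps jars on the
classpath open, and on Windows File.delete() fails while any handle to the file
is still open. A minimal standalone sketch of just that behavior (plain
java.io, not Spark code):

    import java.io.{File, FileInputStream}

    object DeleteWhileOpen {
      def main(args: Array[String]): Unit = {
        val f = File.createTempFile("spark-test", ".jar")
        val in = new FileInputStream(f) // hold an open handle, as the JVM does for classpath jars

        // On Windows this prints false while the handle is open; on Linux/macOS
        // the delete succeeds, since unlinking an open file is allowed there.
        println(s"delete with handle open: ${f.delete()}")

        in.close()
        println(s"delete after close: ${f.delete()}")
      }
    }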



I cleaned my temp dir as suggested, and indeed all spark-* directories were
deleted. After that I ran the program again and got the same error; again a
spark-... directory containing "simple-project-1.0.jar" was left on the file
system. I had no problem deleting the file manually once the program had
completed.


Eran



On Tue, Mar 4, 2014 at 11:36 AM, Akhil Das <ak...@mobipulse.in> wrote:

Hi,


Try cleaning your temp dir, i.e. the directory returned by System.getProperty("java.io.tmpdir").
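
Something like the following should clear out any leftover directories; a
minimal sketch, assuming Spark's temp directories all start with the "spark-"
prefix:

    import java.io.File

    object CleanSparkTemp {
      // Delete a file or directory tree. A directory whose children could not
      // be removed stays non-empty, so its own delete() fails as well.
      def deleteRecursively(f: File): Boolean = {
        if (f.isDirectory) {
          Option(f.listFiles()).getOrElse(Array.empty[File]).foreach(deleteRecursively)
        }
        f.delete()
      }

      def main(args: Array[String]): Unit = {
        val tmp = new File(System.getProperty("java.io.tmpdir"))
        Option(tmp.listFiles()).getOrElse(Array.empty[File])
          .filter(d => d.isDirectory && d.getName.startsWith("spark-"))
          .foreach(d => println(s"${d.getPath} deleted: ${deleteRecursively(d)}"))
      }
    }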


Also, can you paste a longer stack trace?

Thanks
Best Regards



On Tue, Mar 4, 2014 at 2:55 PM, goi cto <goi....@gmail.com> wrote:

Hi,


I am running a Spark Java program on a local machine. When I try to write the
output to a file (RDD.saveAsTextFile) I get this exception:


Exception in thread "delete Spark temp dir ..."


This is running on my local Windows machine.
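
For reference, a minimal sketch of the kind of program that triggers this (the
app name and output path are illustrative; API as of Spark 0.8.x):

    import org.apache.spark.SparkContext

    object SaveExample {
      def main(args: Array[String]): Unit = {
        // Run against a local master; "SaveExample" and "out-dir" are made up.
        val sc = new SparkContext("local", "SaveExample")
        sc.parallelize(1 to 10).map(_.toString).saveAsTextFile("out-dir")
        sc.stop()
        // On JVM exit, Spark's "delete Spark temp dir" shutdown hook runs and,
        // on Windows, can fail if a jar under the temp dir is still held open.
      }
    }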


Any ideas?



-- 

Eran | CTO 

-- 

Thanks
Best Regards

-- 

Eran | CTO 
