Hi, SK 

For 1.0.0, you have to delete the existing output directory manually.

In 1.0.1, there will be a parameter to enable overwriting:

https://github.com/apache/spark/pull/947/files
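
As a workaround on 1.0.0, you can clear the directory through the Hadoop FileSystem API before writing. This is only a rough sketch; the output path and app name below are placeholders, not anything from your job:

import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.{SparkConf, SparkContext}

object OverwriteOutput {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("OverwriteOutput"))
    val outputDir = "hdfs:///tmp/overwrite-demo"  // placeholder path

    // Recursively remove whatever a previous run left behind,
    // so saveAsTextFile does not fail on an existing directory.
    val out = new Path(outputDir)
    val fs = out.getFileSystem(sc.hadoopConfiguration)
    if (fs.exists(out)) {
      fs.delete(out, true)
    }

    sc.parallelize(1 to 100).saveAsTextFile(outputDir)
    sc.stop()
  }
}

Once 1.0.1 is out, the configuration flag added by the PR above should make this manual step unnecessary; the exact property name and its default are in the linked diff.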

Best, 

-- 
Nan Zhu


On Thursday, June 12, 2014 at 1:57 PM, SK wrote:

> Hi,
> 
> When we have multiple runs of a program writing to the same output file, the
> execution fails if the output directory already exists from a previous run.
> Is there some way we can have it overwrite the existing directory, so that
> we don't have to manually delete it after each run?
> 
> Thanks for your help.
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/overwriting-output-directory-tp7498.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> 

