> I think Rostyslav is using a DFS which logs at warn/error if you try to
delete a directory that isn't there, so is seeing warning messages that
nobody else does
Yep, you are correct.
> Rostyslav, like I said, I'd be curious as to which DFS/object store you
are working with, as it is behaving slightly differently from
Unfortunately, I am
On 16 Jan 2017, at 12:51, Rostyslav Sotnychenko wrote:
Thanks all!
I was using another DFS instead of HDFS, which was logging an error when
fs.delete got called on a non-existent path.
Really? Whose DFS, if you don't mind me asking?
In Spark 2.0.1, which I was using previously, everything was working fine
because of an additional check that was made prior to deleting.
However, that check got
Hi,
Will it be a problem if the staging directory is already deleted? Even if
the directory doesn't exist, fs.delete(stagingDirPath, true) won't cause a
failure but will just return false.
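That "return false instead of failing" contract, and the pre-2.1 check-then-delete style discussed above, can be sketched with a runnable stand-in. Hadoop's FileSystem is not assumed to be available here, so the sketch uses java.nio.file, whose Files.deleteIfExists has the same contract for a missing path; the helper names are hypothetical, not Spark's, and unlike fs.delete(path, true) this stand-in is not recursive.

```scala
import java.nio.file.{Files, Path}

// Pre-check style: look before deleting, so a filesystem that logs a
// warning when asked to delete a missing path stays quiet.
def guardedDelete(p: Path): Boolean =
  if (Files.exists(p)) Files.deleteIfExists(p) else false

// Plain style: delete unconditionally; a missing path is absorbed
// as a `false` return value rather than an error.
def plainDelete(p: Path): Boolean = Files.deleteIfExists(p)

// A path that is guaranteed not to exist.
val missing = Files.createTempDirectory("demo").resolve("does-not-exist")
println(guardedDelete(missing)) // false: nothing to delete, no attempt made
println(plainDelete(missing))   // false: attempt made, absence absorbed
```

Both calls report false, but only the plain style would trigger a warn/error log on a DFS that complains about deleting missing paths.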
Rostyslav Sotnychenko wrote:
> Hi all!
>
> I am a bit confused why Spark AM and Client are both trying to
scala> org.apache.hadoop.fs.FileSystem.getLocal(sc.hadoopConfiguration)
res0: org.apache.hadoop.fs.LocalFileSystem =
org.apache.hadoop.fs.LocalFileSystem@3f84970b
scala> res0.delete(new org.apache.hadoop.fs.Path("/tmp/does-not-exist"), true)
res3: Boolean = false
Does that explain your
Are you actually seeing a problem or just questioning the code?
I have never seen a situation where there's a failure because of that
part of the current code.
On Fri, Jan 13, 2017 at 3:24 AM, Rostyslav Sotnychenko wrote:
> Hi all!
>
> I am a bit confused why Spark AM
Hi all!
I am a bit confused why Spark AM and Client are both trying to delete
Staging Directory.
https://github.com/apache/spark/blob/branch-2.1/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L1110
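On the double-deletion question itself: since the delete call reports a missing path as false rather than raising an error, cleanup is idempotent, so it is harmless (on HDFS at least) for both the AM and the Client to attempt it. A minimal illustration of that idempotency, again using java.nio.file as a runnable stand-in for the Hadoop FileSystem; cleanupStagingDir is a hypothetical helper, and unlike fs.delete(path, true) this stand-in does not delete recursively.

```scala
import java.nio.file.{Files, Path}

// Hypothetical cleanup that both the "AM" and the "Client" could call:
// whichever runs second just gets `false` back, not a failure.
def cleanupStagingDir(stagingDir: Path): Boolean =
  Files.deleteIfExists(stagingDir)

val staging = Files.createTempDirectory("staging-demo")
val firstAttempt  = cleanupStagingDir(staging) // true: directory removed
val secondAttempt = cleanupStagingDir(staging) // false: already gone
println(s"first=$firstAttempt second=$secondAttempt")
// prints "first=true second=false"
```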