You will need to use the HDFS API to do that.
Try something like:
val conf = sc.hadoopConfiguration
val fs = org.apache.hadoop.fs.FileSystem.get(conf)
fs.rename(new org.apache.hadoop.fs.Path("/path/on/hdfs/file.txt"),
  new org.apache.hadoop.fs.Path("/path/on/hdfs/other/file.txt"))
Full API for
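A slightly fuller sketch of the same approach (assuming a SparkContext named sc is in scope and the paths are placeholders; note that FileSystem.rename returns false rather than throwing when it cannot move the file, e.g. when the source is missing or the destination already exists):

```scala
import org.apache.hadoop.fs.{FileSystem, Path}

// Get a FileSystem handle from the job's Hadoop configuration
val conf = sc.hadoopConfiguration
val fs = FileSystem.get(conf)

val src = new Path("/path/on/hdfs/file.txt")
val dst = new Path("/path/on/hdfs/other/file.txt")

// rename() is a metadata-only move on HDFS: no data is copied,
// so it is cheap even for large files, but src and dst must be
// on the same FileSystem instance
if (!fs.rename(src, dst)) {
  throw new java.io.IOException(s"Failed to move $src to $dst")
}
```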
For some file on HDFS, I need to copy/move it to another specific HDFS
directory, with the name kept unchanged. I need to do this from within the
Spark program, not with HDFS commands. Is there any code for this? I could
not find a way by searching the Spark docs ...
Thanks in advance!
My guess is No, unless you are okay to read the data and write it back
again.
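The read-and-write-back approach this reply alludes to would look roughly like the sketch below (an assumption about what was meant, not anyone's actual code; note that saveAsTextFile writes a directory of part-files, so the result is a copy of the content but not of the single-file layout):

```scala
// Copy by re-reading and re-writing through Spark itself.
// The paths are placeholders; the output path must not already exist.
val data = sc.textFile("/path/on/hdfs/file.txt")
data.saveAsTextFile("/path/on/hdfs/other/file_copy")
```

This also shuffles the data through the executors, so for a plain move the FileSystem.rename approach shown earlier in the thread is far cheaper.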
On Tue, Jan 5, 2016 at 2:07 PM, Zhiliang Zhu wrote:
>
> For some file on hdfs, it is necessary to copy/move it to some another
> specific hdfs directory, and the directory name would keep