You can use the Hadoop FileSystem API to remove files: https://hadoop.apache.org/docs/current/api/org/apache/hadoop/fs/FileSystem.html#delete(org.apache.hadoop.fs.Path, boolean). As far as I know, Spark does not provide its own wrapper around the Hadoop filesystem APIs.
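Roughly, it looks like this from inside a Spark application (the path here is just a placeholder; inside a driver you would typically reuse `sc.hadoopConfiguration` rather than a fresh `Configuration`):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// In a Spark job, prefer sc.hadoopConfiguration so the HDFS settings
// (fs.defaultFS etc.) match the cluster the job is running against.
val conf = new Configuration()
val fs = FileSystem.get(conf)

// delete(path, recursive): pass recursive = true to remove a directory
// and everything under it; returns true if the path was deleted.
val deleted = fs.delete(new Path("/tmp/some/hdfs/dir"), true)
```

Note that `delete` returns `false` (rather than throwing) when the path does not exist, so it is safe to call without checking `fs.exists` first.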
On Thu, Jan 22, 2015 at 12:15 PM, LinQili <lin_q...@outlook.com> wrote:
> Hi, all
> I wonder how to delete hdfs file/directory using spark API?