Thanks John!

Here is the complete solution:


import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration jc = new Configuration();
List<String> files_in_hdfs = new ArrayList<String>();

// List everything directly under outputPath and collect the file names
FileSystem fs = FileSystem.get(jc);
FileStatus[] file_status = fs.listStatus(new Path(outputPath));
for (FileStatus fileStatus : file_status) {
  files_in_hdfs.add(fileStatus.getPath().getName());
}

String[] files = files_in_hdfs.toArray(new String[files_in_hdfs.size()]);
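
If you also want to distinguish files from sub-directories, or keep the full
HDFS paths instead of just the names (closer to what 'hadoop dfs -ls' prints),
something along these lines should work. It's just a rough sketch reusing the
'fs' and 'outputPath' variables from the snippet above; isDir() is the method
name in the 0.20/1.x API.

// Rough 'hadoop dfs -ls outputPath' equivalent: print type, size and full path
FileStatus[] entries = fs.listStatus(new Path(outputPath));
for (FileStatus entry : entries) {
  String type = entry.isDir() ? "d" : "-";
  System.out.println(type + " " + entry.getLen() + " " + entry.getPath());
}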

2011/10/10 John Conwell <j...@iamjohn.me>

> FileStatus[] files = fs.listStatus(new Path(path));
>
> for (FileStatus fileStatus : files)
>
> {
>
> //...do stuff here
>
> }
>
> On Mon, Oct 10, 2011 at 8:03 AM, Raimon Bosch <raimon.bo...@gmail.com> wrote:
>
> > Hi,
> >
> > I'm wondering how I can browse an HDFS folder using the classes
> > in the org.apache.hadoop.fs package. The operation I'm looking for is the
> > equivalent of 'hadoop dfs -ls'.
> >
> > The standard file system equivalent would be:
> >
> > File f = new File(outputPath);
> > if(f.isDirectory()){
> >  String files[] = f.list();
> >  for(String file : files){
> >    //Do your logic
> >  }
> > }
> >
> > Thanks in advance,
> > Raimon Bosch.
> >
>
>
>
> --
>
> Thanks,
> John C
>
