Thanks, but hadoop dfs is deprecated; hdfs dfs is the recommended way. In any case, the result is exactly the same.

use  bin/hadoop dfs -lsr har:///sample/test.har
not  hdfs dfs -ls -R har:///sample/test.har
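
For reference, these are the two invocations side by side (same HAR path in both; -lsr is just the older spelling of -ls -R):

  bin/hadoop dfs -lsr har:///sample/test.har
  hdfs dfs -ls -R har:///sample/test.har

Both fail here with the same "No such file or directory" error.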



On Tue, Oct 2, 2012 at 11:42 AM, Alexander Hristov <al...@planetalia.com> wrote:

    Hello

    I'm trying to test the Hadoop archive functionality under 0.23 and
    I can't get it working.

    I have a /test folder in HDFS with several text files. I created
    a Hadoop archive using

    hadoop archive -archiveName test.har -p /test *.txt  /sample

    Ok, this creates /sample/test.har with the appropriate parts
    (_index, _SUCCESS, _masterindex, part-0). Performing a cat on
    _index shows the text files.
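
    In case it helps, this is roughly how I'm inspecting the archive
    layout (the HAR is just a directory in HDFS, so the plain HDFS path
    works for this part):

    hdfs dfs -ls /sample/test.har
    hdfs dfs -cat /sample/test.har/_index

    and the index lists the .txt files as expected.
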
    However, when I even try to list the contents of the HAR file using

    hdfs dfs -ls -R har:///sample/test.har

    I simply get "har:///sample/test.har : No such file or directory"!
    WTF?

    Accessing the individual files does work, however:

    hdfs dfs -cat har:///sample/test.har/file.txt

    works

    Regards

    Alexander


