You can use the Linux command hadoop fs -put to push files from the local
filesystem into HDFS, and hadoop fs -get to retrieve them:

http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html#put
http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html#get
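
For example, a typical round trip looks like this (the /user/steve paths and
filenames below are just placeholders, not anything from your cluster):

```shell
# Push a local file into an HDFS directory.
hadoop fs -put /tmp/access.log /user/steve/logs/access.log

# Pull it back out of HDFS to the local filesystem.
hadoop fs -get /user/steve/logs/access.log /tmp/access.log.copy

# List the HDFS directory to confirm the upload.
hadoop fs -ls /user/steve/logs
```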

These work fine for single files or one-offs. If you need to ingest larger
volumes, concatenate many small files, or want a higher degree of durability,
you might want to look at something like Flume
(https://github.com/cloudera/flume/wiki) or one of the vendor solutions
such as MapR's.
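
For the concatenation case specifically, the shell also has a built-in
getmerge command that combines all files under an HDFS directory into a
single local file (paths again illustrative):

```shell
# Concatenate every file under an HDFS directory into one local file.
hadoop fs -getmerge /user/steve/logs /tmp/merged-logs.txt
```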


On Tue, Nov 22, 2011 at 2:28 PM, Steve Ed <sediso...@gmail.com> wrote:

> Sorry for this novice question. I am trying to find the best way of moving
> (copying) data in and out of HDFS. There are a bunch of tools available and
> I need to pick the one that offers the easiest way. I have seen MapR's
> presentation; they claim to offer direct NFS mounts to feed data into HDFS.
>
> Is there anything similar available for Apache HDFS?
>
> Thanks in advance