Thanks, I'll give that a try.

On 9/6/10 2:02 PM, Harsh J wrote:
Java: You can use a DFSClient instance with a proper Configuration object
from just about anywhere. Basically, all that matters is the right
fs.default.name value, which points at your namenode's communication
endpoint (host:port).
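
For example, a minimal sketch using the FileSystem API (the namenode hostname, port, and file paths below are placeholders, not values from the thread), which is how a remote client typically does the equivalent of `fs -put`:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RemotePut {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the cluster's namenode.
        // Host and port here are assumptions -- use your own.
        conf.set("fs.default.name", "hdfs://namenode.example.com:8020");

        FileSystem fs = FileSystem.get(conf);
        // Copy a local file straight into HDFS from this remote machine.
        fs.copyFromLocalFile(new Path("/local/data/file.txt"),
                             new Path("/user/mark/file.txt"));
        fs.close();
    }
}
```

This runs on any machine that can reach the namenode over the network; the machine does not need to be part of the cluster itself.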

You can even use a Hadoop installation's 'bin/hadoop dfs' commands on a
remote node (without it acting as a proper cluster node, i.e. not in the
slaves or masters lists) if you want to use the scripts.
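
A sketch of that approach (namenode host, port, and paths are placeholders): install the same Hadoop version on the remote machine, point its config at the cluster, and run the usual dfs commands.

```shell
# In conf/core-site.xml on the remote machine, set fs.default.name
# to the cluster's namenode (host/port below are assumptions):
#
#   <property>
#     <name>fs.default.name</name>
#     <value>hdfs://namenode.example.com:8020</value>
#   </property>

# Then the standard commands operate against the remote cluster:
bin/hadoop dfs -put /local/data/file.txt /user/mark/file.txt
bin/hadoop dfs -ls /user/mark
```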

On 7 Sep 2010 01:43, "Mark"<static.void....@gmail.com>  wrote:

  How do I go about uploading content from a remote machine to the Hadoop
cluster? Do I have to first move the data to one of the nodes and then do a
fs -put, or is there some client I can use to just access an existing
cluster?

Thanks
