You could set up an RPC server on a machine that does have Hadoop installed.
Your clients would then submit RPC requests to that machine, and the RPC
server would resubmit the job to Hadoop.
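
Very roughly, something like this (just a sketch, not tested code; the port,
wire format, and error handling are made up, and it assumes the hadoop
command is on the gateway machine's PATH):

  import java.io.BufferedReader;
  import java.io.IOException;
  import java.io.InputStreamReader;
  import java.net.ServerSocket;
  import java.net.Socket;

  // Toy gateway: accepts a connection, reads the path of an already-uploaded
  // job jar plus its main class, and resubmits it via the local hadoop client.
  public class JobGateway {
    public static void main(String[] args) throws IOException {
      ServerSocket server = new ServerSocket(9999); // arbitrary port
      while (true) {
        Socket client = server.accept();
        BufferedReader in = new BufferedReader(
            new InputStreamReader(client.getInputStream()));
        String jar = in.readLine();       // e.g. /tmp/uploads/myjob.jar
        String mainClass = in.readLine(); // e.g. MyJob
        new ProcessBuilder("hadoop", "jar", jar, mainClass).start();
        // A real server would stream the job's output back to the client
        // and handle failures; this only kicks the job off.
        client.close();
      }
    }
  }

Any RPC mechanism works here (RMI, XML-RPC, plain sockets); the point is
just that the machine actually talking to Hadoop has a full installation.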

-Michael

On 5/23/08 2:10 PM, "Natarajan, Senthil" <[EMAIL PROTECTED]> wrote:

> The client machine doesn't have Hadoop installed, and it is not a slave
> machine. The data and task nodes are not visible from the client machine.
>
> In this scenario, how do we load data into HDFS and submit the MapReduce
> job from the client?
> Is it possible?
>
> If not, what is the minimal setup needed so that data and jobs can be
> submitted remotely from the client machine?
>
> Thanks,
> Senthil
>
> -----Original Message-----
> From: Ted Dunning [mailto:[EMAIL PROTECTED]
> Sent: Friday, May 23, 2008 4:52 PM
> To: core-user@hadoop.apache.org; '[EMAIL PROTECTED]'
> Subject: Re: Remote Job Submission
>
>
> Both are possible.  You may need access to the data and task nodes for
> some operations.  If you can see all of the nodes in your cluster, you
> should be able to do everything.
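>
> For example, if the client machine has the Hadoop jars on its classpath
> and can reach the NameNode and JobTracker, something like this should
> work (a sketch only; the hostnames, ports, and paths below are
> placeholders for your cluster's real addresses):
>
>   import org.apache.hadoop.fs.Path;
>   import org.apache.hadoop.mapred.FileInputFormat;
>   import org.apache.hadoop.mapred.FileOutputFormat;
>   import org.apache.hadoop.mapred.JobClient;
>   import org.apache.hadoop.mapred.JobConf;
>
>   public class RemoteSubmit {
>     public static void main(String[] args) throws Exception {
>       JobConf conf = new JobConf();
>       // Point the client at the remote cluster instead of local defaults.
>       conf.set("fs.default.name", "hdfs://namenode.example.com:9000");
>       conf.set("mapred.job.tracker", "jobtracker.example.com:9001");
>       conf.setJar("myjob.jar"); // jar containing the map/reduce classes
>       // ... set mapper and reducer classes here as usual ...
>       FileInputFormat.setInputPaths(conf, new Path("/user/senthil/input"));
>       FileOutputFormat.setOutputPath(conf, new Path("/user/senthil/output"));
>       JobClient.runJob(conf); // blocks until the job finishes
>     }
>   }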
>
>
> On 5/23/08 1:46 PM, "Natarajan, Senthil" <[EMAIL PROTECTED]> wrote:
>
>> Hi,
>> I was wondering whether it is possible to submit a MapReduce job to a
>> remote Hadoop cluster.
>>
>> (i.e., submitting the job from a machine that doesn't have Hadoop
>> installed to a different machine where Hadoop is installed.)
>> Is it possible to do this?
>>
>> I guess the data, at least, can be uploaded to HDFS remotely through a
>> Java program, right?
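>>
>> Something like this is what I had in mind (assuming the client can reach
>> the NameNode; the hostname and paths are just examples):
>>
>>   import org.apache.hadoop.conf.Configuration;
>>   import org.apache.hadoop.fs.FileSystem;
>>   import org.apache.hadoop.fs.Path;
>>
>>   public class HdfsUpload {
>>     public static void main(String[] args) throws Exception {
>>       Configuration conf = new Configuration();
>>       conf.set("fs.default.name", "hdfs://namenode.example.com:9000");
>>       FileSystem fs = FileSystem.get(conf);
>>       // Copy a local file into HDFS on the remote cluster.
>>       fs.copyFromLocalFile(new Path("/local/data.txt"),
>>                            new Path("/user/senthil/data.txt"));
>>     }
>>   }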
>>
>> Thanks,
>> Senthil
>
