On Fri, Nov 20, 2009 at 1:25 PM, Jeff Zhang <zjf...@gmail.com> wrote:

> On the client machine, configure core-site.xml, hdfs-site.xml and mapred-site.xml
> the same way as on the Hadoop cluster.
>
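For reference, a minimal sketch of the client-side configuration Jeff describes, using the 0.20-era property names; the hostnames and ports below are placeholders, not values from this thread:

```xml
<!-- core-site.xml: points the client at the cluster's NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: points the client at the JobTracker -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker-host:9001</value>
  </property>
</configuration>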

If you are referring to the "client machine" as the place from which I launch M/R
jobs:

There is no client machine as such. It is just one machine I am experimenting
on (pseudo-distributed mode).
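For what it's worth, the usual sketch in this situation is for the user who started the daemons (user1, who acts as the HDFS superuser in 0.20) to create a home directory for the second user; the user names below are illustrative, taken from the question:

```shell
# Run as user1, who started the daemons and is therefore
# the HDFS superuser in 0.20.
hadoop fs -mkdir /user/user2             # home directory for user2
hadoop fs -chown user2:user2 /user/user2 # hand ownership to user2

# user2 can now create and delete files under /user/user2
# and submit M/R jobs from the same machine.
```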

> then you can run dfs shell commands on the client machine
>
>
> Jeff Zhang
>
>
>
> On Fri, Nov 20, 2009 at 3:38 PM, Siddu <siddu.s...@gmail.com> wrote:
>
> > Hello all,
> >
> > I am not sure if the question is framed right!
> >
> > Let's say user1 launches an instance of Hadoop on a *single node*, and
> hence
> > he has permission to create/delete files on HDFS and to launch M/R jobs.
> >
> > Now, what should I do if user2 wants to use the same instance of Hadoop
> > launched by user1, and needs permission to create/delete files on HDFS
> > or to launch M/R jobs?
> >
> > I am using Hadoop version 0.20, with Ubuntu as the host machine.
> >
> > Thanks for any inputs.
> >
> >
> > --
> > Regards,
> > ~Sid~
> > I have never met a man so ignorant that I couldn't learn something from him
> >
>



-- 
Regards,
~Sid~
I have never met a man so ignorant that I couldn't learn something from him
