Hi Wei -

In general, settings changes aren't applied until the Hadoop daemons are
restarted.  It sounds like someone enabled permissions previously, but the
change didn't take effect until your cluster was rebooted.

cheers,
-James

On Mon, Apr 25, 2011 at 1:19 AM, Peng, Wei <wei.p...@xerox.com> wrote:

> I forgot to mention that Hadoop was running fine before.
> However, since it crashed last week, the restarted cluster has had
> these permission issues.
> That means the settings are still the same as before.
> So what could be the cause?
>
> Wei
>
> -----Original Message-----
> From: James Seigel [mailto:ja...@tynt.com]
> Sent: Sunday, April 24, 2011 5:36 AM
> To: common-user@hadoop.apache.org
> Subject: Re: HDFS permission denied
>
> Check where the hadoop tmp setting is pointing to.
>
> James
>
> Sent from my mobile. Please excuse the typos.
>
> On 2011-04-24, at 12:41 AM, "Peng, Wei" <wei.p...@xerox.com> wrote:
>
> > Hi,
> >
> >
> >
> > I badly need help.
> >
> >
> >
> > I get an HDFS permission error when starting a Hadoop job:
> >
> > org.apache.hadoop.security.AccessControlException: Permission denied:
> >
> > user=wp, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
> >
> >
> >
> > I have permission to read and write files in my own HDFS user
> > directory.
> >
> > It works fine when I use hadoop fs -put. The job input and output are
> > all from my own hadoop user directory.
> >
> >
> >
> > It seems that when a job starts running, some data needs to be written
> > to a directory that I don't have permission for. Strangely, the inode
> > in the error message doesn't show which directory it is.
> >
> >
> >
> > Why does Hadoop silently write to a directory on my behalf? Do I
> > need to be added to a particular user group?
> >
> >
> >
> > Many Thanks..
> >
> >
> >
> > Vivian
> >
> >
> >
> >
> >
>
