Hi Nicholas,
Thanks, that helped.

I gave permission 777 to /user, so now user "Test" can perform HDFS operations.

I also gave permission 777 to /usr/local/hadoop/datastore on the master.
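Concretely, something like the following (the HDFS chmod run via the fs shell as user "hadoop", the local chmod on the master's filesystem; the exact invocations may have differed):

/usr/local/hadoop/bin/hadoop dfs -chmod 777 /user          # HDFS path, run as superuser "hadoop"
chmod -R 777 /usr/local/hadoop/datastore                   # local path on the master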

When user "Test" tries to submit the MapReduce job, getting this error

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: 
org.apache.hadoop.fs.permission.AccessControlException: Permission denied: 
user=test, access=WRITE, inode="datastore":hadoop:supergroup:rwxr-xr-x

Where else do I need to give permission so that user "Test" can submit jobs through the JobTracker and DataNode daemons started by user "hadoop"?
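(My guess, and it is only a guess, is that the inode "datastore" in the error is an HDFS directory, presumably wherever mapred.system.dir / hadoop.tmp.dir resolves to, rather than the local /usr/local/hadoop/datastore I already opened up. If so, the sketch would be something like:

/usr/local/hadoop/bin/hadoop dfs -chmod -R 777 /usr/local/hadoop/datastore   # hypothetical HDFS path; adjust to the actual mapred.system.dir

but I am not sure that is the right directory.)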

Thanks,
Senthil

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 07, 2008 5:49 PM
To: core-user@hadoop.apache.org
Subject: Re: Hadoop Permission Problem

Hi Senthil,

Since the path "myapps" is relative, copyFromLocal will copy the file to the 
home directory, i.e. /user/Test/myapps in your case.  If /user/Test doesn't not 
exist, it will first try to create it.  You got AccessControlException because 
the permission of /user is 755.
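In other words, running as user Test,

/usr/local/hadoop/bin/hadoop dfs -copyFromLocal /home/Test/somefile.txt myapps

writes to the absolute HDFS path /user/Test/myapps, and creating the missing /user/Test directory requires write access on /user. One way to avoid opening /user up completely (a sketch, run as the superuser "hadoop") is to pre-create the home directory and hand ownership to the user:

/usr/local/hadoop/bin/hadoop dfs -mkdir /user/Test        # create the home directory
/usr/local/hadoop/bin/hadoop dfs -chown Test /user/Test   # make user Test its owner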

Hope this helps.

Nicholas



----- Original Message ----
From: "Natarajan, Senthil" <[EMAIL PROTECTED]>
To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
Sent: Wednesday, May 7, 2008 2:36:22 PM
Subject: Hadoop Permission Problem

Hi,
My DataNode and JobTracker are started by user "hadoop".
User "Test" needs to submit jobs, but when user "Test" copies a file to HDFS, there is a permission error:
/usr/local/hadoop/bin/hadoop dfs -copyFromLocal /home/Test/somefile.txt myapps
copyFromLocal: org.apache.hadoop.fs.permission.AccessControlException: 
Permission denied: user=Test, access=WRITE, 
inode="user":hadoop:supergroup:rwxr-xr-x
Could you please let me know how users other than "hadoop" can access HDFS and then submit MapReduce jobs? Where do I configure this, or which default configuration needs to be changed?
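(One option I have come across, though I am not sure it is the right approach, is the dfs.permissions property in hadoop-site.xml, which turns HDFS permission checking off entirely when set to false:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

I would prefer a proper per-user setup if one exists.)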

Thanks,
Senthil
