Sonal,

Hmm... thanks for your reply! I was not aware that DFS security is being used here (Hadoop 0.20.2 + HBase 0.20.3). Is the top-level directory you mention the one related to the HBase rootdir below, and are you suggesting I create the following?

bin/hadoop fs -mkdir /activity/hbase
bin/hadoop fs -chown hbase /activity/hbase

Below is the definition of the root dir:

<name>hbase.rootdir</name>
<value>hdfs://hadoopStressServer/activity/hbase</value>
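Since the rootdir above lives under /activity/hbase rather than the default /hbase, I assume the fix would look something like the sketch below (run from the Hadoop home directory; it assumes the 'hadoop' user is the HDFS superuser and that passwordless sudo to that account is available on this box):

```shell
# As the HDFS superuser (the 'hadoop' user on this cluster), create the
# directory that hbase.rootdir points at and hand ownership to 'hbase'.
# Paths and hostnames are taken from the hbase.rootdir value above.
sudo -u hadoop bin/hadoop fs -mkdir /activity/hbase
sudo -u hadoop bin/hadoop fs -chown hbase /activity/hbase

# Verify the ownership took effect before restarting the master:
bin/hadoop fs -ls /activity
```

The `-ls` line should show /activity/hbase owned by user 'hbase', which is what HMaster needs in order to create its subdirectories on startup.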
Regards,
Andy Zhong

-----Original Message-----
From: Sonal Goyal [mailto:[email protected]]
Sent: Friday, August 19, 2011 3:10 AM
To: [email protected]
Subject: Re: Not Starting HMaster because of Permission denied

Hi Andy,

I guess you are using dfs security, in which case your hbase user does not have permission to create the top-level directory /hbase in the dfs. Can you try the following and then start your master? Let us know how it goes.

bin/hadoop fs -mkdir /hbase
bin/hadoop fs -chown hbase /hbase

Best Regards,
Sonal
Crux: Reporting for HBase <https://github.com/sonalgoyal/crux>
Nube Technologies <http://www.nubetech.co>
<http://in.linkedin.com/in/sonalgoyal>

On Fri, Aug 19, 2011 at 1:02 PM, Zhong, Andy <[email protected]> wrote:
> Hi All,
>
> I am facing an issue starting HMaster because of Permission denied,
> although the two region servers seem to start properly. I would much
> appreciate it if anyone could help me figure out the root cause. I
> have Hadoop running under the 'hadoop' user, and HBase running under
> the 'hbase' user.
> Thanks,
> - Andy Zhong
>
> 2011-08-19 01:30:31,205 FATAL master.HMaster - Not starting HMaster because:
> org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=WRITE, inode="":hadoop:supergroup:rwxr-xr-x
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:96)
>         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:58)
>         at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:910)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:262)
>         at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1115)
>         at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:205)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1241)
>         at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1282)
>
> This message, including any attachments, is the property of Sears Holdings Corporation and/or one of its subsidiaries. It is confidential and may contain proprietary or legally privileged information. If you are not the intended recipient, please delete it without reading the contents. Thank you.
