That is basically what I thought, but I didn't successfully enable the debug logs. It seems my problem was that groups are case-sensitive in the Ranger plugin for Hive, so sysadmin is not recognized as SysAdmin.
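The case-sensitivity pitfall described above can be sketched as follows. This is a hypothetical Python illustration of the matching behavior, not the actual Ranger plugin code; the group names are taken from the thread below:

```python
# Hypothetical sketch of why a Ranger policy granting access to group
# "sysadmin" does not match a user whose resolved Unix groups are
# "SysAdmin", "UsrSysAdmin", ... -- an illustration, not Ranger's code.

def is_allowed(policy_groups, user_groups):
    """Case-sensitive group matching, as observed with the Ranger Hive plugin."""
    return bool(set(policy_groups) & set(user_groups))

policy_groups = ["sysadmin"]                         # group named in the Ranger policy
user_groups = ["nobody", "UsrSysAdmin", "SysAdmin"]  # groups reported by `hdfs groups toto`

print(is_allowed(policy_groups, user_groups))        # False: "SysAdmin" != "sysadmin"

# A case-insensitive comparison would have matched:
print(any(p.lower() == u.lower()
          for p in policy_groups for u in user_groups))  # True
```

So the fix in this situation is to make the group name in the Ranger policy match the case of the group as the OS reports it.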
Thanks a lot for your help :-)

Loïc

Loïc CHANEL
Engineering student at TELECOM Nancy
Trainee at Worldline - Villeurbanne

2015-07-30 19:48 GMT+02:00 Alok Lal <a...@hortonworks.com>:

> Loic,
> The answer would lie in the log entries before the lines that report the
> failure and show the exception you have produced below. Can you turn on
> debugging and post the log? Start by turning it on at the
> com.xasecure.authorization.hive level. If the cause does not jump out at
> you, then turn it up to com.xasecure.
>
>
> From: Loïc Chanel
> Reply-To: "user@ranger.incubator.apache.org"
> Date: Thursday, July 30, 2015 at 3:01 AM
> To: "user@ranger.incubator.apache.org"
> Subject: Re: Hive server identity assertion
>
> Sorry for my late answer, I had to work on a different problem.
> In the meantime, I realized that I am using Ranger 0.4, and not Ranger 0.5,
> so this problem may have been solved in Ranger 0.5.
> Here are all the logs I get when my user toto tries to access chaneldb,
> on which he should have permission to read because he belongs to the group
> sysadmin, which has all the rights (including admin) on the database:
>
> 2015-07-30 11:50:49,891 INFO [HiveServer2-Handler-Pool: Thread-48]: parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: use chaneldb
> 2015-07-30 11:50:50,295 INFO [HiveServer2-Handler-Pool: Thread-48]: parse.ParseDriver (ParseDriver.java:parse(206)) - Parse Completed
> 2015-07-30 11:50:50,297 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=parse start=1438249849885 end=1438249850297 duration=412 from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,302 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,347 INFO [HiveServer2-Handler-Pool: Thread-48]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(714)) - 2: get_database: chaneldb
> 2015-07-30 11:50:50,347 INFO [HiveServer2-Handler-Pool: Thread-48]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(340)) - ugi=toto ip=unknown-ip-addr cmd=get_database: chaneldb
> 2015-07-30 11:50:50,348 INFO [HiveServer2-Handler-Pool: Thread-48]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(557)) - 2: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 2015-07-30 11:50:50,350 INFO [HiveServer2-Handler-Pool: Thread-48]: metastore.ObjectStore (ObjectStore.java:initialize(262)) - ObjectStore, initialize called
> 2015-07-30 11:50:50,371 INFO [HiveServer2-Handler-Pool: Thread-48]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(131)) - Using direct SQL, underlying DB is MYSQL
> 2015-07-30 11:50:50,371 INFO [HiveServer2-Handler-Pool: Thread-48]: metastore.ObjectStore (ObjectStore.java:setConf(245)) - Initialized ObjectStore
> 2015-07-30 11:50:50,391 INFO [HiveServer2-Handler-Pool: Thread-48]: metadata.HiveUtils (HiveUtils.java:getMetaStoreAuthorizeProviderManagers(353)) - Adding metastore authorization provider: org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider
> 2015-07-30 11:50:50,395 INFO [HiveServer2-Handler-Pool: Thread-48]: metadata.HiveUtils (HiveUtils.java:getMetaStoreAuthorizeProviderManagers(353)) - Adding metastore authorization provider: org.apache.hadoop.hive.ql.security.authorization.MetaStoreAuthzAPIAuthorizerEmbedOnly
> 2015-07-30 11:50:50,427 INFO [HiveServer2-Handler-Pool: Thread-48]: ql.Driver (Driver.java:compile(429)) - Semantic Analysis Completed
> 2015-07-30 11:50:50,427 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=semanticAnalyze start=1438249850302 end=1438249850427 duration=125 from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,440 INFO [HiveServer2-Handler-Pool: Thread-48]: ql.Driver (Driver.java:getSchema(237)) - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> 2015-07-30 11:50:50,440 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=doAuthorization from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,486 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=doAuthorization start=1438249850440 end=1438249850486 duration=46 from=org.apache.hadoop.hive.ql.Driver>
>
> ==> /var/log/hive/hive-server2.log <==
> FAILED: HiveAccessControlException Permission denied: user [toto] does not have [USE] privilege on [chaneldb]
>
> ==> /var/log/hive/hiveserver2.log <==
> 2015-07-30 11:50:50,487 ERROR [HiveServer2-Handler-Pool: Thread-48]: ql.Driver (SessionState.java:printError(833)) - FAILED: HiveAccessControlException Permission denied: user [toto] does not have [USE] privilege on [chaneldb]
> org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [toto] does not have [USE] privilege on [chaneldb]
>     at com.xasecure.authorization.hive.authorizer.XaSecureHiveAuthorizer.checkPrivileges(XaSecureHiveAuthorizer.java:254)
>     at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:727)
>     at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:520)
>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:457)
>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:305)
>     at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1069)
>     at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1063)
>     at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:109)
>     at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:180)
>     at org.apache.hive.service.cli.operation.Operation.run(Operation.java:256)
>     at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:376)
>     at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:363)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>     at java.lang.reflect.Method.invoke(Unknown Source)
>     at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>     at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>     at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Unknown Source)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:536)
>     at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>     at com.sun.proxy.$Proxy28.executeStatementAsync(Unknown Source)
>     at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:270)
>     at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:401)
>     at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>     at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>     at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
>     at java.lang.Thread.run(Unknown Source)
>
> 2015-07-30 11:50:50,488 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=compile start=1438249849844 end=1438249850488 duration=644 from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,488 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogBegin(108)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,488 INFO [HiveServer2-Handler-Pool: Thread-48]: log.PerfLogger (PerfLogger.java:PerfLogEnd(135)) - </PERFLOG method=releaseLocks start=1438249850488 end=1438249850488 duration=0 from=org.apache.hadoop.hive.ql.Driver>
> 2015-07-30 11:50:50,490 WARN [HiveServer2-Handler-Pool: Thread-48]: thrift.ThriftCLIService (ThriftCLIService.java:ExecuteStatement(407)) - Error executing statement:
> org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [toto] does not have [USE] privilege on [chaneldb]
>     at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:314)
>     at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:111)
>     at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:180)
>     at org.apache.hive.service.cli.operation.Operation.run(Operation.java:256)
>     at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:376)
>     at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:363)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>     at java.lang.reflect.Method.invoke(Unknown Source)
>     at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:79)
>     at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:37)
>     at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:64)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Unknown Source)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:536)
>     at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:60)
>     at com.sun.proxy.$Proxy28.executeStatementAsync(Unknown Source)
>     at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:270)
>     at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:401)
>     at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
>     at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>     at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
>     at java.lang.Thread.run(Unknown Source)
> Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [toto] does not have [USE] privilege on [chaneldb]
>     at com.xasecure.authorization.hive.authorizer.XaSecureHiveAuthorizer.checkPrivileges(XaSecureHiveAuthorizer.java:254)
>     at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:727)
>     at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:520)
>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:457)
>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:305)
>     at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1069)
>     at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1063)
>     at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:109)
>     ... 28 more
>
> And when I try the command "groups" or even "hdfs groups" on the host
> running HiveServer, I get "toto : nobody UsrSysAdmin SysAdmin ..."
>
> Do you or anyone else see where the problem might come from?
> Thanks in advance,
>
> Loïc
>
> Loïc CHANEL
> Engineering student at TELECOM Nancy
> Trainee at Worldline - Villeurbanne
>
> 2015-07-24 17:30 GMT+02:00 Alok Lal <a...@hortonworks.com>:
>
>> Perhaps. It is hard to say definitively without taking a look at the logs.
>>
>> From: Loïc Chanel <loic.cha...@telecomnancy.net>
>> Reply-To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>> Date: Friday, July 24, 2015 at 8:10 AM
>> To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>> Subject: Re: Hive server identity assertion
>>
>> Exactly!
>>
>> And I've checked the logs once again, but I can't see any groups
>> mentioned. Does this reveal a special issue?
>>
>> Thanks,
>>
>> Loïc
>>
>> Loïc CHANEL
>> Engineering student at TELECOM Nancy
>> Trainee at Worldline - Villeurbanne
>>
>> 2015-07-24 16:23 GMT+02:00 Alok Lal <a...@hortonworks.com>:
>>
>>> If the user groups couldn't be asserted, would I see a log indicating
>>> that the user cannot be impersonated (like Knox prompts)?
>>>
>>> Yes, the log should show the user and group info being sent to the
>>> policy engine for authorization. I presume you are using Ranger 0.5 to
>>> connect via beeline to a HiveServer2 instance. Right? (Not that these
>>> matter, just to set context.)
>>>
>>> Thanks
>>>
>>> From: Loïc Chanel <loic.cha...@telecomnancy.net>
>>> Reply-To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>>> Date: Friday, July 24, 2015 at 12:53 AM
>>> To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>>> Subject: Re: Hive server identity assertion
>>>
>>> Well, that's what I thought, but the command hdfs groups returns a
>>> group that I use for a policy giving access to a database, and as I get
>>> the message "HiveAccessControlException Permission denied" when accessing
>>> this database, I think Hive cannot assert the groups the user belongs to.
>>>
>>> I'm using Hive 0.14.0.2.2.
>>> As the problem might come from this, I think it's important to mention
>>> that the users are synchronized from an LDAP via SSSD.
>>>
>>> If the user groups couldn't be asserted, would I see a log indicating
>>> that the user cannot be impersonated (like Knox prompts)?
>>>
>>> Thanks,
>>>
>>> Loïc
>>>
>>> Loïc CHANEL
>>> Engineering student at TELECOM Nancy
>>> Trainee at Worldline - Villeurbanne
>>>
>>> 2015-07-23 20:09 GMT+02:00 Don Bosco Durai <bo...@apache.org>:
>>>
>>>> Hive uses the same core-site.xml settings as HDFS. So if the group
>>>> mapping works in HDFS, then it should work in Hive also.
>>>>
>>>> And if the users and groups are in linux/unix, then it should have been
>>>> supported out of the box.
>>>>
>>>> What version of Hive are you using? (It shouldn't matter)
>>>>
>>>> Thanks
>>>>
>>>> Bosco
>>>>
>>>> From: Loïc Chanel <loic.cha...@telecomnancy.net>
>>>> Reply-To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>>>> Date: Thursday, July 23, 2015 at 3:10 AM
>>>> To: "user@ranger.incubator.apache.org" <user@ranger.incubator.apache.org>
>>>> Subject: Hive server identity assertion
>>>>
>>>> Hi all,
>>>>
>>>> As I am now exploring how Ranger works with Hive, I made some policies,
>>>> but it seems that group policies are not enforced.
>>>> Therefore, I was wondering how the Ranger plugin running on Hive
>>>> asserts the user's identity.
>>>>
>>>> I am even more surprised by the fact that I do not have any problem
>>>> with the Ranger plugin working on HDFS, which is running on the exact
>>>> same node.
>>>>
>>>> In parallel, I know that the Knox plugin, for example, works in a
>>>> totally different way, but as it seems that Hive, like HBase, does not
>>>> provide any user mapping function, I thought the identity would be
>>>> asserted on the node the Hive Server is running on, as if the user
>>>> were a Unix one.
>>>>
>>>> Does someone have an idea about how the user groups can be found by the
>>>> Hive Ranger plugin?
>>>> Thanks in advance,
>>>>
>>>> Loïc
>>>>
>>>> Loïc CHANEL
>>>> Engineering student at TELECOM Nancy
>>>> Trainee at Worldline - Villeurbanne
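As Bosco notes above, with Hadoop's default shell-based group mapping the groups a service sees are whatever the OS on that node reports for the user. A minimal sketch of that resolution, shelling out to `id -Gn` the way Hadoop's ShellBasedUnixGroupsMapping does (an illustration, not the actual Hadoop class):

```python
# Sketch of shell-based Unix group resolution. The key point for this
# thread: group names come back exactly as the OS reports them,
# case included, which is why "SysAdmin" did not match a "sysadmin" policy.
import getpass
import subprocess

def unix_groups(user: str) -> list[str]:
    """Return the Unix groups for `user`, as reported by `id -Gn <user>`."""
    out = subprocess.run(["id", "-Gn", user],
                         capture_output=True, text=True, check=True)
    return out.stdout.split()

# Example: resolve the groups of the current user on this host.
print(unix_groups(getpass.getuser()))
```

Since HDFS and HiveServer2 were on the same node here, both resolved the same (correctly cased) groups; the difference in behavior came from the policy-side matching, not from group resolution.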