I installed Hortonworks HDP 2.6.2 with Ranger and the Hive plugin using Ambari (no Kerberos/LDAP).
I created a Linux user called "henry" and assigned him to some groups.
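Roughly like this (the group name here is a placeholder, not the actual group I used):

```shell
# Create the OS user and add him to an existing group
# ("analysts" is a placeholder group name).
sudo useradd henry
sudo usermod -aG analysts henry

# Verify the user and his group memberships.
id henry
```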
I then created several Hive tables through Beeline as the sudo user that installed the software.
I connected to Beeline as "henry" and ran "show databases"; I got back a "no permissions" error, as expected.
Next, I created a policy in Ranger granting henry the SELECT privilege on a table in one of the Hive databases.
Now when henry connects to Beeline:
"show databases" returns the database containing the table henry has SELECT on.
"show tables" returns the table henry has been granted SELECT on.
However, when henry tries to SELECT from that table, he gets a "no SELECT privilege" error.
I have tried the same exercise with different users, tables, and privileges, and the DML operations never succeed.
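Concretely, the session looks like this (assuming the default HiveServer2 URL and port; "customer" is the table covered by the policy, per the log below):

```shell
# Connect to HiveServer2 as henry (no Kerberos, so -n just passes the user name).
beeline -u "jdbc:hive2://localhost:10000/default" -n henry

# Inside Beeline:
#   show databases;            -- lists the database containing "customer"
#   show tables;               -- lists "customer"
#   select * from customer;    -- fails with the HiveAccessControlException below
```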
I checked the logs, and it looks like Hive is contacting Ranger to check privileges as expected:
2018-04-27 23:48:19,349 ERROR [HiveServer2-Handler-Pool: Thread-91]: ql.Driver (SessionState.java:printError(993)) - FAILED: HiveAccessControlException Permission denied: user [henry] does not have [SELECT] privilege on [default/customer]
org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [henry] does not have [SELECT] privilege on [default/customer]
    at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:460)
    at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:856)
    at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:644)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:511)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:321)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1221)
    at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1215)
    at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:146)
    at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:226)
    at org.apache.hive.service.cli.operation.Operation.run(Operation.java:264)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:470)
    at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:457)
    at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:313)
    at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:509)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1317)
    at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1302)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
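If it helps diagnose, I can also dump the policy set that the Hive plugin downloads from the Ranger admin server (a rough sketch; the host, credentials, and the service name "clustername_hive" are placeholders for my setup):

```shell
# Fetch the policies HiveServer2's Ranger plugin actually pulls down.
# Host, credentials, and service name below are placeholders for my cluster.
curl -u admin:admin \
  "http://ranger-host:6080/service/plugins/policies/download/clustername_hive"
```

If my new SELECT policy shows up in that output, the plugin should be seeing it.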
I am at a loss on how to proceed. Any suggestions?
Bert