Here is the policy definition:

{
  "id": 18,
  "guid": "65c08933-0ada-4577-b8bbe866c36e519",
  "isEnabled": true,
  "createdBy": "Admin",
  "updatedBy": "Admin",
  "createTime": 1525901029063,
  "updateTime": 1525930710922,
  "version": 2,
  "service": "hdp_hive",
  "name": "abc_test",
  "policyType": 0,
  "description": "",
  "resourceSignature": "c15fb5ec332fe9828faa045100214819562b49f9880a0dd40ea339703fc73991",
  "isAuditEnabled": true,
  "resources": {
    "database": { "values": ["default"], "isExcludes": false, "isRecursive": false },
    "column": { "values": ["all"], "isExcludes": false, "isRecursive": false },
    "table": { "values": ["customer"], "isExcludes": false, "isRecursive": false }
  },
  "policyItems": [
    {
      "accesses": [{ "type": "select", "isAllowed": true }],
      "users": ["mary", "henry"],
      "groups": [],
      "conditions": [],
      "delegateAdmin": false
    }
  ],
  "denyPolicyItems": [],
  "allowExceptions": [],
  "denyExceptions": [],
  "dataMaskPolicyItems": [],
  "rowFilterPolicyItems": []
}
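For what it's worth, the policyItems above do list henry with an allowed select. As a quick sanity check, here is a small Python sketch (over a trimmed copy of the policy JSON) that extracts every user granted an allowed "select" by any policy item:

```python
import json

# Trimmed copy of the policy above; only the fields needed for this check.
policy = json.loads('''
{
  "service": "hdp_hive",
  "name": "abc_test",
  "isEnabled": true,
  "policyItems": [
    {
      "accesses": [{"type": "select", "isAllowed": true}],
      "users": ["mary", "henry"],
      "groups": []
    }
  ]
}
''')

# Collect every user granted an allowed "select" access by any policy item.
select_users = {
    user
    for item in policy["policyItems"]
    for access in item["accesses"]
    if access["type"] == "select" and access["isAllowed"]
    for user in item["users"]
}
print(sorted(select_users))  # prints ['henry', 'mary']
```

So the grant itself looks right, which points at something between HiveServer2 and the plugin rather than the policy definition.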

> show databases;
+----------------+--+
| database_name  |
+----------------+--+
| default        |
| tpcds          |
+----------------+--+
2 rows selected (0.358 seconds)

> show tables;
+-----------+--+
| tab_name  |
+-----------+--+
| customer  |
+-----------+--+
1 row selected (0.207 seconds)

> select count(*) from customer;
Error: Error while compiling statement: FAILED: HiveAccessControlException 
Permission denied: user [henry] does not have [SELECT] privilege on 
[default/customer] (state=42000,code=40000)

> select * from customer limit 2;
Error: Error while compiling statement: FAILED: HiveAccessControlException 
Permission denied: user [henry] does not have [SELECT] privilege on 
[default/customer/*] (state=42000,code=40000)

I am unable to view audit log data; the Audit page returns "Error connecting to 
the search engine". I am using HDFS instead of Solr for audit storage.
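Until the Audit UI is pointed at a working search backend, the Deny events can still be read straight out of the HDFS audit files. A minimal sketch, assuming the default destination (something like /ranger/audit/hiveServer2/<yyyymmdd>/, governed by xasecure.audit.destination.hdfs.dir in your setup) with one JSON record per line; the field names (reqUser, access, resource, result) are an assumption to verify against your actual files:

```python
import json

def denied_events(lines):
    """Yield (user, access, resource) for each denied Ranger audit record.

    Assumes the JSON layout Ranger writes to its HDFS audit files, where
    result == 0 marks a denied request; check these field names against
    your own audit lines.
    """
    for line in lines:
        line = line.strip()
        if not line:
            continue
        rec = json.loads(line)
        if rec.get("result") == 0:
            yield rec.get("reqUser"), rec.get("access"), rec.get("resource")

# Two hypothetical audit lines standing in for output of, e.g.:
#   hdfs dfs -cat /ranger/audit/hiveServer2/20180509/*
sample = [
    '{"reqUser":"henry","access":"SELECT","resource":"default/customer","result":0}',
    '{"reqUser":"mary","access":"SELECT","resource":"default/customer","result":1}',
]
for user, access, resource in denied_events(sample):
    print(f"DENY user={user} access={access} resource={resource}")
    # prints: DENY user=henry access=SELECT resource=default/customer
```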

    Bert

From: Madhan Neethiraj [mailto:[email protected]]
Sent: Wednesday, May 9, 2018 6:24 PM
To: [email protected]
Subject: Re: Unable to get ranger policies to work

Roberta,

Can you please add details of the policy you created and the query executed? 
Also, it will help to look at the contents of the audit log that shows ‘Deny’ 
for the query.

Madhan



From: Roberta Marton <[email protected]>
Reply-To: "[email protected]" <[email protected]>
Date: Wednesday, May 9, 2018 at 2:44 PM
To: "[email protected]" <[email protected]>
Subject: Unable to get ranger policies to work

I installed Hortonworks 2.6.2 with Ranger and the Hive plugin using Ambari (no 
Kerberos/LDAP).
I created a Linux user called henry and assigned him to some groups.
I then created several Hive tables through Beeline as the sudo user that 
installed the software.

I connected to Beeline as "henry" and ran "show databases". I got back a "no 
permissions" error, as expected.

I created a policy in Ranger and granted Henry “select” privilege on a table in 
one of the Hive databases.

Henry connects to beeline.
"show databases" returns the database containing the table on which Henry now 
has select privilege.
"show tables" returns the table on which Henry has been granted select privilege.
However, when Henry tries to select from that table, he gets a "no SELECT 
privilege" error.

I have tried the same exercise with different users, tables, and privileges, and 
the DML operations never succeed.

I checked the logs and it looks like Hive is contacting Ranger to get 
privileges as expected:

2018-04-27 23:48:19,349 ERROR [HiveServer2-Handler-Pool: Thread-91]: ql.Driver (SessionState.java:printError(993)) - FAILED: HiveAccessControlException Permission denied: user [henry] does not have [SELECT] privilege on [default/customer]
org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [henry] does not have [SELECT] privilege on [default/customer]
        at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:460)
        at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:856)
        at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:644)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:511)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:321)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1221)
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1215)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:146)
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:226)
        at org.apache.hive.service.cli.operation.Operation.run(Operation.java:264)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:470)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:457)
        at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:313)
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:509)
        at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1317)
        at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1302)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

I am at a loss on how to proceed.  Any suggestions?

   Bert
