[ https://issues.apache.org/jira/browse/HIVE-20001?focusedWorklogId=476979&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-476979 ]

ASF GitHub Bot logged work on HIVE-20001:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 01/Sep/20 03:26
            Start Date: 01/Sep/20 03:26
    Worklog Time Spent: 10m 
      Work Description: cravani commented on pull request #1450:
URL: https://github.com/apache/hive/pull/1450#issuecomment-684173629


   Raised by mistake, please ignore.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 476979)
    Time Spent: 40m  (was: 0.5h)

> With doas set to true, running select query as hrt_qa user on external table 
> fails due to permission denied to read /warehouse/tablespace/managed 
> directory.
> ------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-20001
>                 URL: https://issues.apache.org/jira/browse/HIVE-20001
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 3.0.0
>            Reporter: Jaume M
>            Assignee: Jaume M
>            Priority: Critical
>              Labels: pull-request-available
>         Attachments: HIVE-20001.1.patch, HIVE-20001.1.patch, 
> HIVE-20001.2.patch, HIVE-20001.3.patch, HIVE-20001.4.patch, HIVE-20001.5.patch
>
>          Time Spent: 40m
>  Remaining Estimate: 0h
>
> Hive: With doAs set to true, running a select query as the hrt_qa user on an 
> external table fails due to permission denied when reading the 
> /warehouse/tablespace/managed directory.
> Steps: 
> 1. Create an external table.
> 2. Set doAs to true (see the configuration sketch below).
> 3. Run select count(*) as user hrt_qa.
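> For reference, doAs is enabled through the standard HiveServer2 property shown 
> below; this is a minimal hive-site.xml sketch, not the exact configuration used 
> on this cluster.
> {code}
> <!-- hive-site.xml: run queries as the connected end user instead of the hive service user -->
> <property>
>   <name>hive.server2.enable.doAs</name>
>   <value>true</value>
> </property>
> {code}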
> Table creation query:
> {code}
> beeline -n hrt_qa -p pwd -u 
> "jdbc:hive2://ctr-e138-1518143905142-375925-01-000006.hwx.site:2181,ctr-e138-1518143905142-375925-01-000005.hwx.site:2181,ctr-e138-1518143905142-375925-01-000007.hwx.site:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit"
>  --outputformat=tsv -e "drop table if exists test_table purge;
> create external table test_table(id int, age int) row format delimited fields 
> terminated by '|' stored as textfile;
> load data inpath '/tmp/table1.dat' overwrite into table test_table;"
> {code}
> The select count(*) query execution fails:
> {code}
> beeline -n hrt_qa -p pwd -u 
> "jdbc:hive2://ctr-e138-1518143905142-375925-01-000006.hwx.site:2181,ctr-e138-1518143905142-375925-01-000005.hwx.site:2181,ctr-e138-1518143905142-375925-01-000007.hwx.site:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit"
>  --outputformat=tsv -e "select count(*) from test_table where age>30 and 
> id<10100;"
> 2018-06-22 10:22:29,328|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|SLF4J: Class path contains 
> multiple SLF4J bindings.
> 2018-06-22 10:22:29,330|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|SLF4J: See 
> http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2018-06-22 10:22:29,335|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|SLF4J: Actual binding is of 
> type [org.apache.logging.slf4j.Log4jLoggerFactory]
> 2018-06-22 10:22:31,408|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|Format tsv is deprecated, 
> please use tsv2
> 2018-06-22 10:22:31,529|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|Connecting to 
> jdbc:hive2://ctr-e138-1518143905142-375925-01-000006.hwx.site:2181,ctr-e138-1518143905142-375925-01-000005.hwx.site:2181,ctr-e138-1518143905142-375925-01-000007.hwx.site:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit
> 2018-06-22 10:22:32,031|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|18/06/22 10:22:32 [main]: 
> INFO jdbc.HiveConnection: Connected to 
> ctr-e138-1518143905142-375925-01-000004.hwx.site:10001
> 2018-06-22 10:22:34,130|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|18/06/22 10:22:34 [main]: 
> WARN jdbc.HiveConnection: Failed to connect to 
> ctr-e138-1518143905142-375925-01-000004.hwx.site:10001
> 2018-06-22 10:22:34,244|INFO|Thread-126|machine.py:111 - 
> tee_pipe()||b3a493ec-99be-483e-91fe-4b701ec27ebc|18/06/22 10:22:34 [main]: 
> WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: 
> jdbc:hive2://ctr-e138-1518143905142-375925-01-000004.hwx.site:10001/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2;principal=hive/_h...@example.com;transportMode=http;httpPath=cliservice;ssl=true;sslTrustStore=/etc/security/serverKeys/hivetruststore.jks;trustStorePassword=changeit:
>  Failed to open new session: 
> org.apache.hadoop.hive.ql.metadata.HiveException: 
> MetaException(message:java.security.AccessControlException: Permission 
> denied: user=hrt_qa, access=READ, 
> inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
> {code}
> Warehouse directory listing:
> {code}
> -bash-4.2$ hdfs dfs -ls /warehouse/tablespace/
> Found 2 items
> drwxr-xr-x   - hdfs hdfs          0 2018-06-22 07:01 
> /warehouse/tablespace/external
> drwxr-xr-x   - hdfs hdfs          0 2018-06-22 07:01 
> /warehouse/tablespace/managed
> -bash-4.2$ hdfs dfs -ls /warehouse/tablespace/managed/hive
> Found 5 items
> drwxrwx---+  - hive hadoop          0 2018-06-22 09:28 
> /warehouse/tablespace/managed/hive/all10kw
> drwxrwx---+  - hive hadoop          0 2018-06-22 09:24 
> /warehouse/tablespace/managed/hive/hive8295
> drwxrwx---+  - hive hadoop          0 2018-06-22 07:20 
> /warehouse/tablespace/managed/hive/information_schema.db
> drwxrwxrwx+  - hive hadoop          0 2018-06-22 07:20 
> /warehouse/tablespace/managed/hive/sys.db
> drwxrwx---+  - hive hadoop          0 2018-06-22 09:27 
> /warehouse/tablespace/managed/hive/tbl1002
> -bash-4.2$ hdfs dfs -ls /warehouse/tablespace/external/hive
> Found 2 items
> drwxr-xr-x+  - hive hadoop          0 2018-06-22 07:02 
> /warehouse/tablespace/external/hive/sys.db
> drwxrwxrwx+  - hive hadoop          0 2018-06-22 10:12 
> /warehouse/tablespace/external/hive/test_table
> -bash-4.2$ exit
> logout
> {code}
> It looks like the code still assumes external tables are located under the 
> '/warehouse/tablespace/managed' directory, similar to the earlier single 
> '/apps/hive/warehouse' layout. 
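> For comparison, Hive 3 configures the managed and external warehouse roots 
> separately; the sketch below is an assumption of the expected split (property 
> names taken from the metastore configuration, paths matching this cluster's 
> layout, values illustrative only).
> {code}
> <!-- managed (ACID) tables: root readable only by the hive service user -->
> <property>
>   <name>hive.metastore.warehouse.dir</name>
>   <value>/warehouse/tablespace/managed/hive</value>
> </property>
> <!-- external tables: should resolve against this root, readable by end users -->
> <property>
>   <name>hive.metastore.warehouse.external.dir</name>
>   <value>/warehouse/tablespace/external/hive</value>
> </property>
> {code}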



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
