[ https://issues.apache.org/jira/browse/DRILL-5565?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ehur updated DRILL-5565:
------------------------
    Environment: CentOS release 6.8

> Directory Query fails with Permission denied: access=EXECUTE if dirN name is 
> 'year=2017' or 'month=201704'
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: DRILL-5565
>                 URL: https://issues.apache.org/jira/browse/DRILL-5565
>             Project: Apache Drill
>          Issue Type: Bug
>          Components: Functions - Drill, SQL Parser
>    Affects Versions: 1.6.0
>         Environment: CentOS release 6.8
>            Reporter: ehur
>
> Running a query like the following works fine when the dir0 name contains 
> only numerics:
> select * from all.my.records
> where dir0 >= '20170322'
> limit 10;
> If the dirN directory is named according to the convention year=2017, we get 
> one of the following problems:
> 1. Either a "SYSTEM ERROR: Permission denied" is raised when the where clause 
> uses the full directory name:
> select * from all.my.records
> where dir0 >= 'year=2017'
> limit 10;
>  SYSTEM ERROR: RemoteException: Permission denied: user=myuser, access=EXECUTE, inode="/user/myuser/all/my/records/year=2017/month=201701/day=20170101/application_1485464650247_1917/part-r-00000.gz.parquet":myuser:supergroup:-rw-r--r--
>       at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
>       at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
>       at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
>       at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
>       at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6609)
>       at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4223)
>       at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:894)
>       at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
>       at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
>       at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>       at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
>       at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
>       at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
>       at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
>       at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
> 2. Or, if the where clause specifies only the numeric portion of the 
> directory name, the query does not fail, but it also does not return the 
> relevant data, since that value does not match the actual directory path:
> select * from all.my.records
> where dir0 >= '2017'
> limit 10;
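
A possible workaround sketch (an illustrative addition, not part of the original report): assuming the table's top-level directories follow the year=NNNN convention shown in the error above, and that Drill's SUBSTR string function can be applied to dirN partition columns, the numeric portion of the directory name could be extracted before the comparison:

select * from all.my.records
where substr(dir0, 6) >= '2017'   -- hypothetical: strips the 'year=' prefix from dir0
limit 10;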



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
