wecharyu commented on code in PR #6047:
URL: https://github.com/apache/hive/pull/6047#discussion_r2319480663


##########
standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/utils/HdfsUtils.java:
##########
@@ -65,57 +64,35 @@ public class HdfsUtils {
   /**
    * Check the permissions on a file.
    * @param fs Filesystem the file is contained in
-   * @param stat Stat info for the file
+   * @param path the file path
    * @param action action to be performed
    * @throws IOException If thrown by Hadoop
-   * @throws AccessControlException if the file cannot be accessed
    */
-  public static void checkFileAccess(FileSystem fs, FileStatus stat, FsAction action)
-      throws IOException, LoginException {
-    checkFileAccess(fs, stat, action, SecurityUtils.getUGI());
+  public static void checkFileAccess(FileSystem fs, Path path, FsAction action)
+      throws IOException {
+    checkFileAccess(fs, path, action, SecurityUtils.getUGI());
   }
 
   /**
    * Check the permissions on a file
    * @param fs Filesystem the file is contained in
-   * @param stat Stat info for the file
+   * @param path the file path
    * @param action action to be performed
   * @param ugi user group info for the current user.  This is passed in so that tests can pass
    *            in mock ones.
    * @throws IOException If thrown by Hadoop
-   * @throws AccessControlException if the file cannot be accessed
    */
   @VisibleForTesting
-  static void checkFileAccess(FileSystem fs, FileStatus stat, FsAction action,
+  static void checkFileAccess(FileSystem fs, Path path, FsAction action,
                               UserGroupInformation ugi) throws IOException {
-
-    String user = ugi.getShortUserName();
-    String[] groups = ugi.getGroupNames();
-
-    if (groups != null) {
-      String superGroupName = fs.getConf().get("dfs.permissions.supergroup", "");
-      if (arrayContains(groups, superGroupName)) {
-        LOG.debug("User \"" + user + "\" belongs to super-group \"" + superGroupName + "\". " +
-            "Permission granted for action: " + action + ".");
-        return;
-      }
-    }
-
-    FsPermission dirPerms = stat.getPermission();
-
-    if (user.equals(stat.getOwner())) {
-      if (dirPerms.getUserAction().implies(action)) {
-        return;
-      }
-    } else if (arrayContains(groups, stat.getGroup())) {
-      if (dirPerms.getGroupAction().implies(action)) {
-        return;
-      }
-    } else if (dirPerms.getOtherAction().implies(action)) {
-      return;
+    try {
+      ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
+        fs.access(path, action);

Review Comment:
   Yes, the HDFS `access()` API will also skip checkPermission for the super user: 
   
https://github.com/apache/hadoop/blob/4d7825309348956336b8f06a08322b78422849b1/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/namenode/FSDirectory.java#L1954-L1960
   
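For context, the manual owner/group/other walk that this patch deletes (in favor of delegating to `fs.access()`) can be sketched in plain Java. This is a simplified, self-contained illustration, not Hadoop code: permissions are modeled as POSIX-style rwx bitmasks instead of `FsPermission`/`FsAction`, and all names are hypothetical.

```java
import java.util.Arrays;

// Simplified sketch of the removed checkFileAccess logic: pick the
// owner, group, or other permission bits and test whether they imply
// the requested action. Stand-in types only, not Hadoop classes.
public class PermCheckSketch {

    // Mirrors FsAction.implies(): allowed when every requested bit is granted.
    static boolean implies(int granted, int requested) {
        return (granted & requested) == requested;
    }

    /**
     * @param perms     9-bit POSIX-style mode, e.g. 0750
     * @param requested rwx bits requested, e.g. 4 for read
     */
    static boolean allowed(String user, String[] groups,
                           String owner, String group,
                           int perms, int requested) {
        if (user.equals(owner)) {
            return implies((perms >> 6) & 7, requested); // owner bits
        } else if (groups != null && Arrays.asList(groups).contains(group)) {
            return implies((perms >> 3) & 7, requested); // group bits
        } else {
            return implies(perms & 7, requested);        // other bits
        }
    }

    public static void main(String[] args) {
        // Owner "hive" may read (4) a 0750 file; outsider "bob" may not.
        System.out.println(allowed("hive", new String[]{"hadoop"},
                "hive", "hadoop", 0750, 4)); // true
        System.out.println(allowed("bob", new String[]{"users"},
                "hive", "hadoop", 0750, 4)); // false
    }
}
```

Note that a client-side walk like this cannot see server-side rules (ACLs, super-user/super-group bypass), which is exactly why delegating to the NameNode via `access()` is the safer check.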



##########
standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/utils/HdfsUtils.java:
##########
@@ -65,57 +64,35 @@ public class HdfsUtils {
   /**
    * Check the permissions on a file.
    * @param fs Filesystem the file is contained in
-   * @param stat Stat info for the file
+   * @param path the file path
    * @param action action to be performed
    * @throws IOException If thrown by Hadoop
-   * @throws AccessControlException if the file cannot be accessed
    */
-  public static void checkFileAccess(FileSystem fs, FileStatus stat, FsAction action)
-      throws IOException, LoginException {
-    checkFileAccess(fs, stat, action, SecurityUtils.getUGI());
+  public static void checkFileAccess(FileSystem fs, Path path, FsAction action)
+      throws IOException {
+    checkFileAccess(fs, path, action, SecurityUtils.getUGI());
   }
 
   /**
    * Check the permissions on a file
    * @param fs Filesystem the file is contained in
-   * @param stat Stat info for the file
+   * @param path the file path
    * @param action action to be performed
   * @param ugi user group info for the current user.  This is passed in so that tests can pass
    *            in mock ones.
    * @throws IOException If thrown by Hadoop
-   * @throws AccessControlException if the file cannot be accessed
    */
   @VisibleForTesting
-  static void checkFileAccess(FileSystem fs, FileStatus stat, FsAction action,
+  static void checkFileAccess(FileSystem fs, Path path, FsAction action,
                               UserGroupInformation ugi) throws IOException {
-
-    String user = ugi.getShortUserName();
-    String[] groups = ugi.getGroupNames();
-
-    if (groups != null) {
-      String superGroupName = fs.getConf().get("dfs.permissions.supergroup", "");
-      if (arrayContains(groups, superGroupName)) {
-        LOG.debug("User \"" + user + "\" belongs to super-group \"" + superGroupName + "\". " +
-            "Permission granted for action: " + action + ".");
-        return;
-      }
-    }
-
-    FsPermission dirPerms = stat.getPermission();
-
-    if (user.equals(stat.getOwner())) {
-      if (dirPerms.getUserAction().implies(action)) {
-        return;
-      }
-    } else if (arrayContains(groups, stat.getGroup())) {
-      if (dirPerms.getGroupAction().implies(action)) {
-        return;
-      }
-    } else if (dirPerms.getOtherAction().implies(action)) {
-      return;
+    try {
+      ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
+        fs.access(path, action);
+        return null;
+      });
+    } catch (InterruptedException e) {
+      throw new IOException(e);

Review Comment:
   `AccessControlException` will be thrown directly, and `InterruptedException` is usually not related to permission.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

