[ https://issues.apache.org/jira/browse/HDFS-15079?focusedWorklogId=791851&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-791851 ]
ASF GitHub Bot logged work on HDFS-15079:
-----------------------------------------
                Author: ASF GitHub Bot
            Created on: 18/Jul/22 05:40
            Start Date: 18/Jul/22 05:40
    Worklog Time Spent: 10m
      Work Description: ferhui commented on code in PR #4530:
URL: https://github.com/apache/hadoop/pull/4530#discussion_r922982928


##########
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java:
##########

@@ -495,6 +498,94 @@ public static NameNodeMetrics getNameNodeMetrics() {
     return metrics;
   }
 
+  /**
+   * Try to obtain the actual client info according to the current user.
+   * @param ipProxyUsers Users who can override client infos
+   */
+  private static String clientInfoFromContext(
+      final String[] ipProxyUsers) {
+    if (ipProxyUsers != null) {
+      UserGroupInformation user =
+          UserGroupInformation.getRealUserOrSelf(Server.getRemoteUser());
+      if (user != null &&
+          ArrayUtils.contains(ipProxyUsers, user.getShortUserName())) {
+        CallerContext context = CallerContext.getCurrent();
+        if (context != null && context.isContextValid()) {
+          return context.getContext();
+        }
+      }
+    }
+    return null;
+  }
+
+  /**
+   * Try to obtain the value corresponding to the key by parsing the content.
+   * @param content the full content to be parsed.
+   * @param key trying to obtain the value of the key.
+   * @return the value corresponding to the key.
+   */
+  @VisibleForTesting
+  public static String parseSpecialValue(String content, String key) {
+    int posn = content.indexOf(key);
+    if (posn != -1) {
+      posn += key.length();
+      int end = content.indexOf(",", posn);
+      return end == -1 ? content.substring(posn)
+          : content.substring(posn, end);
+    }
+    return null;
+  }
+
+  /**
+   * Try to obtain the actual client's machine according to the current user.
+   * @param ipProxyUsers Users who can override client infos.
+   * @return The actual client's machine.
+   */
+  public static String getClientMachine(final String[] ipProxyUsers) {

Review Comment:
   One question here.
   Where do we use this method? I didn't find a caller.

Issue Time Tracking
-------------------
            Worklog Id:     (was: 791851)
            Time Spent: 1h 50m  (was: 1h 40m)

> RBF: Client maybe get an unexpected result with network anomaly
> ---------------------------------------------------------------
>
>                 Key: HDFS-15079
>                 URL: https://issues.apache.org/jira/browse/HDFS-15079
>             Project: Hadoop HDFS
>          Issue Type: Sub-task
>          Components: rbf
>    Affects Versions: 3.3.0
>            Reporter: Hui Fei
>            Priority: Critical
>              Labels: pull-request-available
>         Attachments: HDFS-15079.001.patch, HDFS-15079.002.patch, UnexpectedOverWriteUT.patch
>
>          Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> I found a critical problem in RBF. HDFS-15078 can resolve it in some
> scenarios, but I have no idea about an overall resolution.
> The problem is the following:
> A client using RBF (routers r0, r1) creates an HDFS file via r0, gets an
> exception, and fails over to r1, but r0 has already sent the create RPC to
> the namenode (1st create).
> The client then creates the HDFS file via r1 (2nd create).
> The client writes the HDFS file and finally closes it (3rd close).
> The namenode may receive the RPCs in the following order:
> 2nd create
> 3rd close
> 1st create
> Since overwrite is true by default, the file that has just been written
> would be overwritten with an empty file. This is a critical problem.
> We have encountered this problem in practice: many Hive and Spark jobs run
> on our cluster, and it sometimes occurs.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-issues-h...@hadoop.apache.org
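As an aside for readers following the diff above: the `parseSpecialValue` helper can be exercised on its own. The sketch below copies the patch's logic verbatim into a standalone class; the class name and the sample caller-context string (`clientIp:.../clientPort:...` key format) are illustrative assumptions, not taken from the patch.

```java
// Standalone sketch of the parseSpecialValue logic from the patch above.
// Given a comma-separated caller context such as
// "clientIp:10.0.0.1,clientPort:8020", it returns the text that follows
// a key, up to the next comma (or end of string), or null if absent.
public class ParseSpecialValueSketch {

  public static String parseSpecialValue(String content, String key) {
    int posn = content.indexOf(key);
    if (posn != -1) {
      posn += key.length();                       // skip past the key itself
      int end = content.indexOf(",", posn);       // value ends at next comma
      return end == -1 ? content.substring(posn)  // key is the last entry
          : content.substring(posn, end);
    }
    return null;                                  // key not present
  }

  public static void main(String[] args) {
    String ctx = "clientIp:10.0.0.1,clientPort:8020";
    System.out.println(parseSpecialValue(ctx, "clientIp:"));   // 10.0.0.1
    System.out.println(parseSpecialValue(ctx, "clientPort:")); // 8020
    System.out.println(parseSpecialValue(ctx, "missing:"));    // null
  }
}
```

Note that the lookup is a plain substring search, so a key that happens to appear inside another entry's value would match there first; callers presumably rely on the keys (including their trailing separator) being unambiguous within the context string.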