Apache Hadoop qbt Report: branch-3.2+JDK8 on Linux/x86_64

2022-05-25 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/

[May 18, 2022 12:04:51 PM] (Szilard Nemeth) YARN-10850. TimelineService v2 
lists containers for all attempts when filtering for one. Contributed by 
Benjamin Teke
[May 18, 2022 12:23:56 PM] (Szilard Nemeth) YARN-11126. ZKConfigurationStore 
Java deserialisation vulnerability. Contributed by Tamas Domok
[May 19, 2022 3:33:23 AM] (Akira Ajisaka) YARN-10080. Support show app id on 
localizer thread pool (#4283)
[May 24, 2022 5:15:35 AM] (Masatake Iwasaki) YARN-11162. Set the zk acl for 
nodes created by ZKConfigurationStore. (#4350)
[May 25, 2022 8:33:04 AM] (Akira Ajisaka) HADOOP-18240. Upgrade Yetus to 0.14.0 
(#4328)




-1 overall


The following subsystems voted -1:
asflicense blanks hadolint pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 

Failed junit tests :

   hadoop.hdfs.TestRollingUpgrade 
   
hadoop.yarn.server.resourcemanager.reservation.TestCapacityOverTimePolicy 
   hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 
   hadoop.mapred.uploader.TestFrameworkUploader 
  

   cc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-compile-cc-root.txt
 [48K]

   javac:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-compile-javac-root.txt
 [332K]

   blanks:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/blanks-eol.txt
 [13M]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/blanks-tabs.txt
 [2.0M]

   checkstyle:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-checkstyle-root.txt
 [14M]

   hadolint:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-hadolint.txt
 [8.0K]

   pathlen:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-pathlen.txt
 [16K]

   pylint:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-pylint.txt
 [148K]

   shellcheck:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-shellcheck.txt
 [20K]

   xml:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/xml.txt
 [16K]

   javadoc:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-javadoc-javadoc-root.txt
 [1.7M]

   unit:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
 [1.1M]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
 [144K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-tests.txt
 [16K]
  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-uploader.txt
 [12K]

   asflicense:

  
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-3.2-java8-linux-x86_64/48/artifact/out/results-asflicense.txt
 [4.0K]

Powered by Apache Yetus 0.14.0-SNAPSHOT   https://yetus.apache.org

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org

Apache Hadoop qbt Report: trunk+JDK11 on Linux/x86_64

2022-05-25 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/308/

[May 23, 2022 3:45:08 PM] (noreply) HDFS-16586. Purge FsDatasetAsyncDiskService 
threadgroup; it causes BP… (#4338)
[May 24, 2022 5:07:19 AM] (noreply) YARN-11162. Set the zk acl for nodes 
created by ZKConfigurationStore. (#4350)
[May 24, 2022 5:41:10 PM] (noreply) HADOOP-18249. Fix getUri() in HttpRequest 
has been deprecated. (#4335)
[May 25, 2022 12:21:04 AM] (noreply) YARN-11137. Improve log message in 
FederationClientInterceptor (#4336)




-1 overall


The following subsystems voted -1:
blanks pathlen spotbugs unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/fair-scheduler-invalid.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/yarn-site-with-invalid-allocation-file-ref.xml
 

spotbugs :

   module:hadoop-hdfs-project/hadoop-hdfs 
   Redundant nullcheck of oldLock, which is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.DataStorage.isPreUpgradableLayout(Storage$StorageDirectory)
 Redundant null check at DataStorage.java:is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.DataStorage.isPreUpgradableLayout(Storage$StorageDirectory)
 Redundant null check at DataStorage.java:[line 695] 
   Redundant nullcheck of metaChannel, which is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.MappableBlockLoader.verifyChecksum(long,
 FileInputStream, FileChannel, String) Redundant null check at 
MappableBlockLoader.java:is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.MappableBlockLoader.verifyChecksum(long,
 FileInputStream, FileChannel, String) Redundant null check at 
MappableBlockLoader.java:[line 138] 
   Redundant nullcheck of blockChannel, which is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.MemoryMappableBlockLoader.load(long,
 FileInputStream, FileInputStream, String, ExtendedBlockId) Redundant null 
check at MemoryMappableBlockLoader.java:is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.MemoryMappableBlockLoader.load(long,
 FileInputStream, FileInputStream, String, ExtendedBlockId) Redundant null 
check at MemoryMappableBlockLoader.java:[line 75] 
   Redundant nullcheck of blockChannel, which is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.NativePmemMappableBlockLoader.load(long,
 FileInputStream, FileInputStream, String, ExtendedBlockId) Redundant null 
check at NativePmemMappableBlockLoader.java:is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.NativePmemMappableBlockLoader.load(long,
 FileInputStream, FileInputStream, String, ExtendedBlockId) Redundant null 
check at NativePmemMappableBlockLoader.java:[line 85] 
   Redundant nullcheck of metaChannel, which is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.NativePmemMappableBlockLoader.verifyChecksumAndMapBlock(NativeIO$POSIX$PmemMappedRegion,
 long, FileInputStream, FileChannel, String) Redundant null check at 
NativePmemMappableBlockLoader.java:is known to be non-null in 
org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.NativePmemMappableBlockLoader.verifyChecksumAndMapBlock(NativeIO$POSIX$PmemMappedRegion,
 long, FileInputStream, FileChannel, String) Redundant null check at 
NativePmemMappableBlockLoader.java:[line 130] 
   
org.apache.hadoop.hdfs.server.namenode.top.window.RollingWindowManager$UserCounts
 doesn't override java.util.ArrayList.equals(Object) At 
RollingWindowManager.java:At RollingWindowManager.java:[line 1] 

spotbugs :

   module:hadoop-yarn-project/hadoop-yarn 
   Redundant nullcheck of it, which is known to be non-null in 
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.recoverTrackerResources(LocalResourcesTracker,
 NMStateStoreService$LocalResourceTrackerState) Redundant null check at 

[jira] [Created] (HADOOP-18261) Integrating the code into cloudstore

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18261:
--

 Summary: Integrating the code into cloudstore
 Key: HADOOP-18261
 URL: https://issues.apache.org/jira/browse/HADOOP-18261
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: fs/s3
Reporter: Sravani Gadey


Converting the key-value pairs into different file formats such as CSV, Avro, 
and Parquet, so that the logs become readable and the required operations can 
be performed easily.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18262) Visualizing the audit logs

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18262:
--

 Summary: Visualizing the audit logs
 Key: HADOOP-18262
 URL: https://issues.apache.org/jira/browse/HADOOP-18262
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: fs/s3
Reporter: Sravani Gadey


Visualizing the audit logs with graphs in a Zeppelin or Jupyter notebook, which 
helps us analyze them better.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18260) Converting key-value pairs into different file formats

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18260:
--

 Summary: Converting key-value pairs into different file formats
 Key: HADOOP-18260
 URL: https://issues.apache.org/jira/browse/HADOOP-18260
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: fs/s3
Reporter: Sravani Gadey


Converting the key-value pairs into different file formats such as CSV, Avro, 
and Parquet, so that the logs become readable and the required operations can 
be performed easily.
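
As a rough illustration of this subtask (not code from cloudstore or Hadoop; the 
class and method names below are made up for the example), a minimal writer that 
turns already-parsed key-value pairs into a CSV file could look like this:
{code:java}
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/** Writes parsed audit-log key-value pairs to a CSV file (hypothetical helper). */
public class AuditLogCsvWriter {

  /** Each map is one audit log entry; its keys become the CSV columns. */
  public static void writeCsv(List<Map<String, String>> entries, Path out) throws IOException {
    // Collect the union of keys so every entry fits the same header row.
    Set<String> columns = new LinkedHashSet<>();
    entries.forEach(e -> columns.addAll(e.keySet()));

    try (PrintWriter w = new PrintWriter(Files.newBufferedWriter(out))) {
      w.println(String.join(",", columns));
      for (Map<String, String> entry : entries) {
        StringBuilder row = new StringBuilder();
        for (String col : columns) {
          if (row.length() > 0) {
            row.append(',');
          }
          // Quote values so embedded commas do not break the CSV layout.
          row.append('"').append(entry.getOrDefault(col, "").replace("\"", "\"\"")).append('"');
        }
        w.println(row);
      }
    }
  }
}
{code}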



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18259) Parsing the S3A audit logs into key-value pairs

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18259:
--

 Summary: Parsing the S3A audit logs into key-value pairs
 Key: HADOOP-18259
 URL: https://issues.apache.org/jira/browse/HADOOP-18259
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: fs/s3
Reporter: Sravani Gadey


Parsing the audit logs using regular expressions, i.e., dividing them into 
key-value pairs.
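
A minimal sketch of what such regex-based parsing could look like, assuming the 
query part of the referrer header has already been extracted; the pattern, class 
name, and sample keys (op, p1, pr, ts) are illustrative, not the actual 
implementation:
{code:java}
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Splits a referrer-header query string into key-value pairs (illustrative only). */
public class AuditLogParser {

  // Matches key=value tokens; a value runs until the next '&' or the end of the string.
  private static final Pattern PAIR = Pattern.compile("([a-zA-Z0-9_]+)=([^&]*)");

  public static Map<String, String> parse(String query) {
    Map<String, String> pairs = new LinkedHashMap<>();
    Matcher m = PAIR.matcher(query);
    while (m.find()) {
      pairs.put(m.group(1), m.group(2));
    }
    return pairs;
  }

  public static void main(String[] args) {
    String sample = "op=op_mkdirs&p1=/test/hadoop&pr=stevel&ts=1645621486477";
    System.out.println(parse(sample));
    // {op=op_mkdirs, p1=/test/hadoop, pr=stevel, ts=1645621486477}
  }
}
{code}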



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18258) Merging of S3A Audit Logs

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18258:
--

 Summary: Merging of S3A Audit Logs
 Key: HADOOP-18258
 URL: https://issues.apache.org/jira/browse/HADOOP-18258
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: fs/s3
Reporter: Sravani Gadey


Merging audit log files that contain a huge number of audit logs collected from 
a job, such as a Hive or Spark job, issuing various S3 requests like list, head, 
get, and put.
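
A simple sketch of the merge step, assuming the per-task audit log files are 
plain text and available locally; the class name is hypothetical and this is not 
the proposed implementation:
{code:java}
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

/** Concatenates per-task audit log files into a single file for later parsing. */
public class AuditLogMerger {

  public static void merge(List<Path> logFiles, Path merged) throws IOException {
    try (BufferedWriter out = Files.newBufferedWriter(merged)) {
      for (Path file : logFiles) {
        // Stream each file line by line so large logs are not loaded into memory at once.
        try (Stream<String> lines = Files.lines(file)) {
          for (String line : (Iterable<String>) lines::iterator) {
            out.write(line);
            out.newLine();
          }
        }
      }
    }
  }
}
{code}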



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18257) Analyzing S3A Audit Logs

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18257:
--

 Summary: Analyzing S3A Audit Logs 
 Key: HADOOP-18257
 URL: https://issues.apache.org/jira/browse/HADOOP-18257
 Project: Hadoop Common
  Issue Type: Task
  Components: fs/s3
Reporter: Sravani Gadey


The main aim is to analyze S3A audit logs to give better insights into Hive and 
Spark jobs.
Steps involved are:
 * Merging audit log files containing a huge number of audit logs collected from 
a job issuing various S3 requests.
 * Parsing the audit logs using regular expressions, i.e., dividing them into 
key-value pairs.
 * Converting the key-value pairs into CSV and Avro file formats.
 * Querying the data, which would give better insights for different jobs (see 
the sketch after this list).
 * Visualizing the audit logs with graphs in a Zeppelin or Jupyter notebook.
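
As an example of the querying step, a small aggregation over the parsed entries 
might look like the following; the class name and the assumption that the 
operation name is stored under an "op" key are illustrative only:
{code:java}
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/** Counts how many audit entries each S3 operation produced (illustrative query). */
public class AuditLogQuery {

  /** entries: parsed key-value maps; "op" is assumed to hold the operation name. */
  public static Map<String, Long> countByOperation(List<Map<String, String>> entries) {
    return entries.stream()
        .collect(Collectors.groupingBy(
            e -> e.getOrDefault("op", "unknown"),
            Collectors.counting()));
  }
}
{code}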



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Created] (HADOOP-18256) S3A audit log referrer header path contains '&'

2022-05-25 Thread Sravani Gadey (Jira)
Sravani Gadey created HADOOP-18256:
--

 Summary: S3A audit log referrer header path contains '&'
 Key: HADOOP-18256
 URL: https://issues.apache.org/jira/browse/HADOOP-18256
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs/s3
Reporter: Sravani Gadey


{code:java}
"https://audit.example.org/hadoop/1/op_mkdirs/028e8604-c244-44be-9c5f-dd070c3000d6-0180/?op=op_mkdirs=Users/stevel/Projects/hadoop-trunk/hadoop-tools/hadoop-aws/target/test-dir/6/testContextURI/test/hadoop/&*%23$%23$@234=stevel=b73b8f7d-9a44-4259-94c2-da1dd28a7706=028e8604-c244-44be-9c5f-dd070c3000d6-0180=1=028e8604-c244-44be-9c5f-dd070c3000d6=1=1645621486477;
 {code}
Here 'p1' should not contain '&', but in some cases the value is not escaped in 
the referrer header.
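
For reference, percent-encoding the path value before it is placed in the query 
string keeps '&' from terminating the parameter early. The snippet below is only 
a generic JDK illustration, not how S3A actually builds the referrer header:
{code:java}
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

/** Shows how percent-encoding a path value keeps '&' out of the raw query string. */
public class ReferrerHeaderEscaping {

  public static void main(String[] args) throws UnsupportedEncodingException {
    String path = "/test/hadoop/&*#$#$@234";
    // URLEncoder escapes '&' as %26, so the value no longer ends the p1 parameter early.
    String encoded = URLEncoder.encode(path, StandardCharsets.UTF_8.name());
    System.out.println("p1=" + encoded);
    // p1=%2Ftest%2Fhadoop%2F%26*%23%24%23%24%40234
  }
}
{code}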



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



Apache Hadoop qbt Report: branch-2.10+JDK7 on Linux/x86_64

2022-05-25 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/

[May 24, 2022 2:36:20 AM] (Masatake Iwasaki) HADOOP-15261. Upgrade commons-io 
from 2.4 to 2.5. Contributed by PandaMonkey.
[May 24, 2022 4:18:56 AM] (Masatake Iwasaki) YARN-11126. ZKConfigurationStore 
Java deserialisation vulnerability. Contributed by Tamas Domok
[May 24, 2022 4:28:02 AM] (Masatake Iwasaki) Revert "YARN-11126. 
ZKConfigurationStore Java deserialisation vulnerability. Contributed by Tamas 
Domok"
[May 24, 2022 4:53:34 AM] (Masatake Iwasaki) YARN-11126. ZKConfigurationStore 
Java deserialisation vulnerability. Contributed by Tamas Domok
[May 24, 2022 5:32:52 AM] (Masatake Iwasaki) YARN-11162. Set the zk acl for 
nodes created by ZKConfigurationStore. (#4350)




-1 overall


The following subsystems voted -1:
asflicense hadolint mvnsite pathlen unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

Failed junit tests :

   hadoop.fs.TestFileUtil 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   
hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.hdfs.server.federation.router.TestRouterNamenodeHeartbeat 
   hadoop.hdfs.server.federation.router.TestRouterQuota 
   hadoop.hdfs.server.federation.resolver.TestMultipleDestinationResolver 
   hadoop.hdfs.server.federation.resolver.order.TestLocalResolver 
   hadoop.yarn.server.resourcemanager.TestClientRMService 
   
hadoop.yarn.server.resourcemanager.monitor.invariants.TestMetricsInvariantChecker
 
   hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter 
   hadoop.mapreduce.lib.input.TestLineRecordReader 
   hadoop.mapred.TestLineRecordReader 
   hadoop.mapreduce.v2.app.TestRuntimeEstimators 
   hadoop.tools.TestDistCpSystem 
   hadoop.yarn.sls.TestSLSRunner 
   hadoop.resourceestimator.solver.impl.TestLpSolver 
   hadoop.resourceestimator.service.TestResourceEstimatorService 
  

   cc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/diff-compile-javac-root.txt
  [488K]

   checkstyle:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/diff-checkstyle-root.txt
  [14M]

   hadolint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   mvnsite:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-mvnsite-root.txt
  [568K]

   pathlen:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/diff-patch-pylint.txt
  [20K]

   shellcheck:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/diff-patch-shellcheck.txt
  [72K]

   whitespace:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/whitespace-eol.txt
  [12M]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/whitespace-tabs.txt
  [1.3M]

   javadoc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-javadoc-root.txt
  [40K]

   unit:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [220K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [428K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt
  [16K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt
  [36K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt
  [20K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/672/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-resourcemanager.txt
  [124K]
   

[jira] [Resolved] (HADOOP-18240) Upgrade Yetus to 0.14.0

2022-05-25 Thread Akira Ajisaka (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-18240?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Akira Ajisaka resolved HADOOP-18240.

Fix Version/s: 3.4.0
   3.2.4
   3.3.4
   2.10.3
   Resolution: Fixed

Committed to trunk, branch-3.3, branch-3.2, and branch-2.10.

> Upgrade Yetus to 0.14.0
> ---
>
> Key: HADOOP-18240
> URL: https://issues.apache.org/jira/browse/HADOOP-18240
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Akira Ajisaka
>Assignee: Ashutosh Gupta
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0, 3.2.4, 3.3.4, 2.10.3
>
>  Time Spent: 1h
>  Remaining Estimate: 0h
>
> Yetus 0.14.0 has been released. Let's upgrade.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-17591) Fix the wrong CIDR range example in Proxy User documentation

2022-05-25 Thread Akira Ajisaka (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17591?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Akira Ajisaka resolved HADOOP-17591.

Resolution: Duplicate

Fixed by HADOOP-17952. Closing.

> Fix the wrong CIDR range example in Proxy User documentation
> 
>
> Key: HADOOP-17591
> URL: https://issues.apache.org/jira/browse/HADOOP-17591
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: documentation
>Affects Versions: 3.2.2
>Reporter: Kwangsun Noh
>Priority: Trivial
>  Labels: newbie
>
> The CIDR range example on the Proxy user description page is wrong.
>  
> In the Configurations section of the Proxy user page, CIDR 10.222.0.0/16 is 
> said to mean the range 10.222.0.0-15.
>  
> But that is not true: the CIDR format 10.222.0.0/16 means 10.222.0.0-10.222.255.255.
>  
> as-is : 10.222.0.0-15
> to-be : 10.222.0.0-10.222.255.255
>  
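
For reference, a small standalone check of the corrected range, using plain bit 
arithmetic (not tied to any Hadoop class):
{code:java}
/** Worked example: first and last IPv4 addresses covered by 10.222.0.0/16. */
public class CidrRangeExample {

  public static void main(String[] args) {
    int prefixLength = 16;
    int network = toInt(10, 222, 0, 0);
    // A /16 mask keeps the top 16 bits and clears the rest.
    int mask = prefixLength == 0 ? 0 : 0xFFFFFFFF << (32 - prefixLength);

    int first = network & mask;   // network address: 10.222.0.0
    int last = first | ~mask;     // broadcast address: 10.222.255.255

    System.out.println(toDotted(first) + " - " + toDotted(last));
    // prints: 10.222.0.0 - 10.222.255.255
  }

  private static int toInt(int a, int b, int c, int d) {
    return (a << 24) | (b << 16) | (c << 8) | d;
  }

  private static String toDotted(int ip) {
    return ((ip >> 24) & 0xFF) + "." + ((ip >> 16) & 0xFF) + "."
        + ((ip >> 8) & 0xFF) + "." + (ip & 0xFF);
  }
}
{code}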



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org