[jira] [Created] (HADOOP-16815) Fix help message for yarn rmadmin

2020-01-19 Thread Xieming Li (Jira)
Xieming Li created HADOOP-16815:
---

 Summary: Fix help message for yarn rmadmin
 Key: HADOOP-16815
 URL: https://issues.apache.org/jira/browse/HADOOP-16815
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Xieming Li
Assignee: Xieming Li


(This issue is identified by [~aajisaka] in 
https://issues.apache.org/jira/browse/HADOOP-16753)

The help message of yarn rmadmin seems broken: "yarn rmadmin -help <cmd>" prints the 
full usage message on stderr instead of just the usage for the requested command.

Current:

{code:java}
$ yarn rmadmin -help refreshNodes 2>/dev/null
$
$ yarn rmadmin -help refreshNodes
Usage: yarn rmadmin
   -refreshQueues
   -refreshNodes [-g|graceful [timeout in seconds] -client|server]
   -refreshNodesResources
   -refreshSuperUserGroupsConfiguration
   -refreshUserToGroupsMappings
   -refreshAdminAcls
   -refreshServiceAcl
   -getGroups [username]
   -addToClusterNodeLabels 
<"label1(exclusive=true),label2(exclusive=false),label3">
   -removeFromClusterNodeLabels  (label splitted by ",")
   -replaceLabelsOnNode <"node1[:port]=label1,label2 
node2[:port]=label1,label2"> [-failOnUnknownNodes]
   -directlyAccessNodeLabelStore
   -refreshClusterMaxPriority
   -updateNodeResource [NodeID] [MemSize] [vCores] ([OvercommitTimeout])
or
[NodeID] [resourcetypes] ([OvercommitTimeout]).
   -help [cmd]
Generic options supported are:
-conf specify an application configuration file
-D define a value for a given property
-fs  specify default filesystem URL to use, 
overrides 'fs.defaultFS' property from configurations.
-jt   specify a ResourceManager
-files specify a comma-separated list of files to be 
copied to the map reduce cluster
-libjars specify a comma-separated list of jar files 
to be included in the classpath
-archives   specify a comma-separated list of archives to 
be unarchived on the compute machines
The general command line syntax is:
command [genericOptions] [commandOptions]
{code}
Expected:

{code:java}
$ yarn rmadmin -help refreshNodes 2>/dev/null
 -refreshNodes [-g|graceful [timeout in seconds] -client|server]

$ yarn rmadmin -help refreshNodes
 -refreshNodes [-g|graceful [timeout in seconds] -client|server]
{code}
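
One possible shape of a fix (a hypothetical sketch only: the class, map, and method names below are invented, not the actual RMAdminCLI code) is to keep the per-command usage strings keyed by command and print only the matching entry when -help names a known command.
{code:java}
// Hypothetical sketch of per-command help filtering; not the actual
// RMAdminCLI implementation, just the general lookup idea.
import java.util.LinkedHashMap;
import java.util.Map;

public class RmAdminHelpSketch {
  private static final Map<String, String> USAGE = new LinkedHashMap<>();
  static {
    USAGE.put("-refreshQueues", "   -refreshQueues");
    USAGE.put("-refreshNodes",
        "   -refreshNodes [-g|graceful [timeout in seconds] -client|server]");
    // ...the remaining admin commands would be registered the same way
  }

  /** Print only the matching usage line for a known command, else the full usage. */
  static void printHelp(String cmd) {
    String key = cmd.startsWith("-") ? cmd : "-" + cmd;
    String usage = USAGE.get(key);
    if (usage != null) {
      System.out.println(usage);
    } else {
      System.out.println("Usage: yarn rmadmin");
      USAGE.values().forEach(System.out::println);
    }
  }

  public static void main(String[] args) {
    // e.g. args = {"-help", "refreshNodes"} -> prints only the -refreshNodes line
    printHelp(args.length > 1 ? args[1] : "");
  }
}
{code}
With such a lookup, both invocations above would print only the -refreshNodes line, as in the expected output.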






[jira] [Created] (HADOOP-16814) Add dropped connections metric for Server

2020-01-19 Thread Fei Hui (Jira)
Fei Hui created HADOOP-16814:


 Summary: Add dropped connections metric for Server
 Key: HADOOP-16814
 URL: https://issues.apache.org/jira/browse/HADOOP-16814
 Project: Hadoop Common
  Issue Type: Test
  Components: common
Affects Versions: 3.3.0
Reporter: Fei Hui
Assignee: Fei Hui


With this metric we can see the number of handled RPCs whose responses weren't sent 
to clients because the connection was dropped.
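
For illustration, a minimal sketch of how such a counter could be exposed through the metrics2 API; the class name, metric name, and where it gets incremented are assumptions, not the actual ipc.Server change:
{code:java}
// Illustrative only: a standalone metrics2 source with a dropped-connection
// counter. The real change would likely live in the existing RPC metrics.
import org.apache.hadoop.metrics2.annotation.Metric;
import org.apache.hadoop.metrics2.annotation.Metrics;
import org.apache.hadoop.metrics2.lib.DefaultMetricsSystem;
import org.apache.hadoop.metrics2.lib.MutableCounterLong;

@Metrics(about = "Dropped connection metrics (sketch)", context = "rpc")
public class DroppedConnectionMetricsSketch {

  @Metric("Handled RPCs whose responses could not be delivered to the client")
  MutableCounterLong numDroppedConnections;

  public static DroppedConnectionMetricsSketch create() {
    return DefaultMetricsSystem.instance().register(
        "DroppedConnectionMetricsSketch",
        "Sketch of a dropped-connection counter",
        new DroppedConnectionMetricsSketch());
  }

  /** Would be called when a connection is closed before the reply is sent. */
  public void incrDroppedConnections() {
    numDroppedConnections.incr();
  }
}
{code}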






[jira] [Created] (HADOOP-16813) Unable to build 3.1.3 on Win 10 - missing native header files. Error (compile-ms-native-dll) on project hadoop-common

2020-01-19 Thread mark pettovello (Jira)
mark pettovello created HADOOP-16813:


 Summary: Unable to build 3.1.3 on Win 10 - missing native header 
files.  Error (compile-ms-native-dll) on project hadoop-common
 Key: HADOOP-16813
 URL: https://issues.apache.org/jira/browse/HADOOP-16813
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.1.3, 3.2.1
 Environment: VS 2019 (v142), x64, Windows 10
Reporter: mark pettovello


Maven build Error (compile-ms-native-dll) on project hadoop-common

mvn package -Pdist,native -DskipTests -Dmaven.javadoc.skip=true

Numerous native (JNI) header files are missing, such as:

org_apache_hadoop_io_compress_lz4_Lz4Compressor.h
org_apache_hadoop_io_compress_lz4_Lz4Decompressor.h
org_apache_hadoop_io_nativeio_NativeIO.h
org_apache_hadoop_io_nativeio_NativeIO_POSIX.h
org_apache_hadoop_security_JniBasedUnixGroupsMapping.h
org_apache_hadoop_util_NativeCrc32.h
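
For context: these headers are not checked into the source tree; they are generated during the build from the Java classes that declare native methods (via javah or javac -h, depending on the JDK). A hedged illustration with an invented class, not one of the Hadoop classes listed above:
{code:java}
// Illustration only: compiling a class with native methods via "javac -h <dir>"
// generates a JNI header such as org_example_NativeChecksum.h, which the
// native build then includes.
package org.example;

public class NativeChecksum {
  static {
    // the shared library built from the native sources
    System.loadLibrary("nativechecksum");
  }

  // The generated header declares Java_org_example_NativeChecksum_crc32(...)
  public static native int crc32(byte[] data, int offset, int length);
}
{code}
If these headers are absent when compile-ms-native-dll runs, the header-generation step most likely did not run, or its output directory is not on the native project's include path.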






[jira] [Created] (HADOOP-16812) Unable to build 3.1.3 winutils and libwinutils native x64 using VisualStudio 2019: winutils.sln, Platform Toolset v142, SDK v10.0, C++ LangStd: Default, and other default settings

2020-01-19 Thread mark pettovello (Jira)
mark pettovello created HADOOP-16812:


 Summary: Unable to build 3.1.3 winutils and libwinutils native x64 
using VisualStudio 2019: winutils.sln, Platform Toolset v142, SDK v10.0, C++ 
LangStd: Default, and other default settings
 Key: HADOOP-16812
 URL: https://issues.apache.org/jira/browse/HADOOP-16812
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.1.3, 3.2.1
Reporter: mark pettovello


libwinutils.c line 40 gives several build errors:

C2065 'L': undeclared identifier
E0065 expected a ';'
E0020 identifier "L" is undefined
C2099 initializer is not a constant
C2143 syntax error: missing ';' before 'string'






Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86

2020-01-19 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1385/

No changes




-1 overall


The following subsystems voted -1:
asflicense findbugs pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/fair-scheduler-invalid.xml
 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/yarn-site-with-invalid-allocation-file-ref.xml
 

FindBugs :

   
module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-mawo/hadoop-yarn-applications-mawo-core
 
   Class org.apache.hadoop.applications.mawo.server.common.TaskStatus 
implements Cloneable but does not define or use clone method At 
TaskStatus.java:does not define or use clone method At TaskStatus.java:[lines 
39-346] 
   Equals method for 
org.apache.hadoop.applications.mawo.server.worker.WorkerId assumes the argument 
is of type WorkerId At WorkerId.java:the argument is of type WorkerId At 
WorkerId.java:[line 114] 
   
org.apache.hadoop.applications.mawo.server.worker.WorkerId.equals(Object) does 
not check for null argument At WorkerId.java:null argument At 
WorkerId.java:[lines 114-115] 
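
The two WorkerId items above describe the usual equals() contract; the following is a generic sketch of the pattern FindBugs expects (an instanceof test covers both the type assumption and the null argument). The class and field names are invented; this is not the actual WorkerId code.
{code:java}
// Generic illustration of a FindBugs-clean equals()/hashCode() pair.
public final class WorkerIdLike {
  private final String hostname;

  public WorkerIdLike(String hostname) {
    this.hostname = hostname;
  }

  @Override
  public boolean equals(Object o) {
    if (this == o) {
      return true;
    }
    if (!(o instanceof WorkerIdLike)) {   // false for null, so no NPE
      return false;
    }
    WorkerIdLike other = (WorkerIdLike) o;
    return hostname.equals(other.hostname);
  }

  @Override
  public int hashCode() {
    return hostname.hashCode();
  }
}
{code}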

FindBugs :

   module:hadoop-cloud-storage-project/hadoop-cos 
   Redundant nullcheck of dir, which is known to be non-null in 
org.apache.hadoop.fs.cosn.BufferPool.createDir(String) Redundant null check at 
BufferPool.java:is known to be non-null in 
org.apache.hadoop.fs.cosn.BufferPool.createDir(String) Redundant null check at 
BufferPool.java:[line 66] 
   org.apache.hadoop.fs.cosn.CosNInputStream$ReadBuffer.getBuffer() may 
expose internal representation by returning CosNInputStream$ReadBuffer.buffer 
At CosNInputStream.java:by returning CosNInputStream$ReadBuffer.buffer At 
CosNInputStream.java:[line 87] 
   Found reliance on default encoding in 
org.apache.hadoop.fs.cosn.CosNativeFileSystemStore.storeFile(String, File, 
byte[]):in org.apache.hadoop.fs.cosn.CosNativeFileSystemStore.storeFile(String, 
File, byte[]): new String(byte[]) At CosNativeFileSystemStore.java:[line 199] 
   Found reliance on default encoding in 
org.apache.hadoop.fs.cosn.CosNativeFileSystemStore.storeFileWithRetry(String, 
InputStream, byte[], long):in 
org.apache.hadoop.fs.cosn.CosNativeFileSystemStore.storeFileWithRetry(String, 
InputStream, byte[], long): new String(byte[]) At 
CosNativeFileSystemStore.java:[line 178] 
   org.apache.hadoop.fs.cosn.CosNativeFileSystemStore.uploadPart(File, 
String, String, int) may fail to clean up java.io.InputStream Obligation to 
clean up resource created at CosNativeFileSystemStore.java:fail to clean up 
java.io.InputStream Obligation to clean up resource created at 
CosNativeFileSystemStore.java:[line 252] is not discharged 
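
Two of the hadoop-cos items (reliance on the default encoding, exposing an internal buffer) follow well-known patterns; below is a generic sketch of the kind of change FindBugs is asking for, with invented names, not the actual CosN code.
{code:java}
// Illustration only: explicit charset and defensive copy.
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class DefaultEncodingAndExposureSketch {

  /** "Reliance on default encoding": name the charset explicitly. */
  static String decode(byte[] raw) {
    return new String(raw, StandardCharsets.UTF_8);
  }

  private final byte[] buffer = new byte[4096];

  /** "May expose internal representation": return a copy, not the field itself. */
  public byte[] getBuffer() {
    return Arrays.copyOf(buffer, buffer.length);
  }
}
{code}
Whether a defensive copy is the right trade-off for a hot read path is a separate question; warnings like these are sometimes suppressed instead.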

Failed junit tests :

   hadoop.hdfs.server.namenode.ha.TestBootstrapAliasmap 
   hadoop.hdfs.TestReconstructStripedFile 
   hadoop.hdfs.server.namenode.TestRedudantBlocks 
   hadoop.hdfs.TestDeadNodeDetection 
   
hadoop.yarn.server.resourcemanager.reservation.TestCapacityOverTimePolicy 
   
hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageDomain 
   hadoop.yarn.server.timelineservice.storage.flow.TestHBaseStorageFlowRun 
   
hadoop.yarn.server.timelineservice.storage.flow.TestHBaseStorageFlowActivity 
   hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageApps 
   hadoop.yarn.server.timelineservice.storage.TestTimelineWriterHBaseDown 
   
hadoop.yarn.server.timelineservice.storage.flow.TestHBaseStorageFlowRunCompaction
 
   
hadoop.yarn.server.timelineservice.reader.TestTimelineReaderWebServicesHBaseStorage
 
   
hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageSchema 
   
hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageEntities 
   hadoop.yarn.server.timelineservice.storage.TestTimelineReaderHBaseDown 
   hadoop.yarn.applications.distributedshell.TestDistributedShell 
   

Apache Hadoop qbt Report: branch2.10+JDK7 on Linux/x86

2020-01-19 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/

No changes




-1 overall


The following subsystems voted -1:
asflicense findbugs hadolint pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/empty-configuration.xml
 
   hadoop-tools/hadoop-azure/src/config/checkstyle-suppressions.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/public/crossdomain.xml 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/src/main/webapp/public/crossdomain.xml
 

FindBugs :

   
module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client
 
   Boxed value is unboxed and then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result,
 byte[], byte[], KeyConverter, ValueConverter, boolean) At 
ColumnRWHelper.java:then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result,
 byte[], byte[], KeyConverter, ValueConverter, boolean) At 
ColumnRWHelper.java:[line 335] 

Failed junit tests :

   hadoop.hdfs.server.balancer.TestBalancerRPCDelay 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   hadoop.hdfs.TestSecureEncryptionZoneWithKMS 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.registry.secure.TestSecureLogins 
   hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 
  

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-compile-cc-root-jdk1.7.0_95.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-compile-javac-root-jdk1.7.0_95.txt
  [328K]

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-compile-cc-root-jdk1.8.0_232.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-compile-javac-root-jdk1.8.0_232.txt
  [308K]

   checkstyle:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-checkstyle-root.txt
  [16M]

   hadolint:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   pathlen:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-patch-pylint.txt
  [24K]

   shellcheck:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-patch-shellcheck.txt
  [56K]

   shelldocs:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-patch-shelldocs.txt
  [8.0K]

   whitespace:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/whitespace-eol.txt
  [12M]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/whitespace-tabs.txt
  [1.3M]

   xml:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/xml.txt
  [12K]

   findbugs:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-timelineservice-hbase_hadoop-yarn-server-timelineservice-hbase-client-warnings.html
  [8.0K]

   javadoc:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-javadoc-javadoc-root-jdk1.7.0_95.txt
  [16K]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/diff-javadoc-javadoc-root-jdk1.8.0_232.txt
  [1.1M]

   unit:

   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [244K]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt
  [12K]
   
https://builds.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86/571/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-jobclient.txt
  [96K]