Apache Hadoop qbt Report: branch2+JDK7 on Linux/x86

2019-09-21 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/

No changes




-1 overall


The following subsystems voted -1:
compile findbugs hadolint mvninstall mvnsite pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


Specific tests:

XML :

   Parsing Error(s):
      hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/empty-configuration.xml
      hadoop-tools/hadoop-azure/src/config/checkstyle-suppressions.xml
      hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/public/crossdomain.xml
      hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/src/main/webapp/public/crossdomain.xml

   mvninstall:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-mvninstall-root.txt  [332K]

   compile:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-compile-root-jdk1.7.0_95.txt  [176K]

   cc:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-compile-root-jdk1.7.0_95.txt  [176K]

   javac:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-compile-root-jdk1.7.0_95.txt  [176K]

   compile:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-compile-root-jdk1.8.0_222.txt  [112K]

   cc:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-compile-root-jdk1.8.0_222.txt  [112K]

   javac:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-compile-root-jdk1.8.0_222.txt  [112K]

   checkstyle:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out//testptch/patchprocess/maven-patch-checkstyle-root.txt  []

   hadolint:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/diff-patch-hadolint.txt  [4.0K]

   mvnsite:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/patch-mvnsite-root.txt  [64K]

   pathlen:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/pathlen.txt  [12K]

   pylint:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/diff-patch-pylint.txt  [24K]

   shellcheck:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/diff-patch-shellcheck.txt  [72K]

   shelldocs:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/diff-patch-shelldocs.txt  [8.0K]

   whitespace:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/whitespace-eol.txt  [12M]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/whitespace-tabs.txt  [1.3M]

   xml:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/xml.txt  [12K]

   findbugs:
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-common-project_hadoop-common.txt  [8.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-common-project_hadoop-kms.txt  [8.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-common-project_hadoop-nfs.txt  [4.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs.txt  [20K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-client.txt  [24K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt  [8.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-nfs.txt  [8.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-rbf.txt  [4.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt  [8.0K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-app.txt  [44K]
      https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/452/artifact/out/branch-findbugs-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-common.txt  [8.0K]

[jira] [Resolved] (HADOOP-16559) [HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-21 Thread Duo Zhang (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-16559?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Duo Zhang resolved HADOOP-16559.

  Assignee: (was: Duo Zhang)
Resolution: Duplicate

> [HDFS] use protobuf-maven-plugin to generate protobuf classes
> -
>
> Key: HADOOP-16559
> URL: https://issues.apache.org/jira/browse/HADOOP-16559
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Priority: Major
>
> Use "protoc-maven-plugin" to dynamically download the protobuf executable and
> generate protobuf classes from the proto files.
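For readers who have not used this approach: wiring protobuf code generation into the Maven build usually amounts to a pom.xml fragment along the following lines. This is only a hedged sketch, not the actual Hadoop patch; the plugin coordinates (org.xolstice.maven.plugins:protobuf-maven-plugin plus the kr.motd.maven:os-maven-plugin extension), the versions, and the ${protobuf.version} property are assumptions for illustration.

    <!-- Sketch: download a protoc matching ${protobuf.version} and generate
         Java classes from src/main/proto during the build. -->
    <build>
      <extensions>
        <!-- Detects OS and architecture so the matching protoc binary can be fetched. -->
        <extension>
          <groupId>kr.motd.maven</groupId>
          <artifactId>os-maven-plugin</artifactId>
          <version>1.6.2</version>
        </extension>
      </extensions>
      <plugins>
        <plugin>
          <groupId>org.xolstice.maven.plugins</groupId>
          <artifactId>protobuf-maven-plugin</artifactId>
          <version>0.6.1</version>
          <configuration>
            <!-- Pull the protoc executable from Maven Central instead of
                 requiring a locally installed protoc of a specific version. -->
            <protocArtifact>com.google.protobuf:protoc:${protobuf.version}:exe:${os.detected.classifier}</protocArtifact>
          </configuration>
          <executions>
            <execution>
              <goals>
                <goal>compile</goal>
                <goal>test-compile</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

The payoff described in the issue is exactly this: the build no longer depends on whichever protoc happens to be installed on the build machine.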






[jira] [Resolved] (HADOOP-16589) [pb-upgrade] Update docker image to make 3.7.1 protoc as default

2019-09-21 Thread Vinayakumar B (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-16589?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vinayakumar B resolved HADOOP-16589.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged to trunk.

> [pb-upgrade] Update docker image to make 3.7.1 protoc as default
> 
>
> Key: HADOOP-16589
> URL: https://issues.apache.org/jira/browse/HADOOP-16589
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> Right now, the docker image contains both protoc 2.5.0 and protoc 3.7.1.
> 2.5.0 is the default protoc on the PATH.
> After HADOOP-16557, the protoc version expected on the PATH is 3.7.1.
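The HADOOP-16557 change referenced here appears in the trunk+JDK8 qbt commit log later in this digest as "HADOOP-16557. [pb-upgrade] Upgrade protobuf.version to 3.7.1". At the POM level that upgrade is essentially a property bump; a minimal, illustrative sketch follows (the surrounding hadoop-project/pom.xml context is assumed, not quoted):

    <!-- Illustrative only: the protobuf.version property that the protoc
         found on the image's PATH has to match after HADOOP-16557. -->
    <properties>
      <protobuf.version>3.7.1</protobuf.version>  <!-- previously 2.5.0 -->
    </properties>

Hence the docker image change: protoc 3.7.1 has to come first on the PATH so the generated sources stay in sync with the protobuf runtime the build now uses.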






Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86

2019-09-21 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/

[Sep 20, 2019 4:55:34 AM] (tasanuma) HADOOP-16069. Support configure ZK_DTSM_ZK_KERBEROS_PRINCIPAL in
[Sep 20, 2019 5:32:19 AM] (aengineer) HDDS-2156. Fix alignment issues in HDDS doc pages
[Sep 20, 2019 5:43:00 AM] (aengineer) HDDS-2020. Remove mTLS from Ozone GRPC. Contributed by Xiaoyu Yao.
[Sep 20, 2019 10:38:30 AM] (github) HADOOP-16557. [pb-upgrade] Upgrade protobuf.version to 3.7.1 (#1432)
[Sep 20, 2019 4:55:48 PM] (xkrogen) HADOOP-16581. Revise ValueQueue to correctly replenish queues that go
[Sep 20, 2019 4:57:08 PM] (aengineer) HDDS-1949. Missing or error-prone test cleanup. Contributed by
[Sep 20, 2019 5:03:46 PM] (aengineer) HDDS-2001. Update Ratis version to 0.4.0.
[Sep 20, 2019 6:45:01 PM] (inigoiri) HDFS-14844. Make buffer of
[Sep 20, 2019 9:05:35 PM] (aengineer) HDDS-2157. checkstyle: print filenames relative to project root (#1485)
[Sep 20, 2019 9:22:55 PM] (aengineer) HDDS-2128. Make ozone sh command work with OM HA service ids (#1445)




-1 overall


The following subsystems voted -1:
asflicense findbugs hadolint pathlen shadedclient unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s):
      hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml
      hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml
      hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml
      hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml

FindBugs :

   
   module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-applications/hadoop-yarn-applications-mawo/hadoop-yarn-applications-mawo-core

   Class org.apache.hadoop.applications.mawo.server.common.TaskStatus implements Cloneable but does not define or use clone method At TaskStatus.java:does not define or use clone method At TaskStatus.java:[lines 39-346]

   Equals method for org.apache.hadoop.applications.mawo.server.worker.WorkerId assumes the argument is of type WorkerId At WorkerId.java:the argument is of type WorkerId At WorkerId.java:[line 114]

   org.apache.hadoop.applications.mawo.server.worker.WorkerId.equals(Object) does not check for null argument At WorkerId.java:null argument At WorkerId.java:[lines 114-115]

Failed junit tests :

   hadoop.hdfs.tools.TestDFSZKFailoverController
   hadoop.yarn.server.resourcemanager.scheduler.fair.TestFairSchedulerPreemption
   hadoop.yarn.server.timelineservice.storage.TestTimelineWriterHBaseDown
   hadoop.yarn.server.timelineservice.storage.TestTimelineReaderHBaseDown
   hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageApps
   hadoop.yarn.server.timelineservice.reader.TestTimelineReaderWebServicesHBaseStorage
   hadoop.yarn.server.timelineservice.storage.flow.TestHBaseStorageFlowActivity
   hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageDomain
   hadoop.yarn.server.timelineservice.storage.flow.TestHBaseStorageFlowRunCompaction
   hadoop.yarn.server.timelineservice.storage.flow.TestHBaseStorageFlowRun
   hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageEntities
   hadoop.yarn.server.timelineservice.storage.TestHBaseTimelineStorageSchema
   hadoop.yarn.sls.appmaster.TestAMSimulator
   hadoop.fs.adl.live.TestAdlSdkConfiguration

   cc:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/diff-compile-cc-root.txt  [4.0K]

   javac:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/diff-compile-javac-root.txt  [420K]

   checkstyle:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/diff-checkstyle-root.txt  [17M]

   hadolint:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/diff-patch-hadolint.txt  [8.0K]

   pathlen:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/pathlen.txt  [12K]

   pylint:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/diff-patch-pylint.txt  [220K]

   shellcheck:
      https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1266/artifact/out/diff-patch-shellcheck.txt  [24K]

   shelldocs:

Re: [DISCUSS] Separate Hadoop Core trunk and Hadoop Ozone trunk source tree

2019-09-21 Thread Wanqiang Ji
+1 (non-binding)

BR,
Wanqiang Ji

On Tue, Sep 17, 2019 at 5:48 PM Elek, Marton  wrote:

>
>
> TLDR; I propose to move Ozone related code out from Hadoop trunk and
> store it in a separated *Hadoop* git repository apache/hadoop-ozone.git
>
>
>
>
> When Ozone was adopted as a new Hadoop subproject it was proposed[1] to
> be part of the source tree but with a separate release cadence, mainly
> because it had hadoop-trunk/SNAPSHOT as a compile-time dependency.
>
> Over the last Ozone releases this dependency has been removed to provide
> more stable releases. Instead of using the latest trunk/SNAPSHOT build
> from Hadoop, Ozone uses the latest stable Hadoop (3.2.0 as of now).
>
> As we no longer have a strict dependency between Hadoop trunk SNAPSHOT
> and Ozone trunk, I propose to separate the two code bases from each other
> by creating a new Hadoop git repository (apache/hadoop-ozone.git).
>
> By moving Ozone to a separate git repository:
>
>   * It would be easier to contribute and understand the build (as of now
> we always need `-f pom.ozone.xml` as a Maven parameter).
>   * It would be possible to adjust the build process without breaking
> Hadoop/Ozone builds.
>   * It would be possible to use different README/.asf.yaml/GitHub
> templates for Hadoop Ozone and core Hadoop. (For example, the current
> GitHub template [2] links to the contribution guideline [3]; Ozone
> has an extended version [4] of this guideline with additional
> information.)
>   * Testing would be safer, as it would no longer be possible to change
> core Hadoop and Hadoop Ozone in the same patch.
>   * It would be easier to cut branches for Hadoop releases (based on the
> original consensus, Ozone should be removed from all the release
> branches after creating release branches from trunk).
>
>
> What do you think?
>
> Thanks,
> Marton
>
> [1]:
>
> https://lists.apache.org/thread.html/c85e5263dcc0ca1d13cbbe3bcfb53236784a39111b8c353f60582eb4@%3Chdfs-dev.hadoop.apache.org%3E
> [2]:
>
> https://github.com/apache/hadoop/blob/trunk/.github/pull_request_template.md
> [3]: https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
> [4]:
>
> https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute+to+Ozone
>


[jira] [Created] (HADOOP-16592) Build fails as can't retrieve websocket-server-impl

2019-09-21 Thread Jira
Erkin Alp Güney created HADOOP-16592:


 Summary: Build fails as can't retrieve websocket-server-impl
 Key: HADOOP-16592
 URL: https://issues.apache.org/jira/browse/HADOOP-16592
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Erkin Alp Güney


[ERROR] Failed to execute goal on project hadoop-yarn-server-nodemanager: Could 
not resolve dependencies for project 
org.apache.hadoop:hadoop-yarn-server-nodemanager:jar:3.3.0-SNAPSHOT: The 
following artifacts could not be resolved: 
org.eclipse.jetty.websocket:javax-websocket-server-impl:jar:9.3.27.v20190418, 
org.eclipse.jetty:jetty-annotations:jar:9.3.27.v20190418, 
org.eclipse.jetty:jetty-plus:jar:9.3.27.v20190418, 
org.eclipse.jetty:jetty-jndi:jar:9.3.27.v20190418, 
org.eclipse.jetty.websocket:javax-websocket-client-impl:jar:9.3.27.v20190418, 
org.eclipse.jetty.websocket:websocket-client:jar:9.3.27.v20190418, 
org.eclipse.jetty.websocket:websocket-server:jar:9.3.27.v20190418, 
org.eclipse.jetty.websocket:websocket-common:jar:9.3.27.v20190418, 
org.eclipse.jetty.websocket:websocket-api:jar:9.3.27.v20190418, 
org.eclipse.jetty.websocket:websocket-servlet:jar:9.3.27.v20190418: Could not 
transfer artifact 
org.eclipse.jetty.websocket:javax-websocket-server-impl:jar:9.3.27.v20190418 
from/to apache.snapshots.https 
(https://repository.apache.org/content/repositories/snapshots): 
repository.apache.org: Unknown host repository.apache.org -> [Help 1]
Again, the same as HADOOP-16577, but this time with websocket-server-impl.
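For context on the repository named in that error: the apache.snapshots.https id and URL correspond to a repository declaration of roughly the following shape (reconstructed here from the error text as a sketch; the exact definition in the Hadoop POMs may differ). Note that the failure itself, "Unknown host repository.apache.org", points to name resolution on the build host rather than to the declaration.

    <!-- Approximate shape of the repository referenced in the error output;
         the release/snapshot policies below are assumptions, not the real POM. -->
    <repositories>
      <repository>
        <id>apache.snapshots.https</id>
        <url>https://repository.apache.org/content/repositories/snapshots</url>
        <releases>
          <enabled>false</enabled>
        </releases>
        <snapshots>
          <enabled>true</enabled>
        </snapshots>
      </repository>
    </repositories>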







[jira] [Resolved] (HADOOP-16577) Build fails as can't retrieve websocket-servlet

2019-09-21 Thread Jira


 [ https://issues.apache.org/jira/browse/HADOOP-16577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Erkin Alp Güney resolved HADOOP-16577.
--
Resolution: Done

> Build fails as can't retrieve websocket-servlet
> ---
>
> Key: HADOOP-16577
> URL: https://issues.apache.org/jira/browse/HADOOP-16577
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.3.0
>Reporter: Erkin Alp Güney
>Priority: Blocker
>  Labels: dependencies
>
> I encountered this error when building Hadoop:
> Downloading: 
> https://repository.apache.org/content/repositories/snapshots/org/eclipse/jetty/websocket/websocket-server/9.3.27.v20190418/websocket-server-9.3.27.v20190418.jar
> Sep 15, 2019 7:54:39 AM 
> org.apache.maven.wagon.providers.http.httpclient.impl.execchain.RetryExec 
> execute
> INFO: I/O exception 
> (org.apache.maven.wagon.providers.http.httpclient.NoHttpResponseException) 
> caught when processing request to {s}->https://repository.apache.org:443: The 
> target server failed to respond
> Sep 15, 2019 7:54:39 AM 
> org.apache.maven.wagon.providers.http.httpclient.impl.execchain.RetryExec 
> execute


