Apache Hadoop qbt Report: trunk+JDK11 on Linux/x86_64

2022-11-27 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/403/

[Nov 26, 2022, 3:31:07 PM] (noreply) YARN-11381. Fix hadoop-yarn-common module 
Java Doc Errors. (#5153). Contributed by Shilun Fan.

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org

Re: Code coverage report on github PRs

2022-11-27 Thread Wilfred Spiegelenburg
Ayush,

For YuniKorn we have codecov working. We did have to make some minor
changes to get that done.
It is all GitHub Action driven [1]. The action is approved for Apache repos.

The first thing is that we asked INFRA to get the codecov app added to our
repos [2].
Besides that, getting codecov working for our Go code was pretty
straightforward. In the build for the repositories we generate code coverage
data using the standard Go tool and write it to a file. I am not sure how to
generate the test reports from Java, but codecov has some examples for
that.

The codecov action picks up the file, uploads it and gets it processed. To
see the PR code coverage changes we run this for PRs and commits to our
master branch [3]. Codecov automatically generates the reports based on
those two points.

Wilfred

[1] https://github.com/codecov/codecov-action
[2] https://issues.apache.org/jira/browse/INFRA-20641
[3]
https://github.com/apache/yunikorn-core/blob/master/.github/workflows/main.yml
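
The flow described above can be sketched roughly as follows (a minimal,
illustrative sketch only, not YuniKorn's actual config — the job name, Go
version, and coverage file name are assumptions; see [3] for the real
workflow):

```yaml
# Illustrative coverage workflow sketch; names and versions are assumptions.
name: coverage
on:
  push:
    branches: [master]   # coverage baseline from the main branch
  pull_request:          # coverage for PRs, so codecov can diff the two

jobs:
  coverage:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-go@v3
        with:
          go-version: '1.19'
      # Generate coverage data with the standard Go tooling.
      - run: go test -coverprofile=coverage.txt ./...
      # The codecov action picks up the file and uploads it.
      - uses: codecov/codecov-action@v3
        with:
          files: coverage.txt
```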


On Fri, 25 Nov 2022 at 00:44, Ayush Saxena  wrote:

> I tried to explore a bit more about codecov and tried to set it up in my
> local fork by allowing access and adding the GitHub Action in the yaml
> file (& lots of follow-up fixes). I think it also needs the tests to be
> executed as part of the GitHub workflow, which we don't do, or there is
> some catch I missed; I need to explore further.
>
> Reading the doc[1], if not via GitHub Action it requires some token (Step:
> 2). I quickly went through the archives and found a ticket regarding the
> same from the Spark folks in the past[2]; guess they couldn't get that sorted.
>
> Need to explore a bit more, or get pointers from folks who have more
> experience around this.
>
> -Ayush
>
> [1] https://docs.codecov.com/docs
> [2] https://issues.apache.org/jira/browse/INFRA-12640
>
> On Thu, 24 Nov 2022 at 02:07, Wei-Chiu Chuang 
> wrote:
>
> > I believe most of them can be added by us using GitHub Workflow. There's
> > a marketplace for these tools and most of them are free for open source
> > projects.
> >
> > On Wed, Nov 23, 2022 at 11:43 AM Ayush Saxena  wrote:
> >
> >> A simple INFRA ticket, I suppose, should get it done for us, e.g.
> >> https://issues.apache.org/jira/browse/INFRA-23561
> >>
> >> -Ayush
> >>
> >> On Thu, 24 Nov 2022 at 01:00, Iñigo Goiri  wrote:
> >>
> >> > Now that we are mostly using GitHub PRs for the reviews and we have
> >> > decent integration for the builds etc. there, I was wondering about
> >> > code coverage and reporting.
> >> > Is code coverage set up at all?
> >> > Does this come from the INFRA team?
> >> > What would it take to enable it otherwise?
> >> >
> >>
> >
>


Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86_64

2022-11-27 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1057/

[Nov 26, 2022, 3:31:07 PM] (noreply) YARN-11381. Fix hadoop-yarn-common module 
Java Doc Errors. (#5153). Contributed by Shilun Fan.




-1 overall


The following subsystems voted -1:
blanks hadolint pathlen spotbugs unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   hadoop-common-project/hadoop-common/src/test/resources/xml/external-dtd.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-excerpt.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-output-missing-tags2.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources/nvidia-smi-sample-output.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/fair-scheduler-invalid.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/src/test/resources/yarn-site-with-invalid-allocation-file-ref.xml 

spotbugs :

   module:hadoop-mapreduce-project/hadoop-mapreduce-client 
   Write to static field 
org.apache.hadoop.mapreduce.task.reduce.Fetcher.nextId from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:[line 120] 

spotbugs :

   
module:hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core
 
   Write to static field 
org.apache.hadoop.mapreduce.task.reduce.Fetcher.nextId from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:[line 120] 

spotbugs :

   module:hadoop-mapreduce-project 
   Write to static field 
org.apache.hadoop.mapreduce.task.reduce.Fetcher.nextId from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:[line 120] 

spotbugs :

   module:root 
   Write to static field 
org.apache.hadoop.mapreduce.task.reduce.Fetcher.nextId from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:from instance method new 
org.apache.hadoop.mapreduce.task.reduce.Fetcher(JobConf, TaskAttemptID, 
ShuffleSchedulerImpl, MergeManager, Reporter, ShuffleClientMetrics, 
ExceptionReporter, SecretKey) At Fetcher.java:[line 120] 

Failed junit tests :

   hadoop.hdfs.TestLeaseRecovery2 
   hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts 
   hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs 
   hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf 
   hadoop.mapreduce.v2.app.webapp.TestAMWebServices 
   hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobConf 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServices 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesLogs 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesTasks 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobsQuery 
   hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAttempts 
   hadoop.hdfs.server.federation.router.TestRouterRPCMultipleDestinationMountTableResolver 
  

   cc:

  

[jira] [Created] (HADOOP-18541) Upgrade grizzly version to 2.4.4

2022-11-27 Thread D M Murali Krishna Reddy (Jira)
D M Murali Krishna Reddy created HADOOP-18541:
-

 Summary: Upgrade grizzly version to 2.4.4
 Key: HADOOP-18541
 URL: https://issues.apache.org/jira/browse/HADOOP-18541
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: D M Murali Krishna Reddy


Upgrade grizzly version to 2.4.4 to resolve
[[sonatype-2016-0415] CWE-79: Improper Neutralization of Input During Web Page 
Generation ('Cross-site 
Scripting')|https://ossindex.sonatype.org/vulnerability/sonatype-2016-0415?component-type=maven=org.glassfish.grizzly/grizzly-http-server]

[CVE-2014-0099|https://nvd.nist.gov/vuln/detail/CVE-2014-0099], 
[CVE-2014-0075|https://nvd.nist.gov/vuln/detail/CVE-2014-0075], 
[CVE-2017-128|https://nvd.nist.gov/vuln/detail/CVE-2017-128]



--
This message was sent by Atlassian Jira
(v8.20.10#820010)




[jira] [Created] (HADOOP-18540) Upgrade Bouncy Castle to 1.70

2022-11-27 Thread D M Murali Krishna Reddy (Jira)
D M Murali Krishna Reddy created HADOOP-18540:
-

 Summary: Upgrade Bouncy Castle to 1.70
 Key: HADOOP-18540
 URL: https://issues.apache.org/jira/browse/HADOOP-18540
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: D M Murali Krishna Reddy


Upgrade Bouncycastle to 1.70 to resolve

[[sonatype-2021-4916] CWE-327: Use of a Broken or Risky Cryptographic 
Algorithm|https://ossindex.sonatype.org/vulnerability/sonatype-2021-4916?component-type=maven=org.bouncycastle/bcprov-jdk15on]

[[sonatype-2019-0673] CWE-400: Uncontrolled Resource Consumption ('Resource 
Exhaustion')|https://ossindex.sonatype.org/vulnerability/sonatype-2019-0673?component-type=maven=org.bouncycastle/bcprov-jdk15on]






[jira] [Created] (HADOOP-18539) Upgrade ojalgo to 51.4.1

2022-11-27 Thread D M Murali Krishna Reddy (Jira)
D M Murali Krishna Reddy created HADOOP-18539:
-

 Summary: Upgrade ojalgo to 51.4.1
 Key: HADOOP-18539
 URL: https://issues.apache.org/jira/browse/HADOOP-18539
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: D M Murali Krishna Reddy


Upgrade ojalgo to 51.4.1 to resolve CWE-327: [Use of a Broken or Risky 
Cryptographic Algorithm|https://cwe.mitre.org/data/definitions/327.html] 






[jira] [Created] (HADOOP-18538) Upgrade kafka to 2.8.2

2022-11-27 Thread D M Murali Krishna Reddy (Jira)
D M Murali Krishna Reddy created HADOOP-18538:
-

 Summary: Upgrade kafka to 2.8.2
 Key: HADOOP-18538
 URL: https://issues.apache.org/jira/browse/HADOOP-18538
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: D M Murali Krishna Reddy


Upgrade kafka to 2.8.2 to resolve 
[CVE-2022-34917|https://nvd.nist.gov/vuln/detail/CVE-2022-34917] 






Apache Hadoop qbt Report: branch-2.10+JDK7 on Linux/x86_64

2022-11-27 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/

No changes




-1 overall


The following subsystems voted -1:
asflicense hadolint mvnsite pathlen unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

Failed junit tests :

   hadoop.fs.TestTrash 
   hadoop.fs.TestFileUtil 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   hadoop.hdfs.server.namenode.snapshot.TestSnapshotDeletion 
   hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain 
   hadoop.hdfs.server.namenode.ha.TestPipelinesFailover 
   hadoop.hdfs.TestLeaseRecovery2 
   hadoop.hdfs.TestDFSInotifyEventInputStream 
   hadoop.hdfs.TestFileLengthOnClusterRestart 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints 
   hadoop.hdfs.server.federation.router.TestRouterNamenodeHeartbeat 
   hadoop.hdfs.server.federation.router.TestRouterQuota 
   hadoop.hdfs.server.federation.resolver.TestMultipleDestinationResolver 
   hadoop.hdfs.server.federation.resolver.order.TestLocalResolver 
   hadoop.yarn.server.nodemanager.containermanager.linux.resources.TestNumaResourceAllocator 
   hadoop.yarn.server.nodemanager.containermanager.linux.resources.TestNumaResourceHandlerImpl 
   hadoop.yarn.server.resourcemanager.recovery.TestFSRMStateStore 
   hadoop.yarn.server.resourcemanager.TestClientRMService 
   hadoop.yarn.server.resourcemanager.monitor.invariants.TestMetricsInvariantChecker 
   hadoop.yarn.applications.distributedshell.TestDistributedShell 
   hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter 
   hadoop.mapreduce.lib.input.TestLineRecordReader 
   hadoop.mapred.TestLineRecordReader 
   hadoop.yarn.sls.TestSLSRunner 
   hadoop.resourceestimator.solver.impl.TestLpSolver 
   hadoop.resourceestimator.service.TestResourceEstimatorService 
  

   cc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/diff-compile-javac-root.txt
  [488K]

   checkstyle:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/diff-checkstyle-root.txt
  [14M]

   hadolint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   mvnsite:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-mvnsite-root.txt
  [572K]

   pathlen:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/diff-patch-pylint.txt
  [20K]

   shellcheck:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/diff-patch-shellcheck.txt
  [72K]

   whitespace:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/whitespace-eol.txt
  [12M]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/whitespace-tabs.txt
  [1.3M]

   javadoc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-javadoc-root.txt
  [40K]

   unit:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [224K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [464K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt
  [16K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt
  [36K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common.txt
  [20K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/858/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager.txt
  [72K]