Apache Hadoop qbt Report: trunk+JDK11 on Linux/x86_64

2024-02-13 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java11-linux-x86_64/629/

[Feb 13, 2024, 1:49:39 AM] (github) HDFS-17362. RBF: Implement 
RouterObserverReadConfiguredFailoverProxyProvider (#6510)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org

Re: build q: getting the full DAG or all maven dependencies

2024-02-13 Thread Ayush Saxena
Does the hadoop-dist module help? It does the packaging for Hadoop; IIRC
it declares all the projects so that the dist build kicks in after everything
else is built [1], and the scripts referenced in its pom do the packaging work.

For protobuf, I think the yarn modules don't have the scope defined, so by
default it is treated as compile. Putting the scope in the parent pom [2]
should help; I tried locally & it did. Otherwise you'd need to define the
scope in every POM which uses protobuf...
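(For illustration only, a sketch of the kind of parent-pom change being described: pinning the protobuf scope once in `dependencyManagement` so child modules inherit it. The coordinates and the `provided` scope here are assumptions for the sketch, not taken from the linked commit [2].)

```xml
<!-- Hypothetical sketch: declare the protobuf scope once in the parent
     pom's dependencyManagement. Child modules that depend on protobuf-java
     then inherit this scope instead of defaulting to compile. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>${protobuf.version}</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```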


-Ayush

[1] https://github.com/apache/hadoop/blob/trunk/hadoop-dist/pom.xml#L32
[2]
https://github.com/apache/hadoop/commit/c373d3fa39013e8a8f6a6122c3ca230b4aa10abe


On Tue, 13 Feb 2024 at 23:37, Steve Loughran 
wrote:

> it does, but i'm not sure if there is a single module where you can ask for
> it and get the full list.
>
> For that verification project I've got I may declare more poms as
> dependencies so can do the aggregate scan there. this would also let me run
> maven dependency -verbose, save the output to a file and see what is there.
> would let us define lists of libraries we don't want in distributions
>
> On Mon, 12 Feb 2024 at 18:03, Sangjin Lee  wrote:
>
> > Does the maven dependency plugin help? I might try mvn dependency:tree
> and
> > see if it takes you somewhere.
> >
> > Sangjin
> >
> >
> > On Mon, Feb 12, 2024 at 9:50 AM Steve Loughran
>  > >
> > wrote:
> >
> > > how can we work out the entire DAG of dependencies in a hadoop distro?
> > >
> > > I'm asking as there are things in 3.4.0 that we shouldn't need
> (protobuf
> > > 2.5), and when I add the pR to move off log4j 1.17 to reload4j, I still
> > > find one in the yarn timeline lib dir
> > > https://github.com/apache/hadoop/pull/6547
> > >
> > >
> > > see HADOOP-19074 for a list of what is in 3.4.0 RC0, which predates the
> > new
> > > shaded jar.
> > >
> >
>


Re: build q: getting the full DAG or all maven dependencies

2024-02-13 Thread Sangjin Lee
I tried running it at the root project (hadoop) and got a meaningful
dependency tree. It does print an exhaustive, transitive tree of
dependencies.
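(For anyone following along, the invocation being discussed is along these lines; `-Dverbose`, `-DoutputFile`, and `-DappendOutput` are standard maven-dependency-plugin options for the `dependency:tree` goal, and the output path is a placeholder. This requires a Hadoop source checkout, so it is a command sketch rather than something runnable standalone.)

```shell
# From the root of a hadoop checkout: print the transitive dependency tree
# for every module into one aggregate file; -Dverbose also reports
# duplicate/conflicting versions that Maven omitted.
mvn dependency:tree -Dverbose \
    -DoutputFile=/tmp/hadoop-deps.txt -DappendOutput=true
```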

As for log4j with your patch, I see two ways log4j is introduced:
- log4j -> hadoop-common@2.8.5 ->
hadoop-yarn-server-timelineservice-hbase-tests
- log4j -> solr:solr-core@8.11.2 ->
hadoop-yarn-applications-catalog-webapp:war

That said, both are test-scoped. I'm not sure why we're packaging test-only
dependencies into the hadoop distro. Is it a known thing?

Sangjin


On Tue, Feb 13, 2024 at 10:07 AM Steve Loughran 
wrote:

> it does, but i'm not sure if there is a single module where you can ask for
> it and get the full list.
>
> For that verification project I've got I may declare more poms as
> dependencies so can do the aggregate scan there. this would also let me run
> maven dependency -verbose, save the output to a file and see what is there.
> would let us define lists of libraries we don't want in distributions
>
> On Mon, 12 Feb 2024 at 18:03, Sangjin Lee  wrote:
>
> > Does the maven dependency plugin help? I might try mvn dependency:tree
> and
> > see if it takes you somewhere.
> >
> > Sangjin
> >
> >
> > On Mon, Feb 12, 2024 at 9:50 AM Steve Loughran
>  > >
> > wrote:
> >
> > > how can we work out the entire DAG of dependencies in a hadoop distro?
> > >
> > > I'm asking as there are things in 3.4.0 that we shouldn't need
> (protobuf
> > > 2.5), and when I add the pR to move off log4j 1.17 to reload4j, I still
> > > find one in the yarn timeline lib dir
> > > https://github.com/apache/hadoop/pull/6547
> > >
> > >
> > > see HADOOP-19074 for a list of what is in 3.4.0 RC0, which predates the
> > new
> > > shaded jar.
> > >
> >
>


Re: build q: getting the full DAG or all maven dependencies

2024-02-13 Thread Steve Loughran
It does, but I'm not sure if there is a single module where you can ask for
it and get the full list.

For that verification project I've got, I may declare more poms as
dependencies so I can do the aggregate scan there. This would also let me run
maven dependency with -verbose, save the output to a file and see what is
there. It would let us define lists of libraries we don't want in
distributions.
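(A sketch of the kind of check that list would enable. Everything here is made up for illustration: the file path, the "banned" artifact names, and the stand-in file contents, which mimic `dependency:tree` output lines.)

```shell
# Hypothetical sketch: scan a saved dependency:tree output for artifacts
# we don't want shipped in the distribution.
deps=/tmp/hadoop-deps.txt

# Stand-in data mimicking two dependency:tree output lines.
printf '%s\n' 'com.google.protobuf:protobuf-java:jar:2.5.0:compile' \
              'org.slf4j:slf4j-api:jar:1.7.36:compile' > "$deps"

# Report any banned artifact found in the saved tree.
for bad in protobuf-java 'log4j:log4j'; do
  if grep -q "$bad" "$deps"; then
    echo "found banned dep: $bad"
  fi
done
```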

On Mon, 12 Feb 2024 at 18:03, Sangjin Lee  wrote:

> Does the maven dependency plugin help? I might try mvn dependency:tree and
> see if it takes you somewhere.
>
> Sangjin
>
>
> On Mon, Feb 12, 2024 at 9:50 AM Steve Loughran  >
> wrote:
>
> > how can we work out the entire DAG of dependencies in a hadoop distro?
> >
> > I'm asking as there are things in 3.4.0 that we shouldn't need (protobuf
> > 2.5), and when I add the pR to move off log4j 1.17 to reload4j, I still
> > find one in the yarn timeline lib dir
> > https://github.com/apache/hadoop/pull/6547
> >
> >
> > see HADOOP-19074 for a list of what is in 3.4.0 RC0, which predates the
> new
> > shaded jar.
> >
>


[jira] [Resolved] (HADOOP-18930) S3A: make fs.s3a.create.performance an option you can set for the entire bucket

2024-02-13 Thread Steve Loughran (Jira)


 [ https://issues.apache.org/jira/browse/HADOOP-18930?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran resolved HADOOP-18930.
-
Fix Version/s: 3.4.0
   (was: 3.3.7-aws)
   Resolution: Fixed

> S3A: make fs.s3a.create.performance an option you can set for the entire 
> bucket
> ---
>
> Key: HADOOP-18930
> URL: https://issues.apache.org/jira/browse/HADOOP-18930
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: fs/s3
>Affects Versions: 3.3.9
>Reporter: Steve Loughran
>Assignee: Steve Loughran
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> Make the fs.s3a.create.performance option something you can set everywhere, 
> rather than just as an openFile() option or under a magic path.
> This improves performance in apps like Iceberg, where filenames are 
> generated with UUIDs in them, so we know there are no overwrites.
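(For illustration, the sort of configuration this change enables. The bucket name below is a placeholder, and the per-bucket form assumes S3A's usual `fs.s3a.bucket.NAME.*` override convention; check the release documentation for the exact option semantics.)

```xml
<!-- Enable create-performance mode for all S3A filesystems... -->
<property>
  <name>fs.s3a.create.performance</name>
  <value>true</value>
</property>

<!-- ...or for one bucket only, via the per-bucket override convention.
     "example-bucket" is a placeholder name. -->
<property>
  <name>fs.s3a.bucket.example-bucket.create.performance</name>
  <value>true</value>
</property>
```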



--
This message was sent by Atlassian Jira
(v8.20.10#820010)




Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86_64

2024-02-13 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-trunk-java8-linux-x86_64/1500/

No changes


Apache Hadoop qbt Report: branch-2.10+JDK7 on Linux/x86_64

2024-02-13 Thread Apache Jenkins Server
For more details, see 
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/

No changes




-1 overall


The following subsystems voted -1:
asflicense hadolint mvnsite pathlen unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

Failed junit tests:

   hadoop.fs.TestFileUtil
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints
   hadoop.hdfs.TestLeaseRecovery2
   hadoop.hdfs.server.blockmanagement.TestReplicationPolicyWithUpgradeDomain
   hadoop.hdfs.TestFileLengthOnClusterRestart
   hadoop.hdfs.TestDFSInotifyEventInputStream
   hadoop.hdfs.server.namenode.snapshot.TestSnapshotBlocksMap
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys
   hadoop.hdfs.server.federation.router.TestRouterQuota
   hadoop.hdfs.server.federation.router.TestRouterNamenodeHeartbeat
   hadoop.hdfs.server.federation.resolver.order.TestLocalResolver
   hadoop.hdfs.server.federation.resolver.TestMultipleDestinationResolver
   hadoop.contrib.bkjournal.TestBookKeeperHACheckpoints
   hadoop.mapreduce.lib.input.TestLineRecordReader
   hadoop.mapred.TestLineRecordReader
   hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
   hadoop.resourceestimator.service.TestResourceEstimatorService
   hadoop.resourceestimator.solver.impl.TestLpSolver
   hadoop.yarn.sls.TestSLSRunner
   hadoop.yarn.server.nodemanager.containermanager.linux.resources.TestNumaResourceAllocator
   hadoop.yarn.server.nodemanager.containermanager.linux.resources.TestNumaResourceHandlerImpl
   hadoop.yarn.server.resourcemanager.TestClientRMService
   hadoop.yarn.server.resourcemanager.recovery.TestFSRMStateStore
   hadoop.yarn.server.resourcemanager.monitor.invariants.TestMetricsInvariantChecker

   cc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/diff-compile-javac-root.txt
  [488K]

   checkstyle:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/diff-checkstyle-root.txt
  [14M]

   hadolint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   mvnsite:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-mvnsite-root.txt
  [572K]

   pathlen:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/diff-patch-pylint.txt
  [20K]

   shellcheck:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/diff-patch-shellcheck.txt
  [72K]

   whitespace:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/whitespace-eol.txt
  [12M]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/whitespace-tabs.txt
  [1.3M]

   javadoc:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-javadoc-root.txt
  [36K]

   unit:

   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [220K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [452K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-rbf.txt
  [36K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs_src_contrib_bkjournal.txt
  [16K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-core.txt
  [104K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-tools_hadoop-azure.txt
  [20K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-tools_hadoop-resourceestimator.txt
  [16K]
   
https://ci-hadoop.apache.org/job/hadoop-qbt-branch-2.10-java7-linux-x86_64/1301/artifact/out/patch-unit-hadoop-tools_hadoop-sls.txt
  [28K]