Having played with it myself today, I'm going to say -1.

The hadoop-azure, hadoop-aws &c. jars didn't end up in
share/hadoop/common/lib, where they are meant to be.

What I expected (from a snapshot build locally):

 ls hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-*
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-annotations-3.4.3-SNAPSHOT.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-auth-3.4.3-SNAPSHOT.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-aws-3.4.3-SNAPSHOT.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-azure-3.4.3-SNAPSHOT.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-cos-3.4.3-SNAPSHOT.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-huaweicloud-3.4.3-SNAPSHOT.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-shaded-guava-1.5.0.jar
hadoop-3.4.3-SNAPSHOT/share/hadoop/common/lib/hadoop-shaded-protobuf_3_25-1.5.0.jar

What I found:

 ls hadoop-3.4.3/share/hadoop/common/lib/hadoop-*
hadoop-3.4.3/share/hadoop/common/lib/hadoop-annotations-3.4.3.jar
hadoop-3.4.3/share/hadoop/common/lib/hadoop-auth-3.4.3.jar
hadoop-3.4.3/share/hadoop/common/lib/hadoop-shaded-guava-1.5.0.jar
hadoop-3.4.3/share/hadoop/common/lib/hadoop-shaded-protobuf_3_25-1.5.0.jar

This is the same in both the arm and the x86 tarballs.

The RC build is meant to move the cloud connector artifacts into the
common/lib dir so they are all on the classpath, but they didn't get in.
Something is probably up with the execution of hadoop-cloud-storage-dist,
with it being excluded.
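For anyone else checking a tarball, here's a quick sketch of the check I did; DIST and the list of connector jars are my assumptions, point it at your own extracted directory:

```shell
# Hedged sketch: flag any cloud connector jars missing from an unpacked
# release tarball. DIST is a placeholder for your extracted directory.
DIST=${DIST:-hadoop-3.4.3}
missing=""
for jar in hadoop-aws hadoop-azure hadoop-cos hadoop-huaweicloud; do
  # glob fails to match -> ls errors -> jar is recorded as missing
  if ! ls "$DIST"/share/hadoop/common/lib/"$jar"-*.jar >/dev/null 2>&1; then
    missing="$missing $jar"
    echo "MISSING: $jar"
  fi
done
```

Against the 3.4.3 RC0 tarball this prints a MISSING line for all four connectors.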

Will research tonight.

Thanks to everyone who put in the effort qualifying the RC; sorry it was wasted.

Steve

On Mon, 2 Feb 2026 at 16:34, Dongjoon Hyun <[email protected]> wrote:

> +1
>
> I tested with the Apache ORC project.
>
> Thank you!
>
> Dongjoon.
>
> On 2026/02/02 06:51:46 Xiaoqiao He wrote:
> > +1(binding)
> >
> > [Y] LICENSE files exist and NOTICE is included.
> > [Y] Rat check is ok. `mvn clean apache-rat:check`.
> > [Y] Build the source code. `mvn clean package -DskipTests -Pnative -Pdist
> > -Dtar`.
> > [Y] Setup pseudo cluster with HDFS and YARN.
> > [Y] Run simple `FsShell - mkdir/put/get/mv/rm` and check the result.
> > [Y] Run example mr jobs and check the result.
> > [Y] Spot-check and run some unit tests.
> > [Y] Skimmed the Web UI of NameNode/DataNode/Resourcemanager/NodeManager.
> > [Y] Skimmed over the contents of site documentation.
> >
> > Best Regards,
> > - He Xiaoqiao
> >
> >
> > On Sun, Feb 1, 2026 at 4:57 PM Ayush Saxena <[email protected]> wrote:
> >
> > > +1 (Binding)
> > >
> > > * Built from source
> > > * Verified Checksums
> > > * Verified Signatures
> > > * Verified NOTICE & LICENSE files
> > > * Verified the output of `hadoop version`
> > > * Ran some basic HDFS shell commands
> > > * Ran example Jobs (WordCount, TeraGen, TeraSort & TeraValidate)
> > > * Browsed through the UI (NN, DN, RM, NM, JHS)
> > >
> > > Thanx Steve for driving the release. Good Luck!!!
> > >
> > > -Ayush
> > >
> > > On Sat, 31 Jan 2026 at 05:54, Chris Nauroth <[email protected]>
> wrote:
> > >
> > > > +1
> > > >
> > > > Thank you for the RC, Steve.
> > > >
> > > > * Verified all checksums.
> > > > * Verified all signatures.
> > > > * Built from source, including native code on Linux.
> > > >     * mvn clean package -Pnative -Psrc -Drequire.openssl
> -Drequire.snappy
> > > > -Drequire.zstd -DskipTests
> > > > * Tests passed.
> > > >     * mvn --fail-never clean test -Pnative -Dparallel-tests
> > > > -Drequire.snappy -Drequire.zstd -Drequire.openssl
> > > > -Dsurefire.rerunFailingTestsCount=3 -DtestsThreadCount=8
> > > > * For ARM verification:
> > > >     * Ran "file <X>" on all native binaries in the ARM tarball to
> confirm
> > > > they actually came out with ARM as the architecture.
> > > >     * Output of hadoop checknative -a on ARM looks good.
> > > >     * Ran a MapReduce job with the native bzip2 codec for
> compression,
> > > and
> > > > it worked fine.
> > > >
> > > > Chris Nauroth
> > > >
> > > >
> > > > On Thu, Jan 29, 2026 at 6:27 AM Cameron Scholes
> <[email protected]
> > > >
> > > > wrote:
> > > >
> > > > > +1 (non-binding)
> > > > >
> > > > > On 29/01/2026 07:02, Cheng Pan wrote:
> > > > > > +1 (non-binding)
> > > > > >
> > > > > > Have integrated with Spark, everything works fine with Java 17.
> > > > > >
> > > > > > Also tried run Spark tests with Java 25, no issues found related
> to
> > > > > Hadoop, so far.
> > > > > >
> > > > > > Thanks,
> > > > > > Cheng Pan
> > > > > >
> > > > > > On 2026/01/27 20:22:34 Steve Loughran wrote:
> > > > > >> Apache Hadoop 3.4.3
> > > > > >>
> > > > > >> I have put together a release candidate (RC0) for Hadoop 3.4.3.
> > > > > >>
> > > > > >> This is a maintenance release of the branch-3.4 release, with
> > > upgrades
> > > > > of
> > > > > >> dependencies, security, critical bug fixes and abfs and s3a
> > > > > enhancements.
> > > > > >>
> > > > > >> Change log
> > > > > >>
> > > > >
> > > >
> > >
> https://dist.apache.org/repos/dist/dev/hadoop/hadoop-3.4.3-RC0/CHANGELOG.md
> > > > > >>
> > > > > >> Release notes
> > > > > >>
> > > > >
> > > >
> > >
> https://dist.apache.org/repos/dist/dev/hadoop/hadoop-3.4.3-RC0/RELEASENOTES.m
> > > > > >> d
> > > > > >>
> > > > > >>
> > > > > >> What I would like is for anyone who can to verify the tarballs,
> > > > > especially
> > > > > >> anyone who can try the arm64 binaries as we want to include them
> > > too.
> > > > > >>
> > > > > >> The RC is available at:
> > > > > >> https://dist.apache.org/repos/dist/dev/hadoop/hadoop-3.4.3-RC0/
> > > > > >>
> > > > > >> The git tag is release-3.4.3-RC0, commit 56b832dfd5
> > > > > >>
> > > > > >> The maven artifacts are staged at
> > > > > >>
> > > >
> https://repository.apache.org/content/repositories/orgapachehadoop-1461
> > > > > >>
> > > > > >> You can find my public key at:
> > > > > >> https://dist.apache.org/repos/dist/release/hadoop/common/KEYS
> > > > > >>
> > > > > >>
> > > > > >> *Build note*: the maven artifacts are off the aarch64 release,
> not
> > > the
> > > > > x86;
> > > > > >> single builds on ec2 VMs through our cloud infra kept resulting
> in
> > > > > multiple
> > > > > >> staging repos,
> > > > > >> probably a side effect of our VPN setup.
> > > > > >>
> > > > > >> A raspberry pi5 is perfectly adequate to cut a release, even
> with
> > > just
> > > > > an
> > > > > >> SD Card as the storage.
> > > > > >> I built the x86 release remotely, though as I have an 2016
> ubuntu
> > > > > laptop I
> > > > > >> could try there too.
> > > > > >>
> > > > > >>
> > > > > >> *AWS SDK*
> > > > > >>
> > > > > >> Previous releases included a "lean" tar without the AWS SDK,
> and/or
> > > > > >> encountered
> > > > > >> problems with the size of the .tar artifacts.
> > > > > >>
> > > > > >> Now all releases are built without the AWS SDK; it must be
> > > explicitly
> > > > > added
> > > > > >> to
> > > > > >> share/hadoop/common/lib/
> > > > > >>
> > > > > >> To add aws support to hadoop, download from Maven Central the
> > > version
> > > > of
> > > > > >> the SDK
> > > > > >> you wish to use:
> > > > > >>
> > > > > >>
> > > > >
> > > >
> > >
> https://central.sonatype.com/artifact/software.amazon.awssdk/bundle/versions
> > > > > >>
> > > > > >> For this release, the version to download is 2.35.4
> > > > > >>
> > > https://repo1.maven.org/maven2/software/amazon/awssdk/bundle/2.35.4/)
> > > > > >>
> > > > > >> 1. Download the bundle-2.35.4.jar artifact and check its
> signature
> > > > with
> > > > > >>     the accompanying bundle-2.35.4.jar.asc file.
> > > > > >>
> > > > > >> 2. Copy the JAR to share/hadoop/common/lib/
> > > > > >>
> > > > > >> Newer AWS SDK versions _should_ work, though regressions are
> almost
> > > > > >> inevitable.
> > > > > >>
> > > > > >> Please try the release and vote. The vote will run for 5 days.
> > > > > >>
> > > > > >> Steve
> > > > > >>
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> ---------------------------------------------------------------------
> > > > > > To unsubscribe, e-mail:[email protected]
> > > > > > For additional commands,
> e-mail:[email protected]
> > > > > >
> > > >
> > >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
>
>