I had a report from Gautham of that staging repo playing up too.

I'm going to have to treat this as a failure of the maven deploy process,
so I'll have to cancel this RC and try to work out WTF went wrong.

I'll also put together some minimal apps to fetch and compile against
everything. I did do some myself (storediag, hboss), and did try to make
sure I had cleaned up my local repo first, but maybe it was tainted.
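For the record, one way to make sure the local repo isn't tainted before
validating an RC (a sketch; this assumes the default local repository
location under ~/.m2):

```shell
# Delete all cached org.apache.hadoop artifacts, so the next build is
# forced to re-resolve every Hadoop artifact from the remote repositories
# rather than trusting whatever an earlier build left behind.
rm -rf ~/.m2/repository/org/apache/hadoop
```

Deleting just the org/apache/hadoop tree re-fetches every Hadoop artifact
without blowing away the rest of the cache.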

oh well

-------

However, I get this error saying that Maven can't find the plugins and
jars for Hadoop 3.3.3:

[INFO] ------------------< org.apache.hadoop:hadoop-common >-------------------

[INFO] Building Apache Hadoop Common 3.3.3                              [5/9]

[INFO] --------------------------------[ jar ]---------------------------------

Downloading from central: https://repository.apache.org/content/repositories/orgapachehadoop-1348/org/apache/hadoop/hadoop-maven-plugins/3.3.3/hadoop-maven-plugins-3.3.3.pom

[WARNING] The POM for org.apache.hadoop:hadoop-maven-plugins:jar:3.3.3 is missing, no dependency information available

Downloading from central: https://repository.apache.org/content/repositories/orgapachehadoop-1348/org/apache/hadoop/hadoop-maven-plugins/3.3.3/hadoop-maven-plugins-3.3.3.jar

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Common Project 3.3.3:
[INFO]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  7.349 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  1.810 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  7.922 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  0.375 s]
[INFO] Apache Hadoop Common ............................... FAILURE [  2.095 s]
[INFO] Apache Hadoop NFS .................................. SKIPPED
[INFO] Apache Hadoop KMS .................................. SKIPPED
[INFO] Apache Hadoop Registry ............................. SKIPPED
[INFO] Apache Hadoop Common Project ....................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  24.946 s
[INFO] Finished at: 2022-05-07T11:29:18+05:30
[INFO] ------------------------------------------------------------------------

[ERROR] Plugin org.apache.hadoop:hadoop-maven-plugins:3.3.3 or one of its dependencies could not be resolved: Could not find artifact org.apache.hadoop:hadoop-maven-plugins:jar:3.3.3 in central (https://repository.apache.org/content/repositories/orgapachehadoop-1348/) -> [Help 1]

[ERROR]



Could you please let me know how I can consume the Hadoop 3 jars in maven?
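For reference, Maven only resolves from Central by default, so consuming
staged RC artifacts needs the staging repository declared explicitly. A
minimal sketch against the RC repo from this thread (the `id` values are
arbitrary placeholders; note the failing artifact above is resolved as a
*plugin*, which needs `pluginRepositories` rather than `repositories`):

```xml
<!-- In the consuming project's pom.xml (or a profile in settings.xml) -->
<repositories>
  <repository>
    <id>hadoop-rc-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachehadoop-1348/</url>
  </repository>
</repositories>
<!-- hadoop-maven-plugins is fetched during plugin resolution, so the
     staging repo must also be visible to the plugin resolver -->
<pluginRepositories>
  <pluginRepository>
    <id>hadoop-rc-staging-plugins</id>
    <url>https://repository.apache.org/content/repositories/orgapachehadoop-1348/</url>
  </pluginRepository>
</pluginRepositories>
```

This only helps if the staging repository actually contains the jars,
which, per the rest of this thread, it did not for this RC.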



Thanks,

--Gautham

On Sat, 7 May 2022 at 22:45, Akira Ajisaka <aajis...@apache.org> wrote:

> Hi Chao,
>
> How about using
> https://repository.apache.org/content/repositories/orgapachehadoop-1348/
> instead of https://repository.apache.org/content/repositories/staging/ ?
>
> Akira
>
> On Sat, May 7, 2022 at 10:52 AM Ayush Saxena <ayush...@gmail.com> wrote:
>
> > Hmm, I see. The artifacts should ideally have been overwritten by the
> > new RC, but they weren't. It seems the staging path that was shared
> > doesn't have any jars; that is why the old jars were being picked up.
> > I think Steve needs to run mvn deploy again.
> >
> > Sent from my iPhone
> >
> > > On 07-May-2022, at 7:12 AM, Chao Sun <sunc...@apache.org> wrote:
> > >
> > > 
> > >>
> > >> Chao can you use the one that Steve mentioned in the mail?
> > >
> > > Hmm, how do I do that? Typically, after closing the RC in nexus, the
> > > release bits will show up in
> > > https://repository.apache.org/content/repositories/staging/org/apache/hadoop
> > > and the Spark build will be able to pick them up for testing. However,
> > > in this case I don't see any 3.3.3 jars at that URL.
> > >
> > >> On Fri, May 6, 2022 at 6:24 PM Ayush Saxena <ayush...@gmail.com>
> wrote:
> > >>
> > >> There were two 3.3.3 RCs staged. The earlier one was built with
> > >> skipShade (dated April 22); I archived that one. Chao, can you use
> > >> the one that Steve mentioned in the mail?
> > >>
> > >>> On Sat, 7 May 2022 at 06:18, Chao Sun <sunc...@apache.org> wrote:
> > >>>
> > >>> Seems there are some issues with the shaded client, as I was not
> > >>> able to compile Apache Spark with the RC
> > >>> (https://github.com/apache/spark/pull/36474). Looks like it was
> > >>> compiled with the `-DskipShade` option and the hadoop-client-api
> > >>> JAR doesn't contain any classes:
> > >>>
> > >>> ➜  hadoop-client-api jar tf 3.3.3/hadoop-client-api-3.3.3.jar
> > >>> META-INF/
> > >>> META-INF/MANIFEST.MF
> > >>> META-INF/NOTICE.txt
> > >>> META-INF/LICENSE.txt
> > >>> META-INF/maven/
> > >>> META-INF/maven/org.apache.hadoop/
> > >>> META-INF/maven/org.apache.hadoop/hadoop-client-api/
> > >>> META-INF/maven/org.apache.hadoop/hadoop-client-api/pom.xml
> > >>> META-INF/maven/org.apache.hadoop/hadoop-client-api/pom.properties
> > >>>
> > >>> On Fri, May 6, 2022 at 4:24 PM Stack <st...@duboce.net> wrote:
> > >>>>
> > >>>> +1 (binding)
> > >>>>
> > >>>>  * Signature: ok
> > >>>>  * Checksum : passed
> > >>>>  * Rat check (1.8.0_191): passed
> > >>>>   - mvn clean apache-rat:check
> > >>>>  * Built from source (1.8.0_191): failed
> > >>>>   - mvn clean install  -DskipTests
> > >>>>   - mvn -fae --no-transfer-progress -DskipTests -Dmaven.javadoc.skip=true
> > >>>>     -Pnative -Drequire.openssl -Drequire.snappy -Drequire.valgrind
> > >>>>     -Drequire.zstd -Drequire.test.libhadoop clean install
> > >>>>  * Unit tests pass (1.8.0_191):
> > >>>>    - HDFS Tests passed (Didn't run more than this).
> > >>>>
> > >>>> Deployed a ten node HA HDFS cluster with three namenodes and five
> > >>>> journalnodes. Ran a ten node HBase (an older version of the 2.5
> > >>>> branch built against 3.3.2) against it. Tried a small verification
> > >>>> job. Good. Ran a bigger job with mild chaos. All seems to be
> > >>>> working properly (recoveries, logs look fine). Killed a namenode.
> > >>>> Failover worked promptly. UIs look good. Poked at the hdfs cli.
> > >>>> Seems good.
> > >>>>
> > >>>> S
> > >>>>
> > >>>> On Tue, May 3, 2022 at 4:24 AM Steve Loughran
> > <ste...@cloudera.com.invalid>
> > >>>> wrote:
> > >>>>
> > >>>>> I have put together a release candidate (rc0) for Hadoop 3.3.3
> > >>>>>
> > >>>>> The RC is available at:
> > >>>>> https://dist.apache.org/repos/dist/dev/hadoop/3.3.3-RC0/
> > >>>>>
> > >>>>> The git tag is release-3.3.3-RC0, commit d37586cbda3
> > >>>>>
> > >>>>> The maven artifacts are staged at
> > >>>>> https://repository.apache.org/content/repositories/orgapachehadoop-1348/
> > >>>>>
> > >>>>> You can find my public key at:
> > >>>>> https://dist.apache.org/repos/dist/release/hadoop/common/KEYS
> > >>>>>
> > >>>>> Change log
> > >>>>> https://dist.apache.org/repos/dist/dev/hadoop/3.3.3-RC0/CHANGELOG.md
> > >>>>>
> > >>>>> Release notes
> > >>>>> https://dist.apache.org/repos/dist/dev/hadoop/3.3.3-RC0/RELEASENOTES.md
> > >>>>>
> > >>>>> There's a very small number of changes, primarily critical
> > >>>>> code/packaging issues and security fixes.
> > >>>>>
> > >>>>>
> > >>>>>   - The critical fixes which shipped in the 3.2.3 release.
> > >>>>>   - CVEs in our code and dependencies.
> > >>>>>   - Shaded client packaging issues.
> > >>>>>   - A switch from log4j to reload4j.
> > >>>>>
> > >>>>>
> > >>>>> reload4j is an active fork of the log4j 1.2.17 library with the
> > >>>>> classes which contain CVEs removed. Even though hadoop never used
> > >>>>> those classes, they regularly raised alerts on security scans and
> > >>>>> concern from users. Switching to the forked project allows us to
> > >>>>> ship a secure logging framework. It will complicate the builds of
> > >>>>> downstream maven/ivy/gradle projects which exclude our log4j
> > >>>>> artifacts, as they need to exclude the new dependency instead/as
> > >>>>> well.
> > >>>>>
> > >>>>> See the release notes for details.
> > >>>>>
> > >>>>> This is my first release through the new docker build process, do
> > >>>>> please validate artifact signing &c to make sure it is good. I'll
> > >>>>> be trying builds of downstream projects.
> > >>>>>
> > >>>>> We know there are some outstanding issues with at least one
> > >>>>> library we are shipping (okhttp), but I don't want to hold this
> > >>>>> release up for it. If the docker based release process works
> > >>>>> smoothly enough we can do a followup security release in a few
> > >>>>> weeks.
> > >>>>>
> > >>>>> Please try the release and vote. The vote will run for 5 days.
> > >>>>>
> > >>>>> -Steve
> > >>>>>
> > >>>
> > >>> ---------------------------------------------------------------------
> > >>> To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
> > >>> For additional commands, e-mail: common-dev-h...@hadoop.apache.org
> > >>>
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: yarn-dev-unsubscr...@hadoop.apache.org
> > For additional commands, e-mail: yarn-dev-h...@hadoop.apache.org
> >
> >
>
