This is an automated email from the ASF dual-hosted git repository.

stevel pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/hadoop-release-support.git

commit ce0fb49fe076c9d2898a4a9cd8300dc5b4148561
Author: Steve Loughran <ste...@cloudera.com>
AuthorDate: Thu Dec 1 19:11:07 2022 +0000

    HADOOP-18470. releasing hadoop 3.3.5
---
 README.md                                          | 62 +++++++++-------------
 pom.xml                                            |  8 +--
 .../github/steveloughran/validator/CompileFS.java  |  2 +-
 .../steveloughran/validator/TestRuntimeValid.java  |  1 -
 src/text/email.txt                                 | 24 +++++++--
 5 files changed, 50 insertions(+), 47 deletions(-)

diff --git a/README.md b/README.md
index 4a3f73a..35ceb05 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
 # Validate Hadoop Release Artifacts
-l
+
 This project helps validate hadoop release candidates
 
 It has an ant `build.xml` file to help with preparing the release,
@@ -9,8 +9,6 @@ validating gpg signatures, creating release messages and other things.
 
 see below
 
-
-
 # maven builds
 
 To build and test with the client API:
@@ -66,7 +64,6 @@ rc=0
 
 ### Clean up first
 
-
 ```bash
 ant clean
 ```
@@ -86,17 +83,14 @@ This will take a while! look in target/incoming for progress
 ant scp-artifacts
 ```
 
-
 ### Move to the release dir
 
-
 ```bash
 ant move-scp-artifacts release.dir.check
 ```
 
 ### verify gpg signing
 
-
 ```bash
 ant gpg.keys gpg.verify
 ```
@@ -109,6 +103,10 @@ https://svn.apache.org URL.
 ```bash
 ant stage
 ```
+
+This makes it visible to others via the apache svn site, but it
+is not mirrored yet.
+
 When the RC is released, an `svn move` operation can promote it
 directly.
 
@@ -136,7 +134,6 @@ ant print-tag-command
 2. Find the hadoop repo for the RC
 3. "close" it and wait for that to go through
 
-
 ### Generate the RC vote email
 
 Review/update template message in `src/email.txt`.
@@ -160,25 +157,24 @@ locally.
 In build properties, declare `hadoop.version`, `rc` and `http.source`
 
 ```properties
-hadoop.version=3.3.4
+hadoop.version=3.3.5
 rc=1
-http.source=https://dist.apache.org/repos/dist/dev/hadoop/hadoop-3.3.4-RC1/
+http.source=https://dist.apache.org/repos/dist/dev/hadoop/hadoop-${hadoop.version}-RC${rc}/
 ```
 
 targets of relevance
 
 | target             | action                     |
 |--------------------|----------------------------|
-| release.fetch.http | fetch artifacts            |
-| release.dir.check  | verify release dir exists  |
-| release.src.untar  | untar retrieved artifacts  |
-| release.src.build  | build the source           |
-| release.src.test   | build and test the source  |
-| gpg.keys           | import the hadoop KEYS     |
-| gpg.verify         | verify the D/L'd artifacts |
+| `release.fetch.http` | fetch artifacts            |
+| `release.dir.check`  | verify release dir exists  |
+| `release.src.untar`  | untar retrieved artifacts  |
+| `release.src.build`  | build the source           |
+| `release.src.test`   | build and test the source  |
+| `gpg.keys`           | import the hadoop KEYS     |
+| `gpg.verify`         | verify the D/L'd artifacts |
 |                    |                            |
 
-
 set `release.native.binaries` to false to skip native binary checks on platforms without them
 
 ### Download the RC files from the http server
@@ -197,14 +193,14 @@ do not do this while building/testing downstream projects
 ant release.src.untar release.src.build
 ```
 
-
 # Building and testing projects from the staged maven artifacts
 
 A lot of the targets build maven projects from the staged maven artifacts.
 
 For this to work
+
 1. check out the relevant projects somewhere
-2. set their location in the build.properties file
+2. set their location in the `build.properties` file
 3. make sure that the branch checked out is the one you want to build.
    This matters for anyone who works on those other projects
    on their own branches.
@@ -212,7 +208,6 @@ For this to work
 
 First, purge your maven repo
 
-    
 ```bash
 ant purge-from-maven
 ```
@@ -228,9 +223,8 @@ ant cloudstore.build
 ## Google GCS
 
 This is java11 only.
- 
-Ideally, you should run the tests, or even better, run them before the RC is up for review.
 
+Ideally, you should run the tests, or even better, run them before the RC is up for review.
 
 Building the libraries.
 Do this only if you aren't running the tests.
@@ -239,23 +233,16 @@ Do this only if you aren't running the tests.
 ant gcs.build
 ```
 
-Testing the libraries
-```
-ant gcs.build
-```
-
-
-
 ## Apache Spark
 
 Validates hadoop client artifacts; the cloud tests cover hadoop cloud storage clients.
 
-
 ```bash
 ant spark.build
 ```
 
 Then follow up with the cloud examples if you are set up
+
 ```bash
 ant cloud-examples.build
 ant cloud-examples.test
@@ -263,7 +250,6 @@ ant cloud-examples.test
 
 ## HBase filesystem
 
-
 ```bash
 ant hboss.build
 ```
@@ -274,8 +260,8 @@ set `hadoop.site.dir` to be the path to where the git
 clone of the asf site repo is
 
 ```properties
-hadoop.site.dir=/Users/stevel/hadoop/release/hadoop-site\
-  ```
+hadoop.site.dir=/Users/stevel/hadoop/release/hadoop-site
+```
 
 prepare the site with the following targets
 
@@ -286,13 +272,13 @@ ant release.site.docs
 
 review the announcement.
 
-In the 
+In the hadoop site dir
 
 ```bash
 rm current3
-ln -s r.3.3.4 current3
+ln -s r3.3.5 current3
 ls -l
 rm stable3
-ln -s r3.3.4 stable
-ln -s r3.3.4 stable3
+ln -s r3.3.5 stable
+ln -s r3.3.5 stable3
 ```
diff --git a/pom.xml b/pom.xml
index e65f3f0..6af88d3 100644
--- a/pom.xml
+++ b/pom.xml
@@ -36,7 +36,7 @@
     <maven-antrun-plugin.version>1.7</maven-antrun-plugin.version>
 
 
-    <hadoop.version>3.3.4</hadoop.version>
+    <hadoop.version>3.3.5</hadoop.version>
 
     <!-- SLF4J/LOG4J version -->
     <slf4j.version>1.7.36</slf4j.version>
@@ -151,9 +151,9 @@
       </repositories>
     </profile>
     <profile>
-      <id>hadoop-3.3.4</id>
+      <id>hadoop-3.3.5</id>
       <properties>
-        <hadoop.version>3.3.4</hadoop.version>
+        <hadoop.version>3.3.5</hadoop.version>
       </properties>
     </profile>
 
@@ -167,7 +167,7 @@
     <profile>
       <id>branch-3.3</id>
       <properties>
-        <hadoop.version>3.3.4-SNAPSHOT</hadoop.version>
+        <hadoop.version>3.3.9-SNAPSHOT</hadoop.version>
       </properties>
     </profile>
 
diff --git a/src/main/java/com/github/steveloughran/validator/CompileFS.java b/src/main/java/com/github/steveloughran/validator/CompileFS.java
index a5bf59a..c77f164 100644
--- a/src/main/java/com/github/steveloughran/validator/CompileFS.java
+++ b/src/main/java/com/github/steveloughran/validator/CompileFS.java
@@ -35,7 +35,7 @@ public class CompileFS {
 
   public FileSystem run() throws IOException {
     final FileSystem fs = FileSystem.getLocal(new Configuration());
-    LOG.info("fs is {)", fs);
+    LOG.info("fs is {}", fs);
     return fs;
   }
   public static void main(String[] args) throws Exception {
diff --git a/src/test/java/com/github/steveloughran/validator/TestRuntimeValid.java b/src/test/java/com/github/steveloughran/validator/TestRuntimeValid.java
index aeb3cd4..751e500 100644
--- a/src/test/java/com/github/steveloughran/validator/TestRuntimeValid.java
+++ b/src/test/java/com/github/steveloughran/validator/TestRuntimeValid.java
@@ -29,6 +29,5 @@ public class TestRuntimeValid {
   public void testRuntime() throws Throwable {
     final CompileFS compileFS = new CompileFS();
     compileFS.run();
-
   }
 }
diff --git a/src/text/email.txt b/src/text/email.txt
index c6bf912..907e2c8 100644
--- a/src/text/email.txt
+++ b/src/text/email.txt
@@ -1,6 +1,13 @@
 [VOTE] Release Apache Hadoop ${hadoop.version}
+Smoke test release: Apache Hadoop ${hadoop.version}
 
-I have put together a release candidate (${rc}) for Hadoop ${hadoop.version}
+Mukund and I have put together a release candidate (${rc}) for Hadoop ${hadoop.version}.
+
+This isn't quite ready for a vote as we know there are a couple of fixes to go in
+(one for abfs, one for hadoop-hdfs-nfs).
+
+We'd like anyone who can to verify the tarballs, and especially
+anyone who can try the arm64 binaries, as we want to include them too.
 
 The RC is available at:
 https://dist.apache.org/repos/dist/dev/hadoop/${rc-dirname}/
@@ -13,15 +20,26 @@ ${nexus.staging.url}
 You can find my public key at:
 https://dist.apache.org/repos/dist/release/hadoop/common/KEYS
 
+Mukund doesn't have one as gpg refuses to work on his laptop.
+
 Change log
 https://dist.apache.org/repos/dist/dev/hadoop/${rc-dirname}/CHANGELOG.md
 
 Release notes
 https://dist.apache.org/repos/dist/dev/hadoop/${rc-dirname}/RELEASENOTES.md
 
-There's a very small number of changes, primarily critical code/packaging
-issues and security fixes.
+This is off branch-3.3 and is the first big release since 3.3.2.
 
 See the release notes for details.
 
+Key changes
+
+* Big update of dependencies to try and keep those reports of
+  transitive CVEs under control.
+* Vectored IO API for all FSDataInputStream implementations, with
+  high-performance versions for the file:// and s3a:// filesystems:
+  file:// through java native IO,
+  s3a:// through parallel GET requests.
+
 Please try the release and vote. The vote will run for 5 days.

