Hello,

Thank you for setting us on the right path.
We will use the 3.3.5 branch (branch-3.3.5) in our scanning.

We are now familiarizing ourselves with the codebase and the encountered CVEs.

Michiel de Jong


 * Winner of Dutch Innovation award within Law Enforcement
 * Active in 30+ countries

Michiel de Jong
Software Engineer

site: web-iq.com <https://web-iq.com>
PGP:  5E01 D729 326D F933 4A20 C8CF 7D09 6113 7CFD 29DA


On 14-03-2023 23:03, Steve Loughran wrote:

hello.

welcome to the hadoop CVE support team!

all this stuff happens on apache JIRA; the search term is
project in (HADOOP, YARN, HDFS, MAPREDUCE) AND text ~ cve ORDER BY created DESC

And we are cutting the 3.3.5 RC3 today; I just need to do the preflight checks before sending the emails.
in the hadoop github repo, branch-3.3.5 is the one this is built off

please use that for your audits, not 3.3.4, and when the RC goes up, do as much regression testing as you can

On Tue, 14 Mar 2023 at 08:27, Michiel de Jong <michieldej...@web-iq.nl> wrote:

    Hello Hadoop Developers,

    When running a dependency CVE scan on our project we noticed a
    list of dependencies in hadoop-common that have CVEs. There are
    also several CVEs listed on
    https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common/3.3.4.
    Many of these CVEs would probably not affect end users, however
    this is often difficult for the end users themselves to determine.
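
    (Aside: one common way to wire such a scan into a Maven build is the
    OWASP dependency-check plugin; with it configured, the scan runs via
    mvn dependency-check:check. A minimal sketch follows; the plugin
    version and the CVSS threshold are placeholders, not values taken
    from this thread.)

        <!-- pom.xml sketch: dependency CVE scan via OWASP dependency-check -->
        <build>
          <plugins>
            <plugin>
              <groupId>org.owasp</groupId>
              <artifactId>dependency-check-maven</artifactId>
              <version>8.2.1</version> <!-- placeholder; use a current release -->
              <configuration>
                <!-- fail the build on CVEs at or above this CVSS score -->
                <failBuildOnCVSS>7</failBuildOnCVSS>
              </configuration>
              <executions>
                <execution>
                  <goals>
                    <goal>check</goal>
                  </goals>
                </execution>
              </executions>
            </plugin>
          </plugins>
        </build>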

    Is there a procedure in place for handling reported CVEs? Is there
    a place where the CVEs that do not impact end users are documented?

    We would like to work on reducing the number of CVEs encountered
    in dependencies and document the CVEs that are not easily resolved
    and don't impact the end users.
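
    (For the "document the CVEs that don't impact end users" part,
    scanners usually support a suppression file, which can double as
    that documentation. A hedged sketch in dependency-check's
    suppression format is below; the schema URL, artifact coordinates
    and CVE id are illustrative placeholders, not taken from this
    thread.)

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- suppressions.xml sketch: record of CVEs judged not to affect end users -->
        <suppressions xmlns="https://jeremylong.github.io/DependencyCheck/dependency-suppression.1.3.xsd">
          <suppress>
            <notes>
              Example rationale: the vulnerable code path is not reachable from
              this project; reviewed 2023-03, revisit on the next upgrade.
            </notes>
            <!-- match the flagged artifact by Maven package URL (regex) -->
            <packageUrl regex="true">^pkg:maven/org\.example/flagged\-library@.*$</packageUrl>
            <cve>CVE-0000-00000</cve> <!-- placeholder: substitute the real CVE id -->
          </suppress>
        </suppressions>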


Where we are behind is the javascript stuff - that's the YARN project; I think it is undermaintained.

We also have to deal with the challenge of compatibility, especially a few applications away. For example, this pair of commits reflects how an upgrade broke hive/tez downstream:

HADOOP-18178. Upgrade jackson to 2.13.2 and jackson-databind to 2.13.2.2
HADOOP-18332. Remove rs-api dependency by downgrading jackson to 2.12.7
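
(A hedged aside: a downstream build hit by this kind of clash can usually pin the
version itself via dependencyManagement instead of waiting on a Hadoop release.
The sketch below uses the 2.12.7 line named in the HADOOP-18332 title above; the
surrounding pom and the exact patch version are hypothetical.)

    <!-- downstream pom.xml sketch: force one jackson-databind version across the tree -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>com.fasterxml.jackson.core</groupId>
          <artifactId>jackson-databind</artifactId>
          <version>2.12.7</version> <!-- line named in HADOOP-18332; pick the patch your stack needs -->
        </dependency>
      </dependencies>
    </dependencyManagement>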

There's also the problem that libraries which generate classes (avro, parquet) are brittle to library updates: if we update them, then applications whose classes were generated with the older lib just won't link any more. There we are resorting to the hadoop shaded jars and trying to cut the originals, though that adds more homework: keeping the shaded artifacts current, with the extra release overhead that brings.
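
(For reference, the shaded route mentioned above generally means depending on the
hadoop-client-api and hadoop-client-runtime artifacts rather than the unshaded
modules; a minimal sketch, with the version property and scopes as illustrative
choices.)

    <!-- downstream pom.xml sketch: Hadoop's shaded client artifacts -->
    <dependencies>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client-api</artifactId>
        <version>${hadoop.version}</version>  <!-- e.g. 3.3.5 once released -->
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client-runtime</artifactId>
        <version>${hadoop.version}</version>
        <scope>runtime</scope>  <!-- shaded third-party deps stay off the compile classpath -->
      </dependency>
    </dependencies>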

hadoop 3.3.5 isn't doing this; I'd be happy for someone to take up and complete these two PRs:

HADOOP-18487. protobuf 2.5.0 marked as provided.
https://github.com/apache/hadoop/pull/4996

HADOOP-18197. Upgrade protobuf to 3.21.7
https://github.com/apache/hadoop-thirdparty/pull/19
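
(For what HADOOP-18487 is aiming at: "provided" scope roughly means the artifact is
compiled against but not shipped transitively, so downstream users who still need
protobuf 2.5.0 would have to declare it themselves. The snippet below is only the
general shape of such a declaration, not the actual diff in the PR.)

    <!-- pom.xml sketch: compile against protobuf 2.5.0 without exporting it downstream -->
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
      <scope>provided</scope> <!-- consumers must add it explicitly if they still use it -->
    </dependency>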

This and any other CVE work can target the next release.

I am not going to hold back the 3.3.5 release for any more CVEs: if we do that, the following week's CVEs just become blockers instead. Critical HDFS issues are the last bits of trouble.

steve




Attachment: OpenPGP_0x7D0961137CFD29DA_and_old_rev.asc
Description: OpenPGP public key

Attachment: OpenPGP_signature
Description: OpenPGP digital signature
