log4j 1.2.17 is not vulnerable to this issue. There is an existing CVE
against it from a log aggregation servlet; Cloudera products ship a patched
release with that servlet stripped out...ASF projects are not allowed to do
that.

But: some recent Cloudera products do include log4j 2.x, so colleagues of
mine are busy patching and retesting everything. If anyone replaces the
vulnerable jars themselves, remember to look in spark.tar.gz on HDFS to
make sure it is safe.
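To make that check concrete, here is a minimal sketch (not from the original mail) of scanning every jar inside a downloaded tarball for the JndiLookup class, which is the marker for the vulnerable JNDI lookup of CVE-2021-44228. The path is hypothetical; fetch the archive locally first with something like `hdfs dfs -get`.

```python
import io
import tarfile
import zipfile

# Presence of this class entry marks a jar as carrying the vulnerable
# JNDI lookup (CVE-2021-44228).
VULN_ENTRY = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

def scan_tarball(path):
    """Return the names of jars inside a .tar.gz that contain JndiLookup.class."""
    hits = []
    with tarfile.open(path, "r:gz") as tar:
        for member in tar.getmembers():
            if not member.name.endswith(".jar"):
                continue
            data = tar.extractfile(member).read()
            with zipfile.ZipFile(io.BytesIO(data)) as jar:
                if VULN_ENTRY in jar.namelist():
                    hits.append(member.name)
    return hits
```

Usage would be along the lines of `hdfs dfs -get /path/to/spark.tar.gz .` followed by `scan_tarball("spark.tar.gz")`; an empty result means no jar in the archive ships the lookup class.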


Hadoop stayed on log4j 1.2.17 because 2.x
* would have broken all the cluster management tools which configure
log4j.properties files
* wouldn't let us use system properties to configure logging...that is
really useful when you want to run a job with debug logging
* didn't support the log capture we use in mockito and functional tests
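For context on the system-property point above: log4j 1.x expands JVM system properties inside log4j.properties, and Hadoop's stock file wires the root logger through one such property. This is a sketch from memory, not the exact shipped file:

```properties
# default; can be overridden per run with -Dhadoop.root.logger=DEBUG,console
hadoop.root.logger=INFO,console
log4j.rootLogger=${hadoop.root.logger}
```

That is why a single `-Dhadoop.root.logger=DEBUG,console` (or the `HADOOP_ROOT_LOGGER` environment variable) is enough to run one job with debug logging without editing any config files.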

But: SLF4J is used throughout; spark doesn't need to be held back by that
choice and can use any backend it wants

I don't know what we will do now; Akira has just suggested logback
https://issues.apache.org/jira/browse/HADOOP-12956

Had I not just broken a collarbone, leaving me unable to code, I would have
added a new command to audit the hadoop classpath to verify it wasn't
vulnerable. Someone could do the same for spark -where you would want an
RDD whose probe would also run in worker tasks to validate the cluster's
safety more broadly, including the tarball.
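A sketch of what such a probe's core could look like, under my own assumptions rather than anything from the original mail: a pure function that checks a list of jars for the vulnerable class, which you could then ship to executors.

```python
import zipfile

# Marker entry for CVE-2021-44228: a jar containing this class carries
# the vulnerable JNDI lookup.
VULN_ENTRY = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

def audit_classpath(jar_paths):
    """Return the subset of jar paths that contain the vulnerable class."""
    hits = []
    for path in jar_paths:
        try:
            with zipfile.ZipFile(path) as jar:
                if VULN_ENTRY in jar.namelist():
                    hits.append(path)
        except (OSError, zipfile.BadZipFile):
            pass  # unreadable, missing, or not a jar; skip it
    return hits
```

On a cluster you might run this inside tasks, something like `sc.parallelize(range(n), n).map(lambda _: (socket.gethostname(), audit_classpath(glob.glob("/opt/spark/jars/*.jar")))).collect()`, so every worker reports on its own classpath; the paths and partition count there are hypothetical.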

Meanwhile, if your product is not exposed, it's probably worth mentioning
that on the users mailing list so as to help people focus their attention.
It's probably best to work with everyone who produces spark-based products
so that you can have a single summary.

On Tue, 14 Dec 2021 at 01:31, Qian Sun <qian.sun2...@gmail.com> wrote:

> My understanding is that we don’t need to do anything. Log4j2-core not
> used in spark.
>
> > On 13 Dec 2021, at 12:45 PM, Pralabh Kumar <pralabhku...@gmail.com> wrote:
> >
> > Hi developers,  users
> >
> > Spark is built using log4j 1.2.17. Is there a plan to upgrade based on
> the recent CVE detected?
> >
> >
> > Regards
> > Pralabh kumar
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
