Question: Spark uses log4j 1.2.17. If my application jar contains log4j 2.x and is submitted to the Spark cluster, which version of log4j is actually used during the Spark session?

________________________________
From: Sean Owen <sro...@gmail.com>
Sent: Monday, December 13, 2021 8:25 AM
To: Jörn Franke <jornfra...@gmail.com>
Cc: Pralabh Kumar <pralabhku...@gmail.com>; dev <d...@spark.apache.org>; user.spark <user@spark.apache.org>
Subject: Re: Log4j 1.2.17 spark CVE
This has come up several times over the years - search JIRA. The very short summary is: Spark does not use log4j 1.x, but its dependencies do, and that's the issue. Anyone who can successfully complete the surgery at this point is welcome to, but I failed ~2 years ago.

On Mon, Dec 13, 2021 at 10:02 AM Jörn Franke <jornfra...@gmail.com> wrote:

> Is it in any case appropriate to use log4j 1.x, which is not maintained anymore and has other security vulnerabilities that won't be fixed?
>
> On 13.12.2021 at 06:06, Sean Owen <sro...@gmail.com> wrote:
>
>> Check the CVE - the log4j vulnerability appears to affect log4j 2, not 1.x. There was mention that it could affect 1.x when used with JNDI or SMS handlers, but Spark does neither. (Unless anyone can think of something I'm missing, but I have never heard or seen that come up at all in 7 years in Spark.) The big issue would be applications that themselves configure log4j 2.x, but that's not a Spark issue per se.
>>
>> On Sun, Dec 12, 2021 at 10:46 PM Pralabh Kumar <pralabhku...@gmail.com> wrote:
>>
>>> Hi developers, users,
>>>
>>> Spark is built using log4j 1.2.17. Is there a plan to upgrade based on the recent CVE detected?
>>>
>>> Regards,
>>> Pralabh Kumar
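[Editor's note] One way to answer the question at the top of the thread empirically: ask the JVM which jar a given logging class was actually loaded from. The sketch below is a minimal standalone check (the class name `Log4jCheck` is a made-up example, not anything from Spark); run it inside the application, e.g. from the driver's main method, and it reports whether each log4j lineage is on the effective classpath and where it came from.

```java
import java.security.CodeSource;

public class Log4jCheck {
    /** Returns where a class was loaded from, or a note if it is absent. */
    static String sourceOf(String className) {
        try {
            CodeSource src = Class.forName(className)
                    .getProtectionDomain().getCodeSource();
            // Bootstrap/platform classes report a null CodeSource.
            return src == null ? "bootstrap/platform classloader"
                               : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // log4j 1.x core class vs. log4j 2.x API entry point:
        // whichever resolves shows which lineage is visible, and from which jar.
        System.out.println("log4j 1.x: " + sourceOf("org.apache.log4j.Logger"));
        System.out.println("log4j 2.x: " + sourceOf("org.apache.logging.log4j.LogManager"));
    }
}
```

Note that both lineages can be present at once (Spark's on the cluster classpath, the application's in its jar); the printed locations make that visible.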
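[Editor's note] On the classpath-precedence question itself: by default Spark's own classpath wins over classes in the application jar, but Spark documents the (experimental) `spark.driver.userClassPathFirst` and `spark.executor.userClassPathFirst` settings to invert that. A hedged sketch of a submit command, where `my-app.jar` and `com.example.Main` are placeholders:

```shell
# Illustrative only: jar and main class are placeholders.
# With these flags Spark prefers classes bundled in the application jar,
# so a log4j 2.x shaded into my-app.jar would shadow the cluster's copy.
spark-submit \
  --class com.example.Main \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app.jar
```

These flags are marked experimental in the Spark docs and can cause linkage errors when user and Spark classes mix, so they are a workaround rather than a fix for the underlying dependency question discussed in this thread.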