[ https://issues.apache.org/jira/browse/SPARK-52585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18009934#comment-18009934 ]

rami commented on SPARK-52585:
------------------------------

I observe the same behavior when running batch jobs on Kubernetes, but, 
strangely enough, only on executor pods, not on the driver pod.

I can confirm that this differs from Spark 3.5.6, and I can provide more 
context if needed.
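
For anyone trying to reproduce this, a minimal sketch of the setup described 
in the report below might look as follows. The bucket path and app name are 
placeholders, and this assumes PySpark 4.0.0 is installed and S3 credentials 
are available in the environment:

```python
# Hypothetical minimal reproduction; bucket/path are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("s3a-slf4j-warning-repro")
    # Pulls in hadoop-aws and its transitive dependencies, one of which
    # is suspected of bundling an older SLF4J.
    .config("spark.jars.packages",
            "org.apache.spark:spark-hadoop-cloud_2.13:4.0.0")
    .getOrCreate()
)

# The SLF4J StaticLoggerBinder warning reportedly appears on stderr
# the first time an S3 read is attempted.
df = spark.read.parquet("s3a://your-bucket/some/path/")
```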

> Failed to load class "org.slf4j.impl.StaticLoggerBinder" when using hadoop-aws
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-52585
>                 URL: https://issues.apache.org/jira/browse/SPARK-52585
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 4.0.0
>         Environment: I am using PySpark 4.0.0 and Python 3.11 on RHEL, with 
> JDK 21. I set the spark.jars.packages configuration to include 
> {{org.apache.spark:spark-hadoop-cloud_2.13:4.0.0}}
>            Reporter: Laurens
>            Priority: Minor
>
> I am a PySpark user. Ever since upgrading to PySpark 4.0.0, the first time I 
> try to read from an S3 bucket via {{spark.read.parquet("s3a://")}} I receive 
> the following warning:
>  
> {noformat}
> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further 
> details.{noformat}
> This did not happen in version 3.5 or earlier.
> I understand this warning has no impact on data operations, since it only 
> affects logging. However, it is still a nuisance that may warrant a look.
> I am not familiar with Java at all, but I tried some light debugging. I set 
> the flag {{SPARK_PRINT_LAUNCH_COMMAND}} and noticed that slf4j-api-2.0.16.jar 
> is inserted into the class path. However, the link in the warning clearly 
> states that this message should only occur for SLF4J versions 1.7.x and 
> earlier.
> To enable reading from S3, I set the spark.jars.packages configuration to 
> include {{org.apache.spark:spark-hadoop-cloud_2.13:4.0.0}}. My suspicion is 
> that one of its transitive dependencies embeds an older version of SLF4J.
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
