Hi,

I have created a fat/shaded library jar to use in Spark via
SparkExtensions. It is used by setting the spark.sql.extensions conf to a
class in my jar that extends `SparkSessionExtensionsProvider`. The purpose
of this extension jar is to inject my custom UDF functions (see:
https://github.com/apache/spark/pull/22576).
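For reference, the wiring looks roughly like this — a minimal sketch, not my actual code; the class and function names (MyUdfExtensions, my_udf, MyUdfExpression) are placeholders:

```scala
import org.apache.spark.sql.{SparkSessionExtensions, SparkSessionExtensionsProvider}
import org.apache.spark.sql.catalyst.FunctionIdentifier
import org.apache.spark.sql.catalyst.expressions.{Expression, ExpressionInfo}

// Hypothetical provider; the real one lives in my fat jar.
class MyUdfExtensions extends SparkSessionExtensionsProvider {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    extensions.injectFunction(
      (FunctionIdentifier("my_udf"),
       new ExpressionInfo("com.example.MyUdfExpression", "my_udf"),
       // MyUdfExpression stands in for my real Expression subclass
       (children: Seq[Expression]) => new MyUdfExpression(children)))
  }
}
```

and it is enabled with `--conf spark.sql.extensions=com.example.MyUdfExtensions` on spark-submit.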

The problem I am having is that my UDF Expression class extends Spark's
Logging trait (org.apache.spark.internal.Logging). The jar compiles and the
extension loads fine, but when I try to use my UDF in Spark, I get the
following error:

java.lang.AbstractMethodError: Method com/company/bdp/expressions/NfHMAC.org$apache$spark$internal$Logging$$log__$eq(Lorg/slf4j/Logger;)V is abstract
  at com.company.bdp.expressions.NfHMAC.org$apache$spark$internal$Logging$$log__$eq(EncryptionExpressions.scala)
  at org.apache.spark.internal.Logging.$init$(Logging.scala:43)
  at com.company.bdp.expressions.NfHMAC.<init>(EncryptionExpressions.scala:47)
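The expression class follows the usual pattern, something like the simplified stand-in below (everything except the Logging trait is a placeholder for the real NfHMAC in EncryptionExpressions.scala):

```scala
import org.apache.spark.internal.Logging
import org.apache.spark.sql.catalyst.expressions.{Expression, UnaryExpression}
import org.apache.spark.sql.types.DataType

// Simplified stand-in for NfHMAC: mixing Spark's internal Logging trait
// into the expression is what triggers the AbstractMethodError above,
// which is thrown during construction (Logging.$init$).
case class MyHmacExpression(child: Expression)
    extends UnaryExpression with Logging {

  override def dataType: DataType = child.dataType

  override protected def nullSafeEval(input: Any): Any = {
    logDebug(s"evaluating on $input") // uses the mixed-in Logging trait
    input // the real implementation computes an HMAC here
  }

  override protected def withNewChildInternal(newChild: Expression) =
    copy(child = newChild)
}
```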



The problem seems to be that no logging framework is bound to slf4j when
classes in my extension try to log, even though log4j is included on
Spark's own classpath. I have also tried excluding all logging dependencies
from my fat jar to make sure there is no conflict when Spark loads my
extension. If I instead bundle a logging framework in the fat jar (e.g.
logback-classic), logging works fine when I run my UDF. I am not exactly
sure how SparkExtensions are loaded, so I might be missing something. Is it
expected that a library loaded via SparkExtensions needs to include its own
logging framework?
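In case it matters, this is roughly how I excluded logging from the fat jar — a sketch of my sbt-assembly setup, with a made-up Spark version; my actual build differs in details:

```scala
// build.sbt (sbt-assembly): keep slf4j/log4j jars out of the shaded jar
// so whatever binding is on Spark's own classpath is used instead.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0" % Provided

assembly / assemblyExcludedJars := {
  val cp = (assembly / fullClasspath).value
  cp.filter { jar =>
    val name = jar.data.getName
    name.startsWith("slf4j-") || name.startsWith("log4j-")
  }
}
```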


Thanks,

Maytas
