[ https://issues.apache.org/jira/browse/SPARK-37751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shipeng Feng updated SPARK-37751:
---------------------------------
    Priority: Blocker  (was: Major)

> Apache Commons Crypto doesn't support Java 11
> ---------------------------------------------
>
>                 Key: SPARK-37751
>                 URL: https://issues.apache.org/jira/browse/SPARK-37751
>             Project: Spark
>          Issue Type: Bug
>          Components: Security
>    Affects Versions: 3.1.2, 3.2.0
>         Environment: Spark 3.2.0 on Kubernetes
>            Reporter: Shipeng Feng
>            Priority: Blocker
>
> On Kubernetes we run Java 11 in the Docker image
> ([https://github.com/apache/spark/blob/v3.2.0/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile]):
> {code:java}
> ARG java_image_tag=11-jre-slim
> {code}
> We have a simple app:
> {code:scala}
> import org.apache.spark.sql.SparkSession
>
> object SimpleApp {
>   def main(args: Array[String]): Unit = {
>     val session = SparkSession.builder.getOrCreate()
>
>     // the size of demo.csv is 5 GB
>     val rdd = session.read
>       .option("header", "true")
>       .option("inferSchema", "true")
>       .csv("/data/demo.csv")
>       .rdd
>     val lines = rdd.repartition(200)
>     val count = lines.count()
>   }
> }
> {code}
> Enable AES-based encryption for RPC connections with the following configuration:
> {code:java}
> --conf spark.authenticate=true
> --conf spark.network.crypto.enabled=true
> {code}
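> For completeness, the same two keys can also be set programmatically on the SparkConf. This is only a minimal sketch; in practice spark.authenticate must be in place before executors start, so the submit-time --conf flags above are the usual way to enable it:
> {code:scala}
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.SparkSession
>
> // Same settings as the --conf flags above, set on the SparkConf directly
> val conf = new SparkConf()
>   .set("spark.authenticate", "true")
>   .set("spark.network.crypto.enabled", "true")
> val session = SparkSession.builder.config(conf).getOrCreate()
> {code}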
> Running the job with this configuration causes the following error:
> {code:java}
> java.lang.IllegalArgumentException: Frame length should be positive: -6119185687804983867
>     at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
>     at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:150)
>     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>     at org.apache.spark.network.crypto.TransportCipher$DecryptionHandler.channelRead(TransportCipher.java:190)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>     at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
>     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:719)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
>     at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
>     at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>     at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>     at java.base/java.lang.Thread.run(Unknown Source)
> {code}
> The error disappears with the 8-jre-slim image. It seems that Apache Commons Crypto 1.1.0 only works with Java 8:
> [https://commons.apache.org/proper/commons-crypto/download_crypto.cgi]
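> A quick way to check whether Commons Crypto itself is usable on a given JVM, independent of Spark, is a small probe like the one below. This is only a sketch (the CryptoCheck object name is made up, and it assumes commons-crypto 1.1.0 is on the classpath); it instantiates the same AES/CTR/NoPadding transformation that Spark uses for RPC encryption:
> {code:scala}
> import java.util.Properties
> import org.apache.commons.crypto.Crypto
> import org.apache.commons.crypto.cipher.CryptoCipherFactory
>
> object CryptoCheck {
>   def main(args: Array[String]): Unit = {
>     println(s"JVM: ${System.getProperty("java.version")}")
>     // Whether the commons-crypto native (JNI/OpenSSL) code loaded in this JVM
>     println(s"Native code loaded: ${Crypto.isNativeCodeLoaded()}")
>     // Try to create the cipher Spark uses for RPC encryption;
>     // this throws if neither the OpenSSL nor the JCE implementation is usable
>     val cipher = CryptoCipherFactory.getCryptoCipher("AES/CTR/NoPadding", new Properties())
>     println(s"Cipher implementation: ${cipher.getClass.getName}")
>   }
> }
> {code}
> If this falls back to the JCE implementation or fails outright under 11-jre-slim but not under 8-jre-slim, that would point at the Commons Crypto/Java 11 incompatibility rather than at Spark's frame decoding.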