Gobi2511 opened a new issue, #468:
URL: https://github.com/apache/incubator-livy/issues/468
I am unable to connect to Spark from Livy. When I make a request to create a
session, the session becomes dead with the following error.
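For context, a session-creation request against Livy's REST API looks roughly like this; a minimal sketch, assuming the default endpoint `http://localhost:8998` (host, port, and session kind are placeholders for this setup):

```python
import json
from urllib.request import Request

# Assumed default Livy endpoint; adjust host/port to your deployment.
LIVY_URL = "http://localhost:8998"

# Payload for POST /sessions; "pyspark" asks for a Python interpreter session.
payload = {"kind": "pyspark"}

req = Request(
    f"{LIVY_URL}/sessions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urlopen(req) would return the new session object; its "state" field moves
# from "starting" to "idle" on success, or to "dead" on a failure like the
# one reported below.
```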
**Versions:**
- OS: macOS Sonoma 14.6.1
- Python: 3.11.9
- Spark: 3.5.4
- Scala: 2.12
- Livy: 0.8.0
**Error:**
```
stderr:
25/02/11 20:01:19 WARN Utils: Your hostname, MacBook-Pro-7.local resolves to a loopback address: 127.0.0.1; using 192.168.0.100 instead (on interface en0)
25/02/11 20:01:19 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
25/02/11 20:01:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/02/11 20:01:19 INFO RSCDriver: Connecting to: 192.168.0.100:10000
25/02/11 20:01:19 INFO RSCDriver: Starting RPC server...
25/02/11 20:01:20 INFO RpcServer: Connected to the port 10001
25/02/11 20:01:20 WARN ClientConf: Your hostname, MacBook-Pro-7.local, resolves to a loopback address; using 192.168.0.100 instead (on interface en0)
25/02/11 20:01:20 WARN ClientConf: Set 'livy.rsc.rpc.server.address' if you need to bind to another address.
Exception in thread "main" java.util.concurrent.ExecutionException: javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
    at io.netty.util.concurrent.DefaultPromise.get(DefaultPromise.java:351)
    at org.apache.livy.rsc.driver.RSCDriver.initializeServer(RSCDriver.java:203)
    at org.apache.livy.rsc.driver.RSCDriver.run(RSCDriver.java:336)
    at org.apache.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:93)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:569)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1034)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: javax.security.sasl.SaslException: Client closed before SASL negotiation finished.
    at org.apache.livy.rsc.rpc.Rpc$SaslClientHandler.dispose(Rpc.java:592)
    at org.apache.livy.rsc.rpc.SaslHandler.channelInactive(SaslHandler.java:92)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:305)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
    at io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:411)
    at io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:376)
    at io.netty.handler.codec.ByteToMessageCodec.channelInactive(ByteToMessageCodec.java:118)
    at org.apache.livy.rsc.rpc.KryoMessageCodec.channelInactive(KryoMessageCodec.java:97)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:303)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:274)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1405)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:301)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:281)
    at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:901)
    at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:813)
    at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.base/java.lang.Thread.run(Thread.java:840)
25/02/11 20:01:20 INFO ShutdownHookManager: Shutdown hook called
25/02/11 20:01:20 INFO ShutdownHookManager: Deleting directory /private/var/folders/xc/s8y7cw_d4sld08rn88b4_7480000gp/T/spark-7f7a4d73-f5c0-45e1-9f44-bbfbceecaa1e
```
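The two WARN lines in the log point at the usual knobs for this situation: pinning the bind address explicitly instead of letting Spark and the Livy RSC guess from the hostname. A minimal sketch of that configuration, assuming the address values are placeholders to be replaced with the address appropriate for this machine:

```shell
# Force Spark to bind to a specific address, per the Utils warning.
# 127.0.0.1 is a placeholder; substitute the address you actually want.
export SPARK_LOCAL_IP=127.0.0.1

# Per the ClientConf warning, the RSC RPC bind address can be pinned too.
# This line would go in the Livy client configuration, not the shell:
#   livy.rsc.rpc.server.address = 127.0.0.1
```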