Venkata Sai Akhil Gudesa created SPARK-44293:
------------------------------------------------

             Summary: Task failures during custom JAR fetch in executors
                 Key: SPARK-44293
                 URL: https://issues.apache.org/jira/browse/SPARK-44293
             Project: Spark
          Issue Type: Bug
          Components: Connect
    Affects Versions: 3.5.0
            Reporter: Venkata Sai Akhil Gudesa


When attempting to use a custom JAR in a Spark Connect session, the tasks fail 
due to the following error:
{code:java}
23/07/03 17:00:15 INFO Executor: Fetching spark://ip-10-110-22-170.us-west-2.compute.internal:43743/artifacts/d9548b02-ff3b-4278-ab52-aef5d1fc724e//home/venkata.gudesa/spark/artifacts/spark-d6141194-c487-40fd-ba40-444d922808ea/d9548b02-ff3b-4278-ab52-aef5d1fc724e/jars/TestHelloV2.jar with timestamp 0
23/07/03 17:00:15 ERROR Executor: Exception in task 6.0 in stage 4.0 (TID 55)
java.lang.RuntimeException: Stream '/artifacts/d9548b02-ff3b-4278-ab52-aef5d1fc724e//home/venkata.gudesa/spark/artifacts/spark-d6141194-c487-40fd-ba40-444d922808ea/d9548b02-ff3b-4278-ab52-aef5d1fc724e/jars/TestHelloV2.jar' was not found.
    at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:260)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:142)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
 {code}

*Root Cause: The URI for the JAR file is invalid.* Instead of the URI being in the form {_}/artifacts/<sessionUUID>/jars/<JAR>{_}, it is instead {_}/artifacts/<sessionUUID>/<absolutePath>{_}, so the executor requests a stream name the server never registered and the fetch fails with the "Stream ... was not found" error above.
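
For illustration only, a minimal sketch (not the actual Spark Connect artifact-manager code) of the difference between the observed and the expected fetch URI. {{sessionUUID}}, {{sessionArtifactDir}} and {{localJarPath}} are hypothetical names, with their values taken from the log above:
{code:scala}
import java.nio.file.{Path, Paths}

object FetchUriSketch extends App {
  // Hypothetical names for illustration only; values are copied from the log above.
  val sessionUUID = "d9548b02-ff3b-4278-ab52-aef5d1fc724e"
  val sessionArtifactDir: Path = Paths.get(
    "/home/venkata.gudesa/spark/artifacts/spark-d6141194-c487-40fd-ba40-444d922808ea",
    sessionUUID)
  val localJarPath: Path = sessionArtifactDir.resolve("jars/TestHelloV2.jar")

  // Observed (buggy) shape: the absolute local path is appended verbatim,
  // which yields the malformed ".../<sessionUUID>//home/..." stream name.
  val badUri = s"/artifacts/$sessionUUID/$localJarPath"

  // Expected shape: only the path relative to the session's artifact root
  // is appended, i.e. "jars/TestHelloV2.jar".
  val goodUri = s"/artifacts/$sessionUUID/${sessionArtifactDir.relativize(localJarPath)}"

  println(badUri)   // /artifacts/d9548b02-...//home/venkata.gudesa/.../jars/TestHelloV2.jar
  println(goodUri)  // /artifacts/d9548b02-.../jars/TestHelloV2.jar
}
{code}
If this reading of the root cause is correct, the fix would be to build the fetch URI from the session-relative artifact path rather than the absolute filesystem path.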



