Shardul Mahadik created SPARK-34472:
---------------------------------------

             Summary: SparkContext.addJar with an ivy path fails in cluster mode with a custom ivySettings file
                 Key: SPARK-34472
                 URL: https://issues.apache.org/jira/browse/SPARK-34472
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.0
            Reporter: Shardul Mahadik


SPARK-33084 introduced support for Ivy paths in {{sc.addJar}} and Spark SQL {{ADD JAR}}. If a custom ivySettings file is specified via {{spark.jars.ivySettings}}, it is loaded at [https://github.com/apache/spark/blob/b26e7b510bbaee63c4095ab47e75ff2a70e377d7/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L1280]. However, that file is only accessible on the client machine. In cluster mode, the file is not available on the driver, so {{addJar}} fails.
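
For example, an Ivy-path {{addJar}} call like the following (the coordinates are illustrative, not from the original report) triggers Ivy resolution on the driver, which in turn tries to load the settings file from the local path given by {{spark.jars.ivySettings}}:

{code:scala}
// Illustrative only: any ivy:// URI passed to addJar triggers Ivy
// resolution on the driver, which loads the configured ivySettings file.
sc.addJar("ivy://org.apache.commons:commons-lang3:3.12.0?transitive=false")

// Equivalent Spark SQL form:
spark.sql("ADD JAR ivy://org.apache.commons:commons-lang3:3.12.0?transitive=false")
{code}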

{code:sh}
spark-submit --master yarn --deploy-mode cluster \
  --class IvyAddJarExample \
  --conf spark.jars.ivySettings=/path/to/ivySettings.xml \
  example.jar
{code}

{code}
java.lang.IllegalArgumentException: requirement failed: Ivy settings file /path/to/ivySettings.xml does not exist
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.deploy.SparkSubmitUtils$.loadIvySettings(SparkSubmit.scala:1331)
        at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:176)
        at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:156)
        at org.apache.spark.sql.internal.SessionResourceLoader.resolveJars(SessionState.scala:166)
        at org.apache.spark.sql.hive.HiveSessionResourceLoader.addJar(HiveSessionStateBuilder.scala:133)
        at org.apache.spark.sql.execution.command.AddJarCommand.run(resources.scala:40)
{code}

We should ship the ivySettings file to the driver so that {{addJar}} is able to 
find it.
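
Until that is fixed, a possible workaround in YARN cluster mode (an untested sketch, assuming a file localized via {{--files}} lands in the driver container's working directory) is to ship the settings file explicitly and point the config at the localized copy:

{code:sh}
# Untested sketch: --files localizes ivySettings.xml into the driver
# container's working directory in cluster mode, so the config can
# reference it by its bare file name.
spark-submit --master yarn --deploy-mode cluster \
  --files /path/to/ivySettings.xml \
  --conf spark.jars.ivySettings=ivySettings.xml \
  --class IvyAddJarExample \
  example.jar
{code}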


