[ https://issues.apache.org/jira/browse/SPARK-12666?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-12666:
-------------------------------
    Description: 
Symptom:

I cloned the latest master of {{spark-redshift}}, then used
{{sbt publishLocal}} to publish it to my Ivy cache. When I tried running
{{./bin/spark-shell --packages com.databricks:spark-redshift_2.10:0.5.3-SNAPSHOT}}
to load this dependency into {{spark-shell}}, I received the following
cryptic error:

{code}
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-redshift_2.10;0.5.3-SNAPSHOT: configuration not found in com.databricks#spark-redshift_2.10;0.5.3-SNAPSHOT: 'default'. It was required from org.apache.spark#spark-submit-parent;1.0 default]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1009)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}
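
For context, the {{ivy.xml}} that {{sbt publishLocal}} writes into
{{~/.ivy2/local}} declares sbt's own configurations and, at least in the sbt
versions I've looked at, nothing named {{default}}. An abbreviated, roughly
representative descriptor (exact contents vary by sbt version):

{code}
<ivy-module version="2.0">
  <info organisation="com.databricks" module="spark-redshift_2.10"
        revision="0.5.3-SNAPSHOT" status="integration"/>
  <!-- sbt declares its own configurations; note there is no "default" -->
  <configurations>
    <conf name="compile" visibility="public"/>
    <conf name="runtime" visibility="public" extends="compile"/>
    <conf name="test" visibility="public" extends="runtime"/>
    <conf name="provided" visibility="public"/>
    <conf name="optional" visibility="public"/>
  </configurations>
  <publications>
    <artifact name="spark-redshift_2.10" type="jar" ext="jar" conf="compile"/>
  </publications>
</ivy-module>
{code}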

I think the problem here is that Spark declares its dependency on the
spark-redshift artifact using the {{default}} Ivy configuration. Based on my
admittedly limited understanding of Ivy, a module only gets a {{default}}
configuration when it defines no configurations of its own. For Maven
artifacts this works out: POMs don't declare Ivy configurations, so Ivy
implicitly gives them a {{default}} configuration that maps to Maven's
regular JAR dependency. For Ivy artifacts, though, we can run into trouble
when loading modules which explicitly define their own configurations, since
those modules might not define a configuration named {{default}}.
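
Concretely, the resolution that {{resolveMavenCoordinates}} sets up boils
down to something like the following simplified Scala sketch (paraphrased
from SparkSubmit.scala, not a verbatim copy):

{code}
import org.apache.ivy.core.module.descriptor.{DefaultDependencyDescriptor, DefaultModuleDescriptor}
import org.apache.ivy.core.module.id.ModuleRevisionId

// The synthetic parent module that spark-submit resolves against.
val md = DefaultModuleDescriptor.newDefaultInstance(
  ModuleRevisionId.newInstance("org.apache.spark", "spark-submit-parent", "1.0"))

val ivyConfName = "default"
val ri = ModuleRevisionId.newInstance(
  "com.databricks", "spark-redshift_2.10", "0.5.3-SNAPSHOT")
val dd = new DefaultDependencyDescriptor(
  md, ri, /* force = */ false, /* changing = */ false, /* transitive = */ true)
// The problematic mapping: it requires a conf named "default" in the
// dependency, which an sbt-published ivy.xml doesn't define.
dd.addDependencyConfiguration(ivyConfName, ivyConfName)
md.addDependency(dd)
{code}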

I spent a bit of time playing around with the SparkSubmit code to see if I 
could fix this but wasn't able to completely resolve the issue.
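
For anyone picking this up: one direction that might be worth trying (an
untested sketch on my part, not a verified fix) is Ivy's fallback
conf-mapping syntax, where a right-hand side of {{default(runtime)}} means
"use the dependency's {{default}} conf, falling back to {{runtime}} when
{{default}} doesn't exist":

{code}
// Untested idea: fall back to "runtime" for modules (e.g. sbt-published
// ones) that don't define a "default" configuration.
dd.addDependencyConfiguration(ivyConfName, ivyConfName + "(runtime)")
{code}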

/cc [~brkyvz] (ping me offline and I can walk you through the repo in person, 
if you'd like)



> spark-shell --packages cannot load artifacts which are publishLocal'd by SBT
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-12666
>                 URL: https://issues.apache.org/jira/browse/SPARK-12666
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.5.1, 1.6.0
>            Reporter: Josh Rosen
>


