[ https://issues.apache.org/jira/browse/SPARK-5052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14262740#comment-14262740 ]

Elmer Garduno commented on SPARK-5052:
--------------------------------------

As mentioned in the previous comment, the method signatures get fixed by
including the conflicting shaded classes in core/pom.xml and excluding them in
assembly/pom.xml. Below are the signatures for the classes generated on Spark
1.2.0, followed by the signatures after fixing the dependencies; note how
{{or(org.spark-project.guava.common.base.Supplier)}} and
{{transform(org.spark-project.guava.common.base.Function)}} take the shaded
types and therefore have incorrect signatures.

The problem is that if user code attempts to use these methods, a
{{NoSuchMethodError}} is thrown, because the classes available at runtime have
the incorrect signatures:

{code}
javap -classpath ../spark-1.2.0-bin-hadoop1/lib/spark-assembly-1.2.0-hadoop1.0.4.jar com.google.common.base.Optional
Compiled from "Optional.java"
public abstract class com.google.common.base.Optional<T> implements java.io.Serializable {
  ...
  public abstract T or(org.spark-project.guava.common.base.Supplier<? extends T>);
  ...
  public abstract <V extends java/lang/Object> com.google.common.base.Optional<V> transform(org.spark-project.guava.common.base.Function<? super T, V>);
  ...
}
{code}
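
To make the failure concrete, here is a minimal sketch of user code (the class
name {{OptionalTransformExample}} is hypothetical) that compiles against stock
Guava but, when launched against the 1.2.0 assembly, should fail with the
{{NoSuchMethodError}} described above:

{code}
import com.google.common.base.Function;
import com.google.common.base.Optional;

public class OptionalTransformExample {
  public static void main(String[] args) {
    // Compiled against stock Guava, this call site resolves to
    // Optional.transform(Lcom/google/common/base/Function;)Lcom/google/common/base/Optional;
    Optional<Integer> length = Optional.of("spark")
        .transform(new Function<String, Integer>() {
          @Override
          public Integer apply(String s) {
            return s.length();
          }
        });
    // Against the 1.2.0 assembly the call above throws
    // java.lang.NoSuchMethodError before this line is reached.
    System.out.println(length.get());
  }
}
{code}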

After adding the required classes to the pom files, the methods have the 
correct signatures:

{code}
javap -classpath assembly/target/scala-2.10/spark-assembly-1.3.0-SNAPSHOT-hadoop1.2.1.jar com.google.common.base.Optional
Compiled from "Optional.java"
public abstract class com.google.common.base.Optional<T> implements java.io.Serializable {
  ...
  public abstract com.google.common.base.Optional<T> or(com.google.common.base.Optional<? extends T>);
  ...
  public abstract <V extends java/lang/Object> com.google.common.base.Optional<V> transform(com.google.common.base.Function<? super T, V>);
  ...
}
{code}

Now the tricky part: adding only the two affected classes (Function, Supplier)
causes runtime problems, because other shaded classes reference them downstream:

{noformat}
Exception in thread "main" java.lang.VerifyError: Bad return type
Exception Details:
  Location:
    org/spark-project/guava/common/base/Equivalence.onResultOf(Lcom/google/common/base/Function;)Lorg/spark-project/guava/common/base/Equivalence; @9: areturn
  Reason:
    Type 'com/google/common/base/FunctionalEquivalence' (current frame, stack[0]) is not assignable to 'org/spark-project/guava/common/base/Equivalence' (from method signature)
  Current Frame:
    bci: @9
    flags: { }
    locals: { 'org/spark-project/guava/common/base/Equivalence', 'com/google/common/base/Function' }
    stack: { 'com/google/common/base/FunctionalEquivalence' }
  Bytecode:
    0000000: bb00 3059 2b2a b700 33b0

        at org.spark-project.guava.common.collect.MapMakerInternalMap$Strength$1.defaultEquivalence(MapMakerInternalMap.java:304)
        at org.spark-project.guava.common.collect.MapMaker.getKeyEquivalence(MapMaker.java:158)
        at org.spark-project.guava.common.collect.MapMakerInternalMap.<init>(MapMakerInternalMap.java:201)
        at org.spark-project.guava.common.collect.MapMaker.makeMap(MapMaker.java:506)
        at org.apache.spark.SparkEnv.<init>(SparkEnv.scala:77)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:337)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
        at com.google.cloud.genomics.spark.examples.GenomicsConf.newSparkContext(GenomicsConf.scala:57)
        at com.google.cloud.genomics.spark.examples.VariantsPcaDriver.<init>(VariantsPca.scala:59)
        at com.google.cloud.genomics.spark.examples.VariantsPcaDriver$.apply(VariantsPca.scala:53)
        at com.google.cloud.genomics.spark.examples.VariantsPcaDriver$.main(VariantsPca.scala:44)
        at com.google.cloud.genomics.spark.examples.VariantsPcaDriver.main(VariantsPca.scala)
{noformat}
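
What the verifier is complaining about: the shaded {{Equivalence.onResultOf}}
is still declared to return the shaded {{Equivalence}}, but its bytecode now
instantiates the unshaded {{com.google.common.base.FunctionalEquivalence}}
(which extends the unshaded {{Equivalence}}), so the returned value is not
assignable to the declared return type. The same mechanism can be reproduced
outside Guava with two toy classes compiled inconsistently; this is a sketch
with illustrative names ({{Equiv}}, {{FuncEquiv}}, {{VerifyDemo}}), not Spark's
actual code:

{code}
// Equiv.java -- plays the role of the shaded Equivalence
class Equiv {
  Equiv onResultOf() {
    return new FuncEquiv(); // bytecode: areturn of a FuncEquiv
  }
}

// FuncEquiv.java -- plays the role of FunctionalEquivalence
class FuncEquiv extends Equiv {}

// VerifyDemo.java
public class VerifyDemo {
  public static void main(String[] args) {
    System.out.println(new Equiv().onResultOf());
  }
}

// To reproduce:
//   1. javac *.java && java VerifyDemo  -> runs fine
//   2. Drop "extends Equiv" from FuncEquiv and recompile ONLY FuncEquiv.java,
//      simulating the shade plugin rewriting one class's hierarchy but not
//      the bytecode that references it.
//   3. java VerifyDemo  -> java.lang.VerifyError: Bad return type
{code}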

The fix to the problem is to include all of {{com/google/common/base/*}} (or to
narrow it down manually to all of the required dependencies), but doesn't that
defeat the purpose of these exclusions? Thoughts?
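
For anyone who wants to confirm which variant of {{Optional}} an assembly
actually serves at runtime, a reflective check is quicker than digging through
javap output. A minimal sketch (the class name {{CheckOptionalSignature}} is
hypothetical):

{code}
import com.google.common.base.Optional;
import java.lang.reflect.Method;

public class CheckOptionalSignature {
  public static void main(String[] args) {
    // Print the or/transform signatures of whatever Optional the classpath
    // resolves to; a correct build shows only com.google.common.base
    // parameter types, never org.spark-project.guava.
    for (Method m : Optional.class.getMethods()) {
      if (m.getName().equals("transform") || m.getName().equals("or")) {
        System.out.println(m);
      }
    }
  }
}
{code}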



> com.google.common.base.Optional binary has wrong method signatures
> -------------------------------------------------------------------
>
>                 Key: SPARK-5052
>                 URL: https://issues.apache.org/jira/browse/SPARK-5052
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Elmer Garduno
>
> PR https://github.com/apache/spark/pull/1813 shaded the Guava jar file and moved
> the Guava classes to the package org.spark-project.guava when Spark is built by Maven.
> When a user jar uses the actual com.google.common.base.Optional.transform(com.google.common.base.Function)
> method from Guava, a java.lang.NoSuchMethodError:
> com.google.common.base.Optional.transform(Lcom/google/common/base/Function;)Lcom/google/common/base/Optional;
> is thrown.
> The reason seems to be that the Optional class included in
> spark-assembly-1.2.0-hadoop1.0.4.jar has an incorrect method signature that
> takes the shaded class as an argument:
> Expected:
> javap -classpath target/scala-2.10/googlegenomics-spark-examples-assembly-1.0.jar com.google.common.base.Optional
>   public abstract <V extends java/lang/Object> com.google.common.base.Optional<V> transform(com.google.common.base.Function<? super T, V>);
> Found:
> javap -classpath lib/spark-assembly-1.2.0-hadoop1.0.4.jar com.google.common.base.Optional
>   public abstract <V extends java/lang/Object> com.google.common.base.Optional<V> transform(org.spark-project.guava.common.base.Function<? super T, V>);


