Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/3874#issuecomment-68698508
  
    This one always confuses me, but here's what I think I know:
    
    The compiled `Optional` in Spark won't have the "correct" signatures on its 
methods (meaning, matching the Google Guava `Optional`), since the other Guava 
classes it refers to are shaded. It exists so that the Spark API compiles 
against the Guava class in the package that the user app expects.
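    
    As a concrete illustration (a minimal sketch, assuming the Spark 1.x Java 
API where `leftOuterJoin` surfaces Guava's `Optional` in its return type; names 
here are illustrative), this is the kind of user-app code that has to compile 
against `com.google.common.base.Optional`:
    
    ```java
    // Sketch: user-app code compiling against the un-shaded
    // com.google.common.base.Optional that the Spark 1.x Java API exposes.
    import com.google.common.base.Optional;
    import org.apache.spark.api.java.JavaPairRDD;
    import scala.Tuple2;
    
    public class LeftJoinSketch {
      static JavaPairRDD<String, Tuple2<Integer, Optional<Integer>>> join(
          JavaPairRDD<String, Integer> left,
          JavaPairRDD<String, Integer> right) {
        // The return type mentions Guava's Optional, so the user app needs a
        // Guava Optional in that package on its compile classpath.
        return left.leftOuterJoin(right);
      }
    }
    ```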
    
    Apps that use the API methods that take or return `Optional` will be 
bundling Guava. Spark uses Guava 14, although in theory you can use any 
version that still has the very few methods on `Optional` that Spark actually 
calls, I suppose, because Spark will be using the user app's copy of 
`Optional` at runtime.
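    
    To make that concrete (again just a sketch, not the exact Spark internals): 
presumably the app's bundled Guava only needs to provide the basic `Optional` 
members like `of` / `absent` / `isPresent` / `get`, and those calls resolve 
against whatever Guava jar the app ships:
    
    ```java
    // Sketch: at runtime, Optional resolves to the user app's bundled Guava
    // copy (e.g. Guava 14); only a few basic methods need to be present.
    import com.google.common.base.Optional;
    
    public class OptionalRuntimeSketch {
      static int unwrap(Optional<Integer> maybeCount) {
        // isPresent()/get() come from the app's Guava jar, not from Spark.
        return maybeCount.isPresent() ? maybeCount.get() : 0;
      }
    
      public static void main(String[] args) {
        System.out.println(unwrap(Optional.of(3)));              // 3
        System.out.println(unwrap(Optional.<Integer>absent()));  // 0
      }
    }
    ```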
    
    You say you tried it and got `java.lang.NoClassDefFoundError: 
org/apache/spark/Partition`, though. That's weird. `Optional` will be in a 
different classloader (?) but shouldn't refer to Spark classes. Right? If 
there's a problem, it's somewhere in there, since that's where my 
understanding of how this works stops matching your experience. Or else 
there's something else subtly not-quite-right about how the user app is run 
here.


