[ https://issues.apache.org/jira/browse/SPARK-6069?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14341689#comment-14341689 ]

ASF GitHub Bot commented on SPARK-6069:
---------------------------------------

Github user pferrel commented on the pull request:

    https://github.com/apache/mahout/pull/74#issuecomment-76536731
  
    This seems to be a bug in Spark 1.2.1 (SPARK-6069).
    
    The workaround is to add the following setting either to the SparkConf in
    your app or, as a command-line option, to the mahout spark-xyz driver:

    -D:spark.executor.extraClassPath=/Users/pat/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-dependency-reduced.jar

    where the jar contains any class that needs to be deserialized and the path
    exists on all workers.
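
    A minimal Scala sketch of the SparkConf form of this workaround (the app
    name and context setup are illustrative; the jar path is the one above and
    must exist on every worker):

        import org.apache.spark.{SparkConf, SparkContext}

        // Hard-code the executor classpath so workers can load the classes in the
        // dependency-reduced jar when deserializing tasks and broadcast values.
        val conf = new SparkConf()
          .setAppName("mahout-spark-driver")  // illustrative name
          .set("spark.executor.extraClassPath",
            "/Users/pat/mahout/spark/target/mahout-spark_2.10-1.0-SNAPSHOT-dependency-reduced.jar")

        val sc = new SparkContext(conf)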
    
    Therefore it currently looks like Spark 1.2.1 is not worth supporting.


> Deserialization Error ClassNotFound 
> ------------------------------------
>
>                 Key: SPARK-6069
>                 URL: https://issues.apache.org/jira/browse/SPARK-6069
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Standalone one-worker cluster on localhost, or any cluster
>            Reporter: Pat Ferrel
>
> A class is contained in the jars passed in when creating a context. It is 
> registered with Kryo. The class (Guava HashBiMap) is created correctly from 
> an RDD and broadcast, but the deserialization fails with ClassNotFound.
> The workaround is to hard-code the path to the jar and make it available on 
> all workers. It must be hard-coded because we are creating a library, so 
> there is no easy way to pass something like the following in to the app:
> spark.executor.extraClassPath      /path/to/some.jar
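
A minimal Scala sketch of the failure pattern described above (an assumed
reconstruction, not the reporter's exact code; the app name and jar path are
placeholders): HashBiMap is registered with Kryo and broadcast, and the
executors fail to deserialize it with ClassNotFound unless the
spark.executor.extraClassPath workaround is applied.

    import com.google.common.collect.HashBiMap
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("spark-6069-repro")  // placeholder name
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // register the Guava class with Kryo, as described in the report
      .registerKryoClasses(Array(classOf[HashBiMap[String, String]]))
      // the jar passed when creating the context is assumed to contain HashBiMap
      .setJars(Seq("/path/to/some.jar"))

    val sc = new SparkContext(conf)

    val biMap = HashBiMap.create[String, String]()
    biMap.put("a", "A")

    // broadcasting succeeds on the driver; deserializing the broadcast value on
    // the executors is where ClassNotFound is reported
    val bc = sc.broadcast(biMap)
    sc.parallelize(1 to 10).map(_ => bc.value.get("a")).collect()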



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
