Actually, after some digging, I did find a JIRA for it: SPARK-5470.
The fix for this has gone into master, but it isn't in 1.2.

On Mon, May 4, 2015 at 2:47 PM, Imran Rashid <iras...@cloudera.com> wrote:
> Oh, this seems like a real pain.  You should file a JIRA; I didn't see an
> open issue -- if nothing else, just to document the issue.
>
> As you've noted, the problem is that the serializer is created immediately
> in the executors, right when the SparkEnv is created, but the other jars
> aren't downloaded until later.  I think you could work around this with
> some combination of pushing the jars to the cluster manually and then
> setting spark.executor.extraClassPath.
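>
> A minimal sketch of that workaround, assuming foo.jar has already been
> pushed to the same path on every executor host (the path and class names
> here are just placeholders):
>
>     val sparkConf = new SparkConf(true)
>       .setAppName("foo")
>       .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>       // foo.jar is pre-staged on each host, so it is already on the
>       // executor classpath when the serializer is created at SparkEnv
>       // startup -- no dependence on spark.jars download timing
>       .set("spark.executor.extraClassPath", "/opt/jars/foo.jar")
>     sparkConf.registerKryoClasses(Array(
>       classOf[com.foo.Foo],
>       classOf[com.foo.Bar]))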
>
> On Wed, Apr 29, 2015 at 6:42 PM, Akshat Aranya <aara...@gmail.com> wrote:
>>
>> Hi,
>>
>> Is it possible to register kryo serialization for classes contained in
>> jars that are added with "spark.jars"?  In my experiment it doesn't seem to
>> work, likely because the class registration happens before the jar is
>> shipped to the executor and added to the classloader.  Here's the general
>> idea of what I want to do:
>>
>>     val sparkConf = new SparkConf(true)
>>       .set("spark.jars", "foo.jar")
>>       .setAppName("foo")
>>       .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>>
>>     // register classes contained in foo.jar
>>     sparkConf.registerKryoClasses(Array(
>>       classOf[com.foo.Foo],
>>       classOf[com.foo.Bar]))
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
