Derp, one caveat to my "solution": I guess Spark doesn't use Kryo for
Function serde :(
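(For context, a minimal sketch of the conf I was describing — `com.foo.bar.MyRegistrator` is a placeholder name. The point of the caveat: `spark.serializer` only switches RDD *data* serialization to Kryo; Function closures still go through plain Java serialization.)

```java
// Sketch, not a full app: switching RDD data serde to Kryo.
// NOTE: this affects data only; Function closures are still
// serialized with Java serialization (hence the caveat above).
SparkConf conf = new SparkConf()
    .setAppName("protobuf-app")  // placeholder name
    .set("spark.serializer",
         "org.apache.spark.serializer.KryoSerializer")
    .set("spark.kryo.registrator",
         "com.foo.bar.MyRegistrator");  // hypothetical registrator class
```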
On Fri, Sep 19, 2014 at 12:44 AM, Paul Wais wrote:
Well it looks like this is indeed a protobuf issue. Poked a little more
with Kryo. Since protobuf messages are serializable, I tried just making
Kryo use the JavaSerializer for my messages. The resulting stack trace
made it look like protobuf GeneratedMessageLite is actually using the
classloader.
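(Sanity check on the "protobuf messages are serializable" premise: plain java.io serialization does round-trip any Serializable class. A self-contained sketch with a stand-in `Msg` class, not a real generated protobuf message:)

```java
import java.io.*;

// Stand-in for a protobuf message; real generated messages also
// implement java.io.Serializable.
class Msg implements Serializable {
    private static final long serialVersionUID = 1L;
    final String payload;
    Msg(String payload) { this.payload = payload; }
}

public class SerdeDemo {
    // Round-trip an object through plain Java serialization.
    static Object roundTrip(Object o) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Msg out = (Msg) roundTrip(new Msg("hello"));
        System.out.println(out.payload);
    }
}
```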
It turns out Kryo doesn't play well with protobuf. Out of the box I see:
com.esotericsoftware.kryo.KryoException: java.lang.UnsupportedOperationException
Serialization trace:
extra_ (com.foo.bar.MyMessage)
com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.read(FieldSerializer.java)
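(For anyone hitting the same trace: roughly how one can point Kryo at its built-in JavaSerializer for the protobuf class, via Spark's registrator hook. `MyMessage` and `MyRegistrator` are placeholder names, and this is a sketch of the approach, not a confirmed fix.)

```java
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.serializers.JavaSerializer;
import org.apache.spark.serializer.KryoRegistrator;

// Registrator sketch: tell Kryo to fall back to Java serialization for
// the protobuf-generated class instead of its default FieldSerializer,
// which appears to choke on protobuf's internals (per the trace above).
public class MyRegistrator implements KryoRegistrator {
    @Override
    public void registerClasses(Kryo kryo) {
        kryo.register(MyMessage.class, new JavaSerializer());
    }
}
```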
Hmm, would using Kryo help me here?
On Thursday, September 18, 2014, Paul Wais wrote:
Ah, can one NOT create an RDD of any arbitrary Serializable type? It
looks like I might be getting bitten by the same
"java.io.ObjectInputStream uses root class loader only" bugs mentioned
in:
* http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-ClassNotFoundException-td3259.html
* ht
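(One generic way around `java.io.ObjectInputStream` defaulting to the root classloader is to subclass it and resolve classes against the thread context classloader instead. A minimal sketch — `ContextObjectInputStream` is my own name for it, not an existing class:)

```java
import java.io.*;

// ObjectInputStream that resolves classes via the thread context
// classloader instead of the default (root/application) loader.
class ContextObjectInputStream extends ObjectInputStream {
    ContextObjectInputStream(InputStream in) throws IOException {
        super(in);
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws IOException, ClassNotFoundException {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        try {
            return Class.forName(desc.getName(), false, cl);
        } catch (ClassNotFoundException e) {
            return super.resolveClass(desc);  // fall back to default lookup
        }
    }
}
```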
Well, it looks like Spark is just not loading my code into the
driver/executors. E.g.:
List<String> foo = bars.map(
    new Function<MyMessage, String>() {
      {
        System.err.println("classpath: " +
            System.getProperty("java.class.path"));
        CodeSource src =
            com.google.protobuf.Ge
Dear List,
I'm writing an application where I have RDDs of protobuf messages.
When I run the app via bin/spark-submit with --master local
--driver-class-path path/to/my/uber.jar, Spark is able to
ser/deserialize the messages correctly.
However, if I run WITHOUT --driver-class-path path/to/my/uber.jar