I noticed that running the following code causes the process to hang
forever waiting for the job to complete: the RuntimeException thrown
from writeObject() during serialization never propagates to the caller.

Should a bug be filed on this?

- Paul



import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkSerializationTest {

    public static void main(String[] args) {
        JavaSparkContext context = new JavaSparkContext("local[3]", "test");
        List<MyObject> list = new ArrayList<>();
        list.add(new MyObject());
        JavaRDD<MyObject> rdd = context.parallelize(list);
        // Serializing MyObject always throws, so this action should fail
        // with the RuntimeException; instead the call blocks forever.
        rdd.saveAsObjectFile("/tmp/sparkserializationtest");
    }

    private static final class MyObject implements Serializable {

        private static final long serialVersionUID = 1L;

        private void readObject(ObjectInputStream in) throws IOException,
                ClassNotFoundException {
        }

        private void writeObject(ObjectOutputStream out) throws IOException {
            // Simulate a serialization failure.
            throw new RuntimeException();
        }
    }
}
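
For comparison, plain java.io serialization does surface the exception
immediately, which suggests the swallowing happens somewhere in Spark's
job handling rather than in Java serialization itself. A minimal sketch
(the class is redefined locally here since the MyObject above is
private; the class name and stream setup are mine, not from the repro):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class PlainSerializationTest {

    private static final class MyObject implements Serializable {

        private static final long serialVersionUID = 1L;

        private void writeObject(ObjectOutputStream out) throws IOException {
            // Same failure mode as the Spark repro.
            throw new RuntimeException();
        }
    }

    public static void main(String[] args) throws Exception {
        ObjectOutputStream out =
                new ObjectOutputStream(new ByteArrayOutputStream());
        // Here the RuntimeException from writeObject() propagates
        // to the caller right away instead of hanging.
        out.writeObject(new MyObject());
    }
}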
