Hi,
I'm writing ScalaTest tests for my Spark programs, which typically read data
from Amazon S3. When I run them with master=local everything works.
However, if I start an Amazon EC2 cluster and use that as the master, I get
EOFExceptions.

e.g.
mvn test -Dsuites="package.MyTest" -DargLine=-Dspark.master=local
works, but:

mvn test -Dsuites="package.MyTest" -DargLine=-Dspark.master=$(cat /root/spark-ec2/cluster-url)
doesn't work.
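For context, the suites pick the master up from the spark.master system property that -DargLine passes into the forked test JVM. A minimal sketch of that resolution logic (the MasterUrl name and the local[*] fallback are my own illustration, not the actual test code):

```scala
// Hypothetical helper, not the actual test code: resolves the master URL
// that -DargLine=-Dspark.master=... passes through to the forked test JVM.
object MasterUrl {
  // Falls back to a local master so a plain `mvn test` still runs
  // without a cluster.
  def resolve(props: Map[String, String]): String =
    props.getOrElse("spark.master", "local[*]")
}
```

In the real suites this value would feed a SparkConf via setMaster before the SparkContext is created.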

The EOFException is like this:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 1.0 failed 4 times, most recent failure: Lost task 1.3 in stage 1.0 (TID 566, ip-10-46-19-223.us-west-2.compute.internal): java.io.EOFException
        at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2744)
        at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1032)
        at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:68)
        at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:106)
        at org.apache.hadoop.io.UTF8.readChars(UTF8.java:258)
        at org.apache.hadoop.io.UTF8.readString(UTF8.java:250)
        at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
        at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:280)
        at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:75)
        at org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1138)
        at org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
        at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)


Any ideas? Thanks.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-program-testing-using-scalatest-and-maven-cluster-master-exception-tp23552.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
