Hi,

I have this simple Spark app.

import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.api.java.JavaSQLContext;
import org.apache.spark.sql.api.java.JavaSchemaRDD;
import org.apache.spark.sql.api.java.Row;

import com.databricks.spark.avro.AvroUtils;

public class AvroSparkTest {
    public static void main(String[] args) throws Exception {
        SparkConf sparkConf = new SparkConf()
                // "local[2]" works; the standalone master below does not
                .setMaster("spark://niranda-ThinkPad-T540p:7077")
                .setAppName("avro-spark-test");

        JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
        JavaSQLContext sqlContext = new JavaSQLContext(sparkContext);

        // read the Avro file into a SchemaRDD via spark-avro
        JavaSchemaRDD episodes = AvroUtils.avroFile(sqlContext,
                "/home/niranda/projects/avro-spark-test/src/test/resources/episodes.avro");
        episodes.printSchema();
        episodes.registerTempTable("avroTable");

        List<Row> result = sqlContext.sql("SELECT * FROM avroTable").collect();
        for (Row row : result) {
            System.out.println(row.toString());
        }
    }
}


It works fine when the master is set to local. But when I point it at a local standalone Spark cluster (one master with 2 local workers), it fails with the following error:

15/01/06 10:00:55 INFO MemoryStore: ensureFreeSpace(177166) called with curMem=0, maxMem=2004174766
15/01/06 10:00:55 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 173.0 KB, free 1911.2 MB)
15/01/06 10:00:55 INFO MemoryStore: ensureFreeSpace(25502) called with curMem=177166, maxMem=2004174766
15/01/06 10:00:55 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 24.9 KB, free 1911.1 MB)
15/01/06 10:00:55 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.100.5.109:41873 (size: 24.9 KB, free: 1911.3 MB)
15/01/06 10:00:55 INFO BlockManagerMaster: Updated info of block broadcast_0_piece0
15/01/06 10:00:55 INFO SparkContext: Created broadcast 0 from hadoopFile at AvroRelation.scala:45
15/01/06 10:00:55 INFO FileInputFormat: Total input paths to process : 1
15/01/06 10:00:55 INFO SparkContext: Starting job: collect at SparkPlan.scala:84
15/01/06 10:00:55 INFO DAGScheduler: Got job 0 (collect at SparkPlan.scala:84) with 2 output partitions (allowLocal=false)
15/01/06 10:00:55 INFO DAGScheduler: Final stage: Stage 0(collect at SparkPlan.scala:84)
15/01/06 10:00:55 INFO DAGScheduler: Parents of final stage: List()
15/01/06 10:00:55 INFO DAGScheduler: Missing parents: List()
15/01/06 10:00:55 INFO DAGScheduler: Submitting Stage 0 (MappedRDD[6] at map at SparkPlan.scala:84), which has no missing parents
15/01/06 10:00:55 INFO MemoryStore: ensureFreeSpace(4864) called with curMem=202668, maxMem=2004174766
15/01/06 10:00:55 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.8 KB, free 1911.1 MB)
15/01/06 10:00:55 INFO MemoryStore: ensureFreeSpace(3482) called with curMem=207532, maxMem=2004174766
15/01/06 10:00:55 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.4 KB, free 1911.1 MB)
15/01/06 10:00:55 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.100.5.109:41873 (size: 3.4 KB, free: 1911.3 MB)
15/01/06 10:00:55 INFO BlockManagerMaster: Updated info of block broadcast_1_piece0
15/01/06 10:00:55 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:838
15/01/06 10:00:55 INFO DAGScheduler: Submitting 2 missing tasks from Stage 0 (MappedRDD[6] at map at SparkPlan.scala:84)
15/01/06 10:00:55 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
15/01/06 10:00:55 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 10.100.5.109): java.io.EOFException
    at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2722)
    at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1009)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
    at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
    at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
    at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
    at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
    at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
    at org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:985)
    at org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:662)

15/01/06 10:00:55 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 1]
15/01/06 10:00:55 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 2]
15/01/06 10:00:55 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 3]
15/01/06 10:00:55 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 4]
15/01/06 10:00:55 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 5) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 5]
15/01/06 10:00:55 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 7, 10.100.5.109, PROCESS_LOCAL, 1340 bytes)
15/01/06 10:00:55 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 6]
15/01/06 10:00:55 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
15/01/06 10:00:55 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 7) on executor 10.100.5.109: java.io.EOFException (null) [duplicate 7]
15/01/06 10:00:55 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/01/06 10:00:55 INFO TaskSchedulerImpl: Cancelling stage 0
15/01/06 10:00:55 INFO DAGScheduler: Job 0 failed: collect at SparkPlan.scala:84, took 0.127572 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, 10.100.5.109): java.io.EOFException
    at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2722)
    at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1009)
    at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:63)
    at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:101)
    at org.apache.hadoop.io.UTF8.readChars(UTF8.java:216)
    at org.apache.hadoop.io.UTF8.readString(UTF8.java:208)
    at org.apache.hadoop.mapred.FileSplit.readFields(FileSplit.java:87)
    at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:237)
    at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:66)
    at org.apache.spark.SerializableWritable$$anonfun$readObject$1.apply$mcV$sp(SerializableWritable.scala:43)
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:985)
    at org.apache.spark.SerializableWritable.readObject(SerializableWritable.scala:39)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:662)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
    at akka.actor.ActorCell.invoke(ActorCell.scala:487)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
    at akka.dispatch.Mailbox.run(Mailbox.scala:220)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

What might be the reason behind this? I'm running the app straight from IntelliJ IDEA.
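
One thing I am not sure about: since I launch this straight from the IDE, the application jar is never shipped to the workers. Would setting the jars explicitly on the SparkConf help? Something along these lines (the app jar path is just a guess at my build output, so treat it as hypothetical; the spark-avro path is from my local repo):

SparkConf sparkConf = new SparkConf()
        .setMaster("spark://niranda-ThinkPad-T540p:7077")
        .setAppName("avro-spark-test")
        // hypothetical path to the packaged application jar, plus spark-avro
        .setJars(new String[]{
                "/home/niranda/projects/avro-spark-test/target/avro-spark-test.jar",
                "/home/niranda/.m2/repository/com/databricks/spark-avro_2.10/0.1/spark-avro_2.10-0.1.jar"});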

I have attached the logs herewith
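
From the trace, the EOFException is thrown while an executor deserializes the task's FileSplit (UTF8.readChars inside FileSplit.readFields), so I am also wondering whether the Hadoop version my app is built against (hadoop-client 2.2.0, per the classpath in the attached log) matches the Hadoop build of the Spark binaries the workers run. As a sanity check I plan to print the Hadoop version on the driver and on the executors, roughly like this (untested sketch):

import java.util.Arrays;
import java.util.List;
import org.apache.hadoop.util.VersionInfo;
import org.apache.spark.api.java.function.Function;

// Hadoop version as seen by the driver JVM
System.out.println("driver Hadoop version: " + VersionInfo.getVersion());

// Hadoop version as seen by the executors: run a trivial 2-partition job
List<String> executorVersions = sparkContext
        .parallelize(Arrays.asList(1, 2), 2)
        .map(new Function<Integer, String>() {
            public String call(Integer i) {
                return VersionInfo.getVersion();
            }
        })
        .collect();
System.out.println("executor Hadoop versions: " + executorVersions);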

Rgds
-- 
Niranda Perera
Attached log:

/usr/local/java/jdk1.7.0_55/bin/java -Didea.launcher.port=7532 -Didea.launcher.bin.path=/home/niranda/software/idea-IU-135.1306/bin -Dfile.encoding=UTF-8 -classpath
/usr/local/java/jdk1.7.0_55/jre/lib/jce.jar:/usr/local/java/jdk1.7.0_55/jre/lib/javaws.jar:/usr/local/java/jdk1.7.0_55/jre/lib/rt.jar:/usr/local/java/jdk1.7.0_55/jre/lib/management-agent.jar:/usr/local/java/jdk1.7.0_55/jre/lib/jfxrt.jar:/usr/local/java/jdk1.7.0_55/jre/lib/plugin.jar:/usr/local/java/jdk1.7.0_55/jre/lib/jfr.jar:/usr/local/java/jdk1.7.0_55/jre/lib/jsse.jar:/usr/local/java/jdk1.7.0_55/jre/lib/resources.jar:/usr/local/java/jdk1.7.0_55/jre/lib/deploy.jar:/usr/local/java/jdk1.7.0_55/jre/lib/charsets.jar:/usr/local/java/jdk1.7.0_55/jre/lib/ext/sunec.jar:/usr/local/java/jdk1.7.0_55/jre/lib/ext/dnsns.jar:/usr/local/java/jdk1.7.0_55/jre/lib/ext/sunjce_provider.jar:/usr/local/java/jdk1.7.0_55/jre/lib/ext/zipfs.jar:/usr/local/java/jdk1.7.0_55/jre/lib/ext/sunpkcs11.jar:/usr/local/java/jdk1.7.0_55/jre/lib/ext/localedata.jar:/home/niranda/projects/avro-spark-test/target/classes:/home/niranda/.m2/repository/com/databricks/spark-avro_2.10/0.1/spark-avro_2.10-0.1.jar:/home/niranda/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar:/home/niranda/.m2/repository/org/apache/avro/avro/1.7.7/avro-1.7.7.jar:/home/niranda/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/home/niranda/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/home/niranda/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/home/niranda/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.6/snappy-java-1.1.1.6.jar:/home/niranda/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/home/niranda/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/home/niranda/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar:/home/niranda/.m2/repository/org/apache/avro/avro-mapred/1.7.7/avro-mapred-1.7.7.jar:/home/niranda/.m2/repository/org/apache/spark/spark-sql_2.10/1.2.0/spark-sql_2.10-1.2.0.jar:/home/niranda/.m2/repository/org/apache/spark/spark-core_2.10/1.2.0/spark-core_2.10-1.2.0.jar:/home/niranda/.m2/repository/com/twitter/chill_2.10/0.5.0/chill_2.10-0.5.0.jar:/home/niranda/.m2/repository/com/twitter/chill-java/0.5.0/chill-java-0.5.0.jar:/home/niranda/.m2/repository/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar:/home/niranda/.m2/repository/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar:/home/niranda/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/home/niranda/.m2/repository/org/objenesis/objenesis/1.2/objenesis-1.2.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-client/2.2.0/hadoop-client-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-common/2.2.0/hadoop-common-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-annotations/2.2.0/hadoop-annotations-2.2.0.jar:/home/niranda/.m2/repository/com/google/guava/guava/14.0.1/guava-14.0.1.jar:/home/niranda/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/home/niranda/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/home/niranda/.m2/repository/org/apache/commons/commons-math/2.1/commons-math-2.1.jar:/home/niranda/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/home/niranda/.m2/repository/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar:/home/niranda/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/home/niranda/.m2/repository/commons-codec/commons-codec/1.5/commons-codec-1.5.jar:/home/niranda/.m2/repository/commons-io/commons-io/2.1/commons-io-2.1.jar:/home/n
iranda/.m2/repository/commons-net/commons-net/2.2/commons-net-2.2.jar:/home/niranda/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/home/niranda/.m2/repository/commons-lang/commons-lang/2.5/commons-lang-2.5.jar:/home/niranda/.m2/repository/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar:/home/niranda/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/home/niranda/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/home/niranda/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/home/niranda/.m2/repository/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar:/home/niranda/.m2/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar:/home/niranda/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-auth/2.2.0/hadoop-auth-2.2.0.jar:/home/niranda/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5/zookeeper-3.4.5.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.2.0/hadoop-hdfs-2.2.0.jar:/home/niranda/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-app/2.2.0/hadoop-mapreduce-client-app-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/2.2.0/hadoop-mapreduce-client-common-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-yarn-common/2.2.0/hadoop-yarn-common-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-yarn-api/2.2.0/hadoop-yarn-api-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-yarn-client/2.2.0/hadoop-yarn-client-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.2.0/hadoop-mapreduce-client-core-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/2.2.0/hadoop-yarn-server-common-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-shuffle/2.2.0/hadoop-mapreduce-client-shuffle-2.2.0.jar:/home/niranda/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.2.0/hadoop-mapreduce-client-jobclient-2.2.0.jar:/home/niranda/.m2/repository/org/apache/spark/spark-network-common_2.10/1.2.0/spark-network-common_2.10-1.2.0.jar:/home/niranda/.m2/repository/io/netty/netty-all/4.0.23.Final/netty-all-4.0.23.Final.jar:/home/niranda/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/niranda/.m2/repository/org/apache/spark/spark-network-shuffle_2.10/1.2.0/spark-network-shuffle_2.10-1.2.0.jar:/home/niranda/.m2/repository/net/java/dev/jets3t/jets3t/0.7.1/jets3t-0.7.1.jar:/home/niranda/.m2/repository/org/apache/curator/curator-recipes/2.4.0/curator-recipes-2.4.0.jar:/home/niranda/.m2/repository/org/apache/curator/curator-framework/2.4.0/curator-framework-2.4.0.jar:/home/niranda/.m2/repository/org/apache/curator/curator-client/2.4.0/curator-client-2.4.0.jar:/home/niranda/.m2/repository/jline/jline/0.9.94/jline-0.9.94.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-plus/8.1.14.v20131031/jetty-plus-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/orbit/javax.transaction/1.1.1.v201105210645/javax.transaction-1.1.1.v201105210645.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-webapp/8.1.14.v20131031/jetty-webapp-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-xml/8.1.14.v20131031/jetty-xml-
8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-util/8.1.14.v20131031/jetty-util-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-servlet/8.1.14.v20131031/jetty-servlet-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-security/8.1.14.v20131031/jetty-security-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-server/8.1.14.v20131031/jetty-server-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-continuation/8.1.14.v20131031/jetty-continuation-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-http/8.1.14.v20131031/jetty-http-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-io/8.1.14.v20131031/jetty-io-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/jetty-jndi/8.1.14.v20131031/jetty-jndi-8.1.14.v20131031.jar:/home/niranda/.m2/repository/org/eclipse/jetty/orbit/javax.mail.glassfish/1.4.1.v201005082020/javax.mail.glassfish-1.4.1.v201005082020.jar:/home/niranda/.m2/repository/org/eclipse/jetty/orbit/javax.activation/1.1.0.v201105071233/javax.activation-1.1.0.v201105071233.jar:/home/niranda/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/home/niranda/.m2/repository/org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar:/home/niranda/.m2/repository/org/slf4j/jul-to-slf4j/1.7.5/jul-to-slf4j-1.7.5.jar:/home/niranda/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.5/jcl-over-slf4j-1.7.5.jar:/home/niranda/.m2/repository/com/ning/compress-lzf/1.0.0/compress-lzf-1.0.0.jar:/home/niranda/.m2/repository/net/jpountz/lz4/lz4/1.2.0/lz4-1.2.0.jar:/home/niranda/.m2/repository/org/roaringbitmap/RoaringBitmap/0.4.5/RoaringBitmap-0.4.5.jar:/home/niranda/.m2/repository/org/spark-project/akka/akka-remote_2.10/2.3.4-spark/akka-remote_2.10-2.3.4-spark.jar:/home/niranda/.m2/repository/org/spark-project/akka/akka-actor_2.10/2.3.4-spark/akka-actor_2.10-2.3.4-spark.jar:/home/niranda/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/home/niranda/.m2/repository/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/home/niranda/.m2/repository/org/spark-project/protobuf/protobuf-java/2.5.0-spark/protobuf-java-2.5.0-spark.jar:/home/niranda/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/home/niranda/.m2/repository/org/spark-project/akka/akka-slf4j_2.10/2.3.4-spark/akka-slf4j_2.10-2.3.4-spark.jar:/home/niranda/.m2/repository/org/json4s/json4s-jackson_2.10/3.2.10/json4s-jackson_2.10-3.2.10.jar:/home/niranda/.m2/repository/org/json4s/json4s-core_2.10/3.2.10/json4s-core_2.10-3.2.10.jar:/home/niranda/.m2/repository/org/json4s/json4s-ast_2.10/3.2.10/json4s-ast_2.10-3.2.10.jar:/home/niranda/.m2/repository/org/scala-lang/scalap/2.10.0/scalap-2.10.0.jar:/home/niranda/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar:/home/niranda/.m2/repository/org/scala-lang/scala-reflect/2.10.4/scala-reflect-2.10.4.jar:/home/niranda/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.3.0/jackson-databind-2.3.0.jar:/home/niranda/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.3.0/jackson-annotations-2.3.0.jar:/home/niranda/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.3.0/jackson-core-2.3.0.jar:/home/niranda/.m2/repository/org/apache/mesos/mesos/0.18.1/mesos-0.18.1-shaded-protobuf.jar:/home/niranda/.m2/
repository/com/clearspring/analytics/stream/2.7.0/stream-2.7.0.jar:/home/niranda/.m2/repository/com/codahale/metrics/metrics-core/3.0.0/metrics-core-3.0.0.jar:/home/niranda/.m2/repository/com/codahale/metrics/metrics-jvm/3.0.0/metrics-jvm-3.0.0.jar:/home/niranda/.m2/repository/com/codahale/metrics/metrics-json/3.0.0/metrics-json-3.0.0.jar:/home/niranda/.m2/repository/com/codahale/metrics/metrics-graphite/3.0.0/metrics-graphite-3.0.0.jar:/home/niranda/.m2/repository/org/tachyonproject/tachyon-client/0.5.0/tachyon-client-0.5.0.jar:/home/niranda/.m2/repository/org/tachyonproject/tachyon/0.5.0/tachyon-0.5.0.jar:/home/niranda/.m2/repository/org/spark-project/pyrolite/2.0.1/pyrolite-2.0.1.jar:/home/niranda/.m2/repository/net/sf/py4j/py4j/0.8.2.1/py4j-0.8.2.1.jar:/home/niranda/.m2/repository/org/apache/spark/spark-catalyst_2.10/1.2.0/spark-catalyst_2.10-1.2.0.jar:/home/niranda/.m2/repository/org/scalamacros/quasiquotes_2.10/2.0.1/quasiquotes_2.10-2.0.1.jar:/home/niranda/.m2/repository/com/twitter/parquet-column/1.6.0rc3/parquet-column-1.6.0rc3.jar:/home/niranda/.m2/repository/com/twitter/parquet-common/1.6.0rc3/parquet-common-1.6.0rc3.jar:/home/niranda/.m2/repository/com/twitter/parquet-encoding/1.6.0rc3/parquet-encoding-1.6.0rc3.jar:/home/niranda/.m2/repository/com/twitter/parquet-generator/1.6.0rc3/parquet-generator-1.6.0rc3.jar:/home/niranda/.m2/repository/com/twitter/parquet-hadoop/1.6.0rc3/parquet-hadoop-1.6.0rc3.jar:/home/niranda/.m2/repository/com/twitter/parquet-format/2.2.0-rc1/parquet-format-2.2.0-rc1.jar:/home/niranda/.m2/repository/com/twitter/parquet-jackson/1.6.0rc3/parquet-jackson-1.6.0rc3.jar:/home/niranda/software/idea-IU-135.1306/lib/idea_rt.jar
 com.intellij.rt.execution.application.AppMain AvroSparkTest
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/01/06 10:00:47 WARN Utils: Your hostname, niranda-ThinkPad-T540p resolves to a loopback address: 127.0.1.1; using 10.100.5.109 instead (on interface wlan0)
15/01/06 10:00:47 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/01/06 10:00:47 INFO SecurityManager: Changing view acls to: niranda
15/01/06 10:00:47 INFO SecurityManager: Changing modify acls to: niranda
15/01/06 10:00:47 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(niranda); users with modify permissions: Set(niranda)
15/01/06 10:00:47 INFO Slf4jLogger: Slf4jLogger started
15/01/06 10:00:47 INFO Remoting: Starting remoting
15/01/06 10:00:47 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.100.5.109:50232]
15/01/06 10:00:47 INFO Utils: Successfully started service 'sparkDriver' on port 50232.
15/01/06 10:00:47 INFO SparkEnv: Registering MapOutputTracker
15/01/06 10:00:47 INFO SparkEnv: Registering BlockManagerMaster
15/01/06 10:00:47 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150106100047-a1c0
15/01/06 10:00:47 INFO MemoryStore: MemoryStore started with capacity 1911.3 MB
15/01/06 10:00:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/01/06 10:00:48 INFO HttpFileServer: HTTP File server directory is /tmp/spark-a2e1771c-7d50-415f-87bd-ccb0fcf15e0f
15/01/06 10:00:48 INFO HttpServer: Starting HTTP Server
15/01/06 10:00:48 INFO Utils: Successfully started service 'HTTP file server' on port 45501.
15/01/06 10:00:53 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/01/06 10:00:53 INFO SparkUI: Started SparkUI at http://10.100.5.109:4040
15/01/06 10:00:53 INFO AppClient$ClientActor: Connecting to master spark://niranda-ThinkPad-T540p:7077...
15/01/06 10:00:53 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150106100053-0003
15/01/06 10:00:53 INFO AppClient$ClientActor: Executor added: app-20150106100053-0003/0 on worker-20150106093444-10.100.5.109-49287 (10.100.5.109:49287) with 8 cores
15/01/06 10:00:53 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150106100053-0003/0 on hostPort 10.100.5.109:49287 with 8 cores, 512.0 MB RAM
15/01/06 10:00:53 INFO AppClient$ClientActor: Executor added: app-20150106100053-0003/1 on worker-20150106093441-10.100.5.109-35448 (10.100.5.109:35448) with 8 cores
15/01/06 10:00:53 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150106100053-0003/1 on hostPort 10.100.5.109:35448 with 8 cores, 512.0 MB RAM
15/01/06 10:00:53 INFO AppClient$ClientActor: Executor updated: app-20150106100053-0003/0 is now LOADING
15/01/06 10:00:53 INFO AppClient$ClientActor: Executor updated: app-20150106100053-0003/1 is now LOADING
15/01/06 10:00:53 INFO AppClient$ClientActor: Executor updated: app-20150106100053-0003/0 is now RUNNING
15/01/06 10:00:53 INFO AppClient$ClientActor: Executor updated: app-20150106100053-0003/1 is now RUNNING
15/01/06 10:00:53 INFO NettyBlockTransferService: Server created on 41873
15/01/06 10:00:53 INFO BlockManagerMaster: Trying to register BlockManager
15/01/06 10:00:53 INFO BlockManagerMasterActor: Registering block manager 10.100.5.109:41873 with 1911.3 MB RAM, BlockManagerId(<driver>, 10.100.5.109, 41873)
15/01/06 10:00:53 INFO BlockManagerMaster: Registered BlockManager
15/01/06 10:00:53 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
15/01/06 10:00:54 INFO SparkDeploySchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@10.100.5.109:57831/user/Executor#-1944976245] with ID 0
15/01/06 10:00:54 INFO SparkDeploySchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@10.100.5.109:55443/user/Executor#-1375368492] with ID 1
15/01/06 10:00:55 INFO BlockManagerMasterActor: Registering block manager 10.100.5.109:59488 with 265.0 MB RAM, BlockManagerId(0, 10.100.5.109, 59488)
15/01/06 10:00:55 INFO BlockManagerMasterActor: Registering block manager 10.100.5.109:60996 with 265.0 MB RAM, BlockManagerId(1, 10.100.5.109, 60996)
root
 |-- title: string (nullable = false)
 |-- air_date: string (nullable = false)
 |-- doctor: integer (nullable = false)

[... the rest of the log repeats the task failures and java.io.EOFException stack traces quoted above ...]

Process finished with exit code 1
