Will this output from stderr help?

Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
14/12/26 10:13:44 INFO CoarseGrainedExecutorBackend: Registered signal
handlers for [TERM, HUP, INT]
14/12/26 10:13:44 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
14/12/26 10:13:44 INFO SecurityManager: Changing view acls to: root
14/12/26 10:13:44 INFO SecurityManager: Changing modify acls to: root
14/12/26 10:13:44 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(root); users
with modify permissions: Set(root)
14/12/26 10:13:45 INFO Slf4jLogger: Slf4jLogger started
14/12/26 10:13:45 INFO Remoting: Starting remoting
14/12/26 10:13:45 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://driverpropsfetc...@ip-172-31-18-138.ap-southeast-1.compute.internal:44461]
14/12/26 10:13:45 INFO Utils: Successfully started service
'driverPropsFetcher' on port 44461.
14/12/26 10:13:45 INFO SecurityManager: Changing view acls to: root
14/12/26 10:13:45 INFO SecurityManager: Changing modify acls to: root
14/12/26 10:13:45 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(root); users
with modify permissions: Set(root)
14/12/26 10:13:45 INFO RemoteActorRefProvider$RemotingTerminator: Shutting
down remote daemon.
14/12/26 10:13:45 INFO RemoteActorRefProvider$RemotingTerminator: Remote
daemon shut down; proceeding with flushing remote transports.
14/12/26 10:13:45 INFO Slf4jLogger: Slf4jLogger started
14/12/26 10:13:45 INFO RemoteActorRefProvider$RemotingTerminator: Remoting
shut down.
14/12/26 10:13:45 INFO Remoting: Starting remoting
14/12/26 10:13:45 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkexecu...@ip-172-31-18-138.ap-southeast-1.compute.internal:60600]
14/12/26 10:13:45 INFO Utils: Successfully started service 'sparkExecutor'
on port 60600.
14/12/26 10:13:45 INFO CoarseGrainedExecutorBackend: Connecting to driver:
akka.tcp://sparkdri...@ip-172-31-18-138.ap-southeast-1.compute.internal:51708/user/CoarseGrainedScheduler
14/12/26 10:13:45 INFO WorkerWatcher: Connecting to worker
akka.tcp://sparkwor...@ip-172-31-18-138.ap-southeast-1.compute.internal:50757/user/Worker
14/12/26 10:13:45 INFO WorkerWatcher: Successfully connected to
akka.tcp://sparkwor...@ip-172-31-18-138.ap-southeast-1.compute.internal:50757/user/Worker
14/12/26 10:13:45 INFO CoarseGrainedExecutorBackend: Successfully registered
with driver
14/12/26 10:13:45 INFO SecurityManager: Changing view acls to: root
14/12/26 10:13:45 INFO SecurityManager: Changing modify acls to: root
14/12/26 10:13:45 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(root); users
with modify permissions: Set(root)
14/12/26 10:13:45 INFO AkkaUtils: Connecting to MapOutputTracker:
akka.tcp://sparkdri...@ip-172-31-18-138.ap-southeast-1.compute.internal:51708/user/MapOutputTracker
14/12/26 10:13:45 INFO AkkaUtils: Connecting to BlockManagerMaster:
akka.tcp://sparkdri...@ip-172-31-18-138.ap-southeast-1.compute.internal:51708/user/BlockManagerMaster
14/12/26 10:13:45 INFO DiskBlockManager: Created local directory at
/mnt/spark/spark-local-20141226101345-b3c7
14/12/26 10:13:45 INFO DiskBlockManager: Created local directory at
/mnt2/spark/spark-local-20141226101345-cad1
14/12/26 10:13:45 INFO MemoryStore: MemoryStore started with capacity 265.4
MB
14/12/26 10:13:46 INFO NettyBlockTransferService: Server created on 51895
14/12/26 10:13:46 INFO BlockManagerMaster: Trying to register BlockManager
14/12/26 10:13:46 INFO BlockManagerMaster: Registered BlockManager
14/12/26 10:13:46 INFO AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkdri...@ip-172-31-18-138.ap-southeast-1.compute.internal:51708/user/HeartbeatReceiver
14/12/26 10:13:46 INFO CoarseGrainedExecutorBackend: Got assigned task 0
14/12/26 10:13:46 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/play-json_2.10-2.4.0-M2.jar with timestamp
1419588821693
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/play-json_2.10-2.4.0-M2.jar to
/mnt/spark/-17002596021419588821693_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./play-json_2.10-2.4.0-M2.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/spark-cassandra-connector_2.10-1.1.0.jar
with timestamp 1419588821694
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/spark-cassandra-connector_2.10-1.1.0.jar to
/mnt/spark/-16568315171419588821694_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./spark-cassandra-connector_2.10-1.1.0.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/spark-streaming-kafka_2.10-1.2.0.jar with
timestamp 1419588821695
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/spark-streaming-kafka_2.10-1.2.0.jar to
/mnt/spark/9916914731419588821695_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./spark-streaming-kafka_2.10-1.2.0.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/aerospike-client-3.0.31.jar with timestamp
1419588821692
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/aerospike-client-3.0.31.jar to
/mnt/spark/15592725951419588821692_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./aerospike-client-3.0.31.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/algebird-core_2.10-0.8.2.jar with timestamp
1419588821695
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/algebird-core_2.10-0.8.2.jar to
/mnt/spark/-14325231431419588821695_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./algebird-core_2.10-0.8.2.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/cassandra-driver-core-2.1.3.jar with
timestamp 1419588821695
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/cassandra-driver-core-2.1.3.jar to
/mnt/spark/-11474505141419588821695_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./cassandra-driver-core-2.1.3.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/joda-time-2.5.jar with timestamp
1419588821694
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/joda-time-2.5.jar to
/mnt/spark/13573783741419588821694_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./joda-time-2.5.jar to class
loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/algebird-util_2.10-0.8.2.jar with timestamp
1419588821695
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/algebird-util_2.10-0.8.2.jar to
/mnt/spark/12828005661419588821695_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./algebird-util_2.10-0.8.2.jar
to class loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/jets3t-0.9.0.jar with timestamp
1419588821694
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/jets3t-0.9.0.jar to
/mnt/spark/10650148191419588821694_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./jets3t-0.9.0.jar to class
loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/guava-14.0.1.jar with timestamp
1419588821694
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/guava-14.0.1.jar to
/mnt/spark/-16571014251419588821694_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./guava-14.0.1.jar to class
loader
14/12/26 10:13:46 INFO Executor: Fetching
file:/root/persistent-hdfs/jars/joda-convert-1.7.jar with timestamp
1419588821694
14/12/26 10:13:46 INFO Utils: Copying
/root/persistent-hdfs/jars/joda-convert-1.7.jar to
/mnt/spark/1878107331419588821694_cache
14/12/26 10:13:46 INFO Executor: Adding
file:/root/spark/work/app-20141226101341-0000/0/./joda-convert-1.7.jar to
class loader
14/12/26 10:13:46 INFO TorrentBroadcast: Started reading broadcast variable
1
14/12/26 10:13:46 INFO MemoryStore: ensureFreeSpace(1611) called with
curMem=0, maxMem=278302556
14/12/26 10:13:46 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes
in memory (estimated size 1611.0 B, free 265.4 MB)
14/12/26 10:13:46 INFO BlockManagerMaster: Updated info of block
broadcast_1_piece0
14/12/26 10:13:46 INFO TorrentBroadcast: Reading broadcast variable 1 took
229 ms
14/12/26 10:13:46 INFO MemoryStore: ensureFreeSpace(2720) called with
curMem=1611, maxMem=278302556
14/12/26 10:13:47 INFO MemoryStore: Block broadcast_1 stored as values in
memory (estimated size 2.7 KB, free 265.4 MB)
14/12/26 10:13:47 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 1
14/12/26 10:13:47 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
14/12/26 10:13:47 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 2
14/12/26 10:13:47 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
14/12/26 10:13:47 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 3
14/12/26 10:13:47 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
14/12/26 10:13:47 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 4
14/12/26 10:13:47 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
14/12/26 10:13:47 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 5
14/12/26 10:13:47 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
14/12/26 10:13:47 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 6
14/12/26 10:13:47 INFO Executor: Running task 6.0 in stage 0.0 (TID 6)
14/12/26 10:13:47 INFO Executor: Finished task 6.0 in stage 0.0 (TID 6). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 7
14/12/26 10:13:47 INFO Executor: Running task 7.0 in stage 0.0 (TID 7)
14/12/26 10:13:47 INFO Executor: Finished task 7.0 in stage 0.0 (TID 7). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 8
14/12/26 10:13:47 INFO Executor: Running task 8.0 in stage 0.0 (TID 8)
14/12/26 10:13:47 INFO Executor: Finished task 8.0 in stage 0.0 (TID 8). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 9
14/12/26 10:13:47 INFO Executor: Running task 9.0 in stage 0.0 (TID 9)
14/12/26 10:13:47 INFO Executor: Finished task 9.0 in stage 0.0 (TID 9). 927
bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 10
14/12/26 10:13:47 INFO Executor: Running task 10.0 in stage 0.0 (TID 10)
14/12/26 10:13:47 INFO Executor: Finished task 10.0 in stage 0.0 (TID 10).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 11
14/12/26 10:13:47 INFO Executor: Running task 11.0 in stage 0.0 (TID 11)
14/12/26 10:13:47 INFO Executor: Finished task 11.0 in stage 0.0 (TID 11).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 12
14/12/26 10:13:47 INFO Executor: Running task 12.0 in stage 0.0 (TID 12)
14/12/26 10:13:47 INFO Executor: Finished task 12.0 in stage 0.0 (TID 12).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 13
14/12/26 10:13:47 INFO Executor: Running task 13.0 in stage 0.0 (TID 13)
14/12/26 10:13:47 INFO Executor: Finished task 13.0 in stage 0.0 (TID 13).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 14
14/12/26 10:13:47 INFO Executor: Running task 14.0 in stage 0.0 (TID 14)
14/12/26 10:13:47 INFO Executor: Finished task 14.0 in stage 0.0 (TID 14).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 15
14/12/26 10:13:47 INFO Executor: Running task 15.0 in stage 0.0 (TID 15)
14/12/26 10:13:47 INFO Executor: Finished task 15.0 in stage 0.0 (TID 15).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 16
14/12/26 10:13:47 INFO Executor: Running task 16.0 in stage 0.0 (TID 16)
14/12/26 10:13:47 INFO Executor: Finished task 16.0 in stage 0.0 (TID 16).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 17
14/12/26 10:13:47 INFO Executor: Running task 17.0 in stage 0.0 (TID 17)
14/12/26 10:13:47 INFO Executor: Finished task 17.0 in stage 0.0 (TID 17).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 18
14/12/26 10:13:47 INFO Executor: Running task 18.0 in stage 0.0 (TID 18)
14/12/26 10:13:47 INFO Executor: Finished task 18.0 in stage 0.0 (TID 18).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 19
14/12/26 10:13:47 INFO Executor: Running task 19.0 in stage 0.0 (TID 19)
14/12/26 10:13:47 INFO Executor: Finished task 19.0 in stage 0.0 (TID 19).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 20
14/12/26 10:13:47 INFO Executor: Running task 20.0 in stage 0.0 (TID 20)
14/12/26 10:13:47 INFO Executor: Finished task 20.0 in stage 0.0 (TID 20).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 21
14/12/26 10:13:47 INFO Executor: Running task 21.0 in stage 0.0 (TID 21)
14/12/26 10:13:47 INFO Executor: Finished task 21.0 in stage 0.0 (TID 21).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 22
14/12/26 10:13:47 INFO Executor: Running task 22.0 in stage 0.0 (TID 22)
14/12/26 10:13:47 INFO Executor: Finished task 22.0 in stage 0.0 (TID 22).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 23
14/12/26 10:13:47 INFO Executor: Running task 23.0 in stage 0.0 (TID 23)
14/12/26 10:13:47 INFO Executor: Finished task 23.0 in stage 0.0 (TID 23).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 24
14/12/26 10:13:47 INFO Executor: Running task 24.0 in stage 0.0 (TID 24)
14/12/26 10:13:47 INFO Executor: Finished task 24.0 in stage 0.0 (TID 24).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 25
14/12/26 10:13:47 INFO Executor: Running task 25.0 in stage 0.0 (TID 25)
14/12/26 10:13:47 INFO Executor: Finished task 25.0 in stage 0.0 (TID 25).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 26
14/12/26 10:13:47 INFO Executor: Running task 26.0 in stage 0.0 (TID 26)
14/12/26 10:13:47 INFO Executor: Finished task 26.0 in stage 0.0 (TID 26).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 27
14/12/26 10:13:47 INFO Executor: Running task 27.0 in stage 0.0 (TID 27)
14/12/26 10:13:47 INFO Executor: Finished task 27.0 in stage 0.0 (TID 27).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 28
14/12/26 10:13:47 INFO Executor: Running task 28.0 in stage 0.0 (TID 28)
14/12/26 10:13:47 INFO Executor: Finished task 28.0 in stage 0.0 (TID 28).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 29
14/12/26 10:13:47 INFO Executor: Running task 29.0 in stage 0.0 (TID 29)
14/12/26 10:13:47 INFO Executor: Finished task 29.0 in stage 0.0 (TID 29).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 30
14/12/26 10:13:47 INFO Executor: Running task 30.0 in stage 0.0 (TID 30)
14/12/26 10:13:47 INFO Executor: Finished task 30.0 in stage 0.0 (TID 30).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 31
14/12/26 10:13:47 INFO Executor: Running task 31.0 in stage 0.0 (TID 31)
14/12/26 10:13:47 INFO Executor: Finished task 31.0 in stage 0.0 (TID 31).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 32
14/12/26 10:13:47 INFO Executor: Running task 32.0 in stage 0.0 (TID 32)
14/12/26 10:13:47 INFO Executor: Finished task 32.0 in stage 0.0 (TID 32).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 33
14/12/26 10:13:47 INFO Executor: Running task 33.0 in stage 0.0 (TID 33)
14/12/26 10:13:47 INFO Executor: Finished task 33.0 in stage 0.0 (TID 33).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 34
14/12/26 10:13:47 INFO Executor: Running task 34.0 in stage 0.0 (TID 34)
14/12/26 10:13:47 INFO Executor: Finished task 34.0 in stage 0.0 (TID 34).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 35
14/12/26 10:13:47 INFO Executor: Running task 35.0 in stage 0.0 (TID 35)
14/12/26 10:13:47 INFO Executor: Finished task 35.0 in stage 0.0 (TID 35).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 36
14/12/26 10:13:47 INFO Executor: Running task 36.0 in stage 0.0 (TID 36)
14/12/26 10:13:47 INFO Executor: Finished task 36.0 in stage 0.0 (TID 36).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 37
14/12/26 10:13:47 INFO Executor: Running task 37.0 in stage 0.0 (TID 37)
14/12/26 10:13:47 INFO Executor: Finished task 37.0 in stage 0.0 (TID 37).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 38
14/12/26 10:13:47 INFO Executor: Running task 38.0 in stage 0.0 (TID 38)
14/12/26 10:13:47 INFO Executor: Finished task 38.0 in stage 0.0 (TID 38).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 39
14/12/26 10:13:47 INFO Executor: Running task 39.0 in stage 0.0 (TID 39)
14/12/26 10:13:47 INFO Executor: Finished task 39.0 in stage 0.0 (TID 39).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 40
14/12/26 10:13:47 INFO Executor: Running task 40.0 in stage 0.0 (TID 40)
14/12/26 10:13:47 INFO Executor: Finished task 40.0 in stage 0.0 (TID 40).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 41
14/12/26 10:13:47 INFO Executor: Running task 41.0 in stage 0.0 (TID 41)
14/12/26 10:13:47 INFO Executor: Finished task 41.0 in stage 0.0 (TID 41).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 42
14/12/26 10:13:47 INFO Executor: Running task 42.0 in stage 0.0 (TID 42)
14/12/26 10:13:47 INFO Executor: Finished task 42.0 in stage 0.0 (TID 42).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 43
14/12/26 10:13:47 INFO Executor: Running task 43.0 in stage 0.0 (TID 43)
14/12/26 10:13:47 INFO Executor: Finished task 43.0 in stage 0.0 (TID 43).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 44
14/12/26 10:13:47 INFO Executor: Running task 44.0 in stage 0.0 (TID 44)
14/12/26 10:13:47 INFO Executor: Finished task 44.0 in stage 0.0 (TID 44).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 45
14/12/26 10:13:47 INFO Executor: Running task 45.0 in stage 0.0 (TID 45)
14/12/26 10:13:47 INFO Executor: Finished task 45.0 in stage 0.0 (TID 45).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 46
14/12/26 10:13:47 INFO Executor: Running task 46.0 in stage 0.0 (TID 46)
14/12/26 10:13:47 INFO Executor: Finished task 46.0 in stage 0.0 (TID 46).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 47
14/12/26 10:13:47 INFO Executor: Running task 47.0 in stage 0.0 (TID 47)
14/12/26 10:13:47 INFO Executor: Finished task 47.0 in stage 0.0 (TID 47).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 48
14/12/26 10:13:47 INFO Executor: Running task 48.0 in stage 0.0 (TID 48)
14/12/26 10:13:47 INFO Executor: Finished task 48.0 in stage 0.0 (TID 48).
927 bytes result sent to driver
14/12/26 10:13:47 INFO CoarseGrainedExecutorBackend: Got assigned task 49
14/12/26 10:13:47 INFO Executor: Running task 49.0 in stage 0.0 (TID 49)
14/12/26 10:13:48 INFO Executor: Finished task 49.0 in stage 0.0 (TID 49).
927 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 50
14/12/26 10:13:48 INFO Executor: Running task 0.0 in stage 1.0 (TID 50)
14/12/26 10:13:48 INFO MapOutputTrackerWorker: Updating epoch to 1 and
clearing cache
14/12/26 10:13:48 INFO TorrentBroadcast: Started reading broadcast variable
2
14/12/26 10:13:48 INFO MemoryStore: ensureFreeSpace(1384) called with
curMem=4331, maxMem=278302556
14/12/26 10:13:48 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes
in memory (estimated size 1384.0 B, free 265.4 MB)
14/12/26 10:13:48 INFO BlockManagerMaster: Updated info of block
broadcast_2_piece0
14/12/26 10:13:48 INFO TorrentBroadcast: Reading broadcast variable 2 took
12 ms
14/12/26 10:13:48 INFO MemoryStore: ensureFreeSpace(2232) called with
curMem=5715, maxMem=278302556
14/12/26 10:13:48 INFO MemoryStore: Block broadcast_2 stored as values in
memory (estimated size 2.2 KB, free 265.4 MB)
14/12/26 10:13:48 INFO MapOutputTrackerWorker: Don't have map outputs for
shuffle 0, fetching them
14/12/26 10:13:48 INFO MapOutputTrackerWorker: Doing the fetch; tracker
actor =
Actor[akka.tcp://sparkdri...@ip-172-31-18-138.ap-southeast-1.compute.internal:51708/user/MapOutputTracker#-1035801541]
14/12/26 10:13:48 INFO MapOutputTrackerWorker: Got the output locations
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 7 ms
14/12/26 10:13:48 INFO Executor: Finished task 0.0 in stage 1.0 (TID 50).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 51
14/12/26 10:13:48 INFO Executor: Running task 1.0 in stage 1.0 (TID 51)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 1.0 in stage 1.0 (TID 51).
1058 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 52
14/12/26 10:13:48 INFO Executor: Running task 2.0 in stage 1.0 (TID 52)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 2.0 in stage 1.0 (TID 52).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 53
14/12/26 10:13:48 INFO Executor: Running task 3.0 in stage 1.0 (TID 53)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 3.0 in stage 1.0 (TID 53).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 54
14/12/26 10:13:48 INFO Executor: Running task 4.0 in stage 1.0 (TID 54)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 4.0 in stage 1.0 (TID 54).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 55
14/12/26 10:13:48 INFO Executor: Running task 5.0 in stage 1.0 (TID 55)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 5.0 in stage 1.0 (TID 55).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 56
14/12/26 10:13:48 INFO Executor: Running task 6.0 in stage 1.0 (TID 56)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 6.0 in stage 1.0 (TID 56).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 57
14/12/26 10:13:48 INFO Executor: Running task 7.0 in stage 1.0 (TID 57)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 7.0 in stage 1.0 (TID 57).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 58
14/12/26 10:13:48 INFO Executor: Running task 8.0 in stage 1.0 (TID 58)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 8.0 in stage 1.0 (TID 58).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 59
14/12/26 10:13:48 INFO Executor: Running task 9.0 in stage 1.0 (TID 59)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 9.0 in stage 1.0 (TID 59).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 60
14/12/26 10:13:48 INFO Executor: Running task 10.0 in stage 1.0 (TID 60)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 10.0 in stage 1.0 (TID 60).
1063 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 61
14/12/26 10:13:48 INFO Executor: Running task 11.0 in stage 1.0 (TID 61)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 11.0 in stage 1.0 (TID 61).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 62
14/12/26 10:13:48 INFO Executor: Running task 12.0 in stage 1.0 (TID 62)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 12.0 in stage 1.0 (TID 62).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 63
14/12/26 10:13:48 INFO Executor: Running task 13.0 in stage 1.0 (TID 63)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 13.0 in stage 1.0 (TID 63).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 64
14/12/26 10:13:48 INFO Executor: Running task 14.0 in stage 1.0 (TID 64)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 14.0 in stage 1.0 (TID 64).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 65
14/12/26 10:13:48 INFO Executor: Running task 15.0 in stage 1.0 (TID 65)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 3 ms
14/12/26 10:13:48 INFO Executor: Finished task 15.0 in stage 1.0 (TID 65).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 66
14/12/26 10:13:48 INFO Executor: Running task 16.0 in stage 1.0 (TID 66)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 16.0 in stage 1.0 (TID 66).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 67
14/12/26 10:13:48 INFO Executor: Running task 17.0 in stage 1.0 (TID 67)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 17.0 in stage 1.0 (TID 67).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 68
14/12/26 10:13:48 INFO Executor: Running task 18.0 in stage 1.0 (TID 68)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 0 ms
14/12/26 10:13:48 INFO Executor: Finished task 18.0 in stage 1.0 (TID 68).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 69
14/12/26 10:13:48 INFO Executor: Running task 19.0 in stage 1.0 (TID 69)
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty
blocks out of 50 blocks
14/12/26 10:13:48 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches
in 1 ms
14/12/26 10:13:48 INFO Executor: Finished task 19.0 in stage 1.0 (TID 69).
1037 bytes result sent to driver
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 70
14/12/26 10:13:48 INFO Executor: Running task 0.0 in stage 2.0 (TID 70)
14/12/26 10:13:48 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID
70)
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector;
        at java.lang.Class.getDeclaredFields0(Native Method)
        at java.lang.Class.privateGetDeclaredFields(Class.java:2499)
        at java.lang.Class.getDeclaredField(Class.java:1951)
        at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1659)
        at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
        at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:480)
        at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468)
        at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365)
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1706)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1344)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
        at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:985)
        at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
        at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerConnector
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 44 more
14/12/26 10:13:48 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-1,5,main]
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector; [identical stack trace repeated]
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 71
14/12/26 10:13:48 INFO Executor: Running task 0.1 in stage 2.0 (TID 71)
14/12/26 10:13:48 ERROR Executor: Exception in task 0.1 in stage 2.0 (TID 71)
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector; [identical stack trace repeated]
14/12/26 10:13:48 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector; [identical stack trace repeated]
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 72
14/12/26 10:13:48 INFO Executor: Running task 0.2 in stage 2.0 (TID 72)
14/12/26 10:13:48 ERROR Executor: Exception in task 0.2 in stage 2.0 (TID 72)
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector; [identical stack trace repeated]
14/12/26 10:13:48 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector; [identical stack trace repeated]
14/12/26 10:13:48 INFO CoarseGrainedExecutorBackend: Got assigned task 73
14/12/26 10:13:48 INFO Executor: Running task 0.3 in stage 2.0 (TID 73)
14/12/26 10:13:48 ERROR Executor: Exception in task 0.3 in stage 2.0 (TID 73)
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector; [identical stack trace repeated]
14/12/26 10:13:48 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-0,5,main]
java.lang.NoClassDefFoundError: Lkafka/consumer/ConsumerConnector;
        at java.lang.Class.getDeclaredFields0(Native Method)
        at java.lang.Class.privateGetDeclaredFields(Class.java:2499)
        at java.lang.Class.getDeclaredField(Class.java:1951)
        at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1659)
        at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
        at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:480)
        at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468)
        at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365)
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1706)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1344)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:500)
        at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
        at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:985)
        at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
        at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: kafka.consumer.ConsumerConnector
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 44 more
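
The root cause in the trace is that kafka.consumer.ConsumerConnector is not visible on the executor classpath when the task deserializes, even though it may be packaged in the uber jar. One hedged workaround, assuming the job is launched with spark-submit (the class name, master URL, and jar paths below are illustrative placeholders, not taken from the original post), is to ship the Kafka client jar to the executors explicitly:

```shell
# Sketch only: explicitly distribute the Kafka client jar to executors via
# --jars, in addition to the application uber jar. Substitute the jar path
# and version that match your build; the names here are made up.
spark-submit \
  --class com.example.MyStreamingJob \
  --master spark://master:7077 \
  --jars /path/to/kafka_2.10-0.8.1.1.jar \
  /path/to/my-uber-jar.jar
```

Jars listed in --jars are copied to each executor and placed on its classpath, which sidesteps cases where the uber jar's contents are not visible to the classloader doing the deserialization.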



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Serious-issues-with-class-not-found-exceptions-of-classes-in-uber-jar-tp20863p20864.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
