Hello,

I tried to execute a simple Spark application using Spark SQL.

On the first try it worked as I expected, but after that it no longer runs and produces stderr output like the one below:

 

 

Spark Executor Command: "java" "-cp" "::/opt/spark-1.0.2-bin-hadoop2/conf:/opt/spark-1.0.2-bin-hadoop2/lib/spark-assembly-1.0.2-hadoop2.4.0.jar:/opt/hadoop2/etc/hadoop:/opt/hadoop2/etc/hadoop" "-XX:MaxPermSize=128m" "-Xms14336M" "-Xmx14336M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@saturn00:35894/user/CoarseGrainedScheduler" "9" "saturn09" "4" "akka.tcp://sparkWorker@saturn09:45636/user/Worker" "app-20140908223656-0000"
========================================

14/09/08 22:36:57 INFO spark.SecurityManager: Changing view acls to: root
14/09/08 22:36:57 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root)
14/09/08 22:36:57 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/09/08 22:36:57 INFO Remoting: Starting remoting
14/09/08 22:36:57 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@saturn09:44260]
14/09/08 22:36:57 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@saturn09:44260]
14/09/08 22:36:57 INFO executor.CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://spark@saturn00:35894/user/CoarseGrainedScheduler
14/09/08 22:36:57 INFO worker.WorkerWatcher: Connecting to worker akka.tcp://sparkWorker@saturn09:45636/user/Worker
14/09/08 22:36:57 INFO worker.WorkerWatcher: Successfully connected to akka.tcp://sparkWorker@saturn09:45636/user/Worker
14/09/08 22:36:57 INFO executor.CoarseGrainedExecutorBackend: Successfully registered with driver
14/09/08 22:36:57 INFO spark.SecurityManager: Changing view acls to: root
14/09/08 22:36:57 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root)
14/09/08 22:36:58 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/09/08 22:36:58 INFO Remoting: Starting remoting
14/09/08 22:36:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@saturn09:39880]
14/09/08 22:36:58 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@saturn09:39880]
14/09/08 22:36:58 INFO spark.SparkEnv: Connecting to MapOutputTracker: akka.tcp://spark@saturn00:35894/user/MapOutputTracker
14/09/08 22:36:58 INFO spark.SparkEnv: Connecting to BlockManagerMaster: akka.tcp://spark@saturn00:35894/user/BlockManagerMaster
14/09/08 22:36:58 INFO storage.DiskBlockManager: Created local directory at /hadoop/spark/spark-local-20140908223658-5699
14/09/08 22:36:58 INFO storage.MemoryStore: MemoryStore started with capacity 4.0 GB.
14/09/08 22:36:58 INFO network.ConnectionManager: Bound socket to port 49090 with id = ConnectionManagerId(saturn09,49090)
14/09/08 22:36:58 INFO storage.BlockManagerMaster: Trying to register BlockManager
14/09/08 22:36:58 INFO storage.BlockManagerMaster: Registered BlockManager
14/09/08 22:36:58 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-379704ff-05f2-4c93-8814-ffbe1cc8cd53
14/09/08 22:36:58 INFO spark.HttpServer: Starting HTTP Server
14/09/08 22:36:58 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/09/08 22:36:58 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:40257

[akka.tcp://spark@saturn00:35894] disassociated! Shutting down.

 

 

Here, saturn00 is the master, and there are 10 nodes in my cluster (saturn01~saturn10).

 

What does "Driver Disassociated" in the last message of the error mean?

 

How can I resolve this issue?

 

Thanks

// Yoonmin Nam
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org