Try running the spark-shell against your standalone master (MASTER=spark://yourmasterurl:7077 $SPARK_HOME/bin/spark-shell) and do a small count (val d = sc.parallelize(1 to 1000).count()). If that fails too, then something is wrong with your cluster setup, since the log is reporting "Connection refused: node001/10.180.49.228:".
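As a minimal sketch of that sanity check (assuming the default standalone master port 7077; "yourmasterurl" is a placeholder for your actual master hostname, and the Scala lines are typed at the spark-shell prompt, not run as a script):

```shell
# Launch the shell pointed at the standalone master (placeholder hostname).
MASTER=spark://yourmasterurl:7077 $SPARK_HOME/bin/spark-shell

# Then, at the scala> prompt, run a tiny job that exercises the executors:
#   val d = sc.parallelize(1 to 1000)
#   d.count()    // a healthy cluster returns 1000
```

If the count succeeds, the cluster itself is fine and the problem is specific to your application (e.g. classpath or serialization); if it fails with the same "Connection refused" association errors, the workers cannot reach the executor/driver ports, which usually points at hostname resolution or a firewall between the nodes.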
freedafeng wrote:
> The worker side has error message as this:
>
> 14/10/30 18:29:00 INFO Worker: Asked to launch executor app-20141030182900-0006/0 for testspark_v1
> 14/10/30 18:29:01 INFO ExecutorRunner: Launch command: "java" "-cp" "::/root/spark-1.1.0/conf:/root/spark-1.1.0/assembly/target/scala-2.10/spark-assembly-1.1.0-hadoop2.3.0.jar" "-XX:MaxPermSize=128m" "-Dspark.driver.port=52552" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://sparkDriver@master:52552/user/CoarseGrainedScheduler" "0" "node001" "4" "akka.tcp://sparkWorker@node001:60184/user/Worker" "app-20141030182900-0006"
> 14/10/30 18:29:03 INFO Worker: Asked to kill executor app-20141030182900-0006/0
> 14/10/30 18:29:03 INFO ExecutorRunner: Runner thread for executor app-20141030182900-0006/0 interrupted
> 14/10/30 18:29:03 INFO ExecutorRunner: Killing process!
> 14/10/30 18:29:03 ERROR FileAppender: Error writing stream to file /root/spark-1.1.0/work/app-20141030182900-0006/0/stderr
> java.io.IOException: Stream Closed
>         at java.io.FileInputStream.readBytes(Native Method)
>         at java.io.FileInputStream.read(FileInputStream.java:214)
>         at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>         at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1311)
>         at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
> 14/10/30 18:29:04 INFO Worker: Executor app-20141030182900-0006/0 finished with state KILLED exitStatus 143
> 14/10/30 18:29:04 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkWorker/deadLetters] to Actor[akka://sparkWorker/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkWorker%4010.180.49.228%3A52120-22#1336571562] was not delivered. [6] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> 14/10/30 18:29:04 ERROR EndpointWriter: AssociationError [akka.tcp://sparkWorker@node001:60184] -> [akka.tcp://sparkExecutor@node001:37697]: Error [Association failed with [akka.tcp://sparkExecutor@node001:37697]] [
>   akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkExecutor@node001:37697]
>   Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: node001/10.180.49.228:37697
> ]
> [the identical AssociationError is logged twice more at 18:29:04]
>
> Thanks!
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/stage-failure-java-lang-IllegalStateException-unread-block-data-tp17751p20685.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.