Cluster hangs in 'ssh-ready' state using Spark 1.2 EC2 launch script
Originally posted here: http://stackoverflow.com/questions/28002443/cluster-hangs-in-ssh-ready-state-using-spark-1-2-ec2-launch-script

I'm trying to launch a standalone Spark cluster using its pre-packaged EC2 scripts, but it hangs indefinitely in the 'ssh-ready' state:

    ubuntu@machine:~/spark-1.2.0-bin-hadoop2.4$ ./ec2/spark-ec2 -k key-pair -i identity-file.pem -r us-west-2 -s 3 launch test
    Setting up security groups...
    Searching for existing cluster test...
    Spark AMI: ami-ae6e0d9e
    Launching instances...
    Launched 3 slaves in us-west-2c, regid = r-b___6
    Launched master in us-west-2c, regid = r-0__0
    Waiting for all instances in cluster to enter 'ssh-ready' state..

Yet I can SSH into these instances without complaint:

    ubuntu@machine:~$ ssh -i identity-file.pem root@master-ip
    Last login: Day MMM DD HH:mm:ss 20YY from c-AA-BBB--DDD.eee1.ff.provider.net

           __|  __|_  )
           _|  (     /   Amazon Linux AMI
          ___|\___|___|

    https://aws.amazon.com/amazon-linux-ami/2013.03-release-notes/
    There are 59 security update(s) out of 257 total update(s) available
    Run sudo yum update to apply all updates.
    Amazon Linux version 2014.09 is available.

    [root@ip-internal ~]$

I'm trying to figure out whether this is a problem with AWS or with the Spark scripts. I had never hit this issue until recently.

--
Nathan Murthy // 713.884.7110 (mobile) // @natemurthy
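One way to narrow this down: spark-ec2 decides 'ssh-ready' by repeatedly running a trivial command over ssh against each instance from ec2/spark_ec2.py, so replicating a similar probe by hand shows whether the instances themselves or the script's ssh invocation are at fault. The sketch below is an assumption-laden stand-in, not the script's actual code; the host list, key path, and root user are placeholders taken from the output above.

    # probe_ssh.py -- a minimal, hypothetical ssh-readiness probe, similar in
    # spirit to the polling spark-ec2 does while waiting for 'ssh-ready'.
    # HOSTS, KEY, and USER are placeholders, not values read from the script.
    import subprocess
    import time

    HOSTS = ["master-ip", "slave-1-ip", "slave-2-ip", "slave-3-ip"]  # placeholders
    KEY = "identity-file.pem"   # the same key passed to spark-ec2 via -i
    USER = "root"               # the user shown logging in above

    def ssh_ready(host):
        """Return True if a trivial command succeeds over ssh on the host."""
        cmd = ["ssh", "-i", KEY,
               "-o", "StrictHostKeyChecking=no",
               "-o", "ConnectTimeout=5",
               "-o", "BatchMode=yes",   # fail fast instead of prompting
               "%s@%s" % (USER, host), "true"]
        return subprocess.call(cmd) == 0

    while True:
        pending = [h for h in HOSTS if not ssh_ready(h)]
        if not pending:
            print("all hosts ssh-ready")
            break
        print("still waiting on: %s" % ", ".join(pending))
        time.sleep(10)

If a loop like this reports every host ready while spark-ec2 still hangs, the difference likely lies in the exact ssh options or login user the script uses, which would point at the script rather than AWS.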
Re: Using data in RDD to specify HDFS directory to write to
I'm experiencing the same problem when I try to run my app on a standalone Spark cluster. My use case, however, is closer to the problem documented in this thread: http://apache-spark-user-list.1001560.n3.nabble.com/Please-help-running-a-standalone-app-on-a-Spark-cluster-td1596.html. The solution proposed there did not work for me.