It was only hanging when I specified the path with ~; I never tried a relative path.

It was hanging on the "waiting for ssh to be ready on all hosts" step. I let it
sit for about 10 minutes, then I found the StackOverflow answer that suggested
specifying an absolute path, cancelled, and re-ran with --resume and the
absolute path, and all slaves were up in a couple of minutes.

(I've stood up 4 integration clusters and 2 production clusters on EC2
since then with no problems.)
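For what it's worth, the hang comes down to the identity-file path never being checked before the SSH wait loop. Here is a minimal sketch of the kind of pre-flight validation that would avoid it; the helper name is hypothetical, not the actual spark-ec2 code:

```python
# Hypothetical sketch (not spark-ec2's actual implementation): validate the
# identity file up front, so a bad path fails fast instead of hanging in the
# "waiting for ssh-ready" loop.
import os
import sys

def resolve_identity_file(path):
    # expanduser handles a literal "~" that the shell did not expand
    # (e.g. because the argument was quoted); abspath resolves relative paths.
    resolved = os.path.abspath(os.path.expanduser(path))
    if not os.path.isfile(resolved):
        sys.exit("ERROR: identity file not found: %s" % resolved)
    return resolved
```

With a check like this, both the quoted-tilde case and a typo'd path would error out immediately rather than leaving the launcher polling SSH forever.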

On Wed Jan 28 2015 at 12:05:43 PM Nicholas Chammas <
nicholas.cham...@gmail.com> wrote:

> Ey-chih,
>
> That makes more sense. This is a known issue that will be fixed as part of
> SPARK-5242 <https://issues.apache.org/jira/browse/SPARK-5242>.
>
> Charles,
>
> Thanks for the info. In your case, when does spark-ec2 hang? Only when the
> specified path to the identity file doesn't exist? Or also when you specify
> the path as a relative path or with ~?
>
> Nick
>
>
> On Wed Jan 28 2015 at 9:29:34 AM ey-chih chow <eyc...@hotmail.com> wrote:
>
>> We found the problem and already fixed it. Basically, spark-ec2 requires
>> EC2 instances to have external IP addresses. You need to specify this in
>> the AWS console.
>> ------------------------------
>> From: nicholas.cham...@gmail.com
>> Date: Tue, 27 Jan 2015 17:19:21 +0000
>> Subject: Re: spark 1.2 ec2 launch script hang
>> To: charles.fed...@gmail.com; pzybr...@gmail.com; eyc...@hotmail.com
>> CC: user@spark.apache.org
>>
>>
>> For those who found that absolute vs. relative path for the pem file
>> mattered, what OS and shell are you using? What version of Spark are you
>> using?
>>
>> ~/ vs. an absolute path shouldn’t matter. Your shell will expand the ~/ to
>> the absolute path before sending it to spark-ec2 (i.e., tilde expansion).
>>
>> Absolute vs. relative path (e.g. ../../path/to/pem) also shouldn’t
>> matter, since we fixed that for Spark 1.2.0
>> <https://issues.apache.org/jira/browse/SPARK-4137>. Maybe there’s some
>> case that we missed?
>>
>> Nick
>>
>> On Tue Jan 27 2015 at 10:10:29 AM Charles Feduke <
>> charles.fed...@gmail.com> wrote:
>>
>>
Absolute path means no ~, and also verify that the path to the file is
>> correct. For some reason the Python code does not validate that the file
>> exists and will hang (this is the same reason ~ hangs).
>> On Mon, Jan 26, 2015 at 10:08 PM Pete Zybrick <pzybr...@gmail.com> wrote:
>>
>> Try using an absolute path to the pem file
>>
>>
>>
>> > On Jan 26, 2015, at 8:57 PM, ey-chih chow <eyc...@hotmail.com> wrote:
>> >
>> > Hi,
>> >
>> > I used the spark-ec2 script of spark 1.2 to launch a cluster.  I have
>> > modified the script according to
>> >
>> > https://github.com/grzegorz-dubicki/spark/commit/5dd8458d2ab9753aae939b3bb33be953e2c13a70
>> >
>> > But the script was still hung at the following message:
>> >
>> > Waiting for cluster to enter 'ssh-ready'
>> > state.............................................
>> >
>> > Any additional thing I should do to make it succeed?  Thanks.
>> >
>> >
>> > Ey-Chih Chow
>> >
>> >
>> >
>> > --
>> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-1-2-ec2-launch-script-hang-tp21381.html
>> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> > For additional commands, e-mail: user-h...@spark.apache.org
>> >
>>
>>
>>
>>
>>
>