Yes, it fails when run manually as well. I can't really use 'yum search' to
find exact package names to try alternatives, but I tried 'yum install
python-pip' and it gave the same error.
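
For reference, the workaround I put into setup.sh is roughly this shape
(simplified; the real script does more, and the fallback message here is just
illustrative):

```shell
# Try to install pssh the way setup.sh does; if the mirror rejects the
# request (e.g. with the 403 I'm seeing) and the install fails, fall back
# to driving the slaves with plain ssh instead of pssh.
if yum install -y pssh >/dev/null 2>&1; then
  echo "pssh available"
else
  echo "pssh install failed; falling back to plain ssh"
fi
```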

I will post this at the link you pointed out. Thanks!

On Thu, Apr 14, 2016 at 6:11 PM, Nicholas Chammas <
nicholas.cham...@gmail.com> wrote:

> If you log into the cluster and manually try that step does it still fail?
> Can you yum install anything else?
>
> You might want to report this issue directly on the spark-ec2 repo, btw:
> https://github.com/amplab/spark-ec2
>
> Nick
>
> On Thu, Apr 14, 2016 at 9:08 PM sanusha <anusha.s.peru...@gmail.com>
> wrote:
>
>>
>> I am using spark-1.6.1-prebuilt-with-hadoop-2.6 on Mac, with the
>> spark-ec2 script to launch a cluster in an Amazon VPC. The setup.sh
>> script [run first thing on the master after launch] uses pssh and tries
>> to install it via 'yum install -y pssh'. This step always fails on the
>> master AMI that the script uses by default, because yum cannot find the
>> package in the repo mirrors - it hits a 403.
>>
>> Has anyone faced this, and does anyone know what's causing it? For now,
>> I have changed the script to not use pssh as a workaround, but I would
>> like to fix the root cause.
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/spark-ec2-hitting-yum-install-issues-tp26786.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>


-- 
Anusha
