[ https://issues.apache.org/jira/browse/SPARK-6206?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14357260#comment-14357260 ]

Joe O commented on SPARK-6206:
------------------------------

OK, closing this one out.

The problem was local configuration. 

I had installed the Google Cloud Services tools, which created a ~/.boto file.

The boto library packaged with Spark picked up the settings in that file, which 
conflicted with the settings I was passing to Spark.

Temporarily renaming the ~/.boto file caused the spark-ec2 script to start 
working again.
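A minimal sketch of the workaround, assuming the conflicting file is at the default ~/.boto location (the `.boto.bak` backup name is just an illustrative choice):

```shell
# If a ~/.boto file exists (e.g. created by the Google Cloud SDK tools),
# move it aside so boto falls back to the settings spark-ec2 provides.
if [ -f "$HOME/.boto" ]; then
    mv "$HOME/.boto" "$HOME/.boto.bak"
fi
# ... run spark-ec2 here ...
# Restore it afterwards with: mv "$HOME/.boto.bak" "$HOME/.boto"
```

Alternatively, boto also honors a `BOTO_CONFIG` environment variable pointing at a config file, so pointing it at an empty file for the duration of the launch may achieve the same effect without renaming anything.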

Documenting everything here in case someone else runs into this problem.

> spark-ec2 script reporting SSL error?
> -------------------------------------
>
>                 Key: SPARK-6206
>                 URL: https://issues.apache.org/jira/browse/SPARK-6206
>             Project: Spark
>          Issue Type: Bug
>          Components: EC2
>    Affects Versions: 1.2.0
>            Reporter: Joe O
>
> I have been using the spark-ec2 script for several months with no problems.
> Recently, when executing a script to launch a cluster I got the following 
> error:
> {code}
> [Errno 185090050] _ssl.c:344: error:0B084002:x509 certificate 
> routines:X509_load_cert_crl_file:system lib
> {code}
> Nothing launches, the script exits.
> I am not sure if something on my machine changed, if this is a problem with 
> EC2's certs, or if it is a problem with Python. 
> It occurs 100% of the time, and has been occurring over at least the last two 
> days. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
