Hmm, I would not be very happy if the implication is that I have to start
maintaining separate Spark builds for client clusters that use Java 8...

On Mon, Jun 6, 2016 at 4:34 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> Please see:
> https://spark.apache.org/docs/latest/security.html
>
> W.r.t. Java 8, you probably need to rebuild Spark 1.5.2 using Java 8.
>
> Cheers
>
> On Mon, Jun 6, 2016 at 1:19 PM, verylucky...@gmail.com <
> verylucky...@gmail.com> wrote:
>
>> Thank you for your response.
>>
>> I have seen this and a couple of other similar ones about Java SSL in
>> general. However, I am not sure how they apply to Spark, and specifically to
>> my case.
>>
>> The error I mention above occurs when I switch from Java 7 to Java 8 by
>> changing the JAVA_HOME environment variable.
>>
>> The error seems to occur at the time of starting the Jetty HTTP server.
>>
>> Can you please point me to resources that would help me understand how
>> security is managed in Spark, and how changing from Java 7 to 8 can mess up
>> these configurations?
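>>
>> In case it helps narrow this down, here is a minimal standalone sketch (the
>> class name CheckDefaultSsl is just for illustration) that tries the same
>> javax.net.ssl.SSLContext.getDefault() call that fails in my stack trace,
>> independent of Spark. If it also fails when compiled and run under the
>> Java 8 JAVA_HOME, the problem would seem to be in the JRE / trust store
>> setup rather than in Spark itself:
>>
>> import javax.net.ssl.SSLContext;
>>
>> public class CheckDefaultSsl {
>>     public static void main(String[] args) throws Exception {
>>         // Show which JRE and which (optional) explicit trust store are in effect.
>>         System.out.println("java.home = " + System.getProperty("java.home"));
>>         System.out.println("javax.net.ssl.trustStore = "
>>                 + System.getProperty("javax.net.ssl.trustStore"));
>>         // Same call that the SSLOptions/SecurityManager path in the trace ends up making.
>>         SSLContext.getDefault();
>>         System.out.println("Default SSLContext created successfully.");
>>     }
>> }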
>>
>>
>> Thank you!
>>
>> On Mon, Jun 6, 2016 at 2:37 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Have you seen this?
>>>
>>>
>>> http://stackoverflow.com/questions/22423063/java-exception-on-sslsocket-creation
>>>
>>> On Mon, Jun 6, 2016 at 12:31 PM, verylucky Man <verylucky...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> I have a cluster (a Hortonworks-supported system) running Apache Spark
>>>> 1.5.2 on Java 7, installed by the admin. Java 8 is also installed.
>>>>
>>>> I don't have admin access to this cluster and would like to run Spark
>>>> (1.5.2 and later versions) on Java 8.
>>>>
>>>> I come from an HPC/MPI background, so I naively copied all the Spark
>>>> executables from "/usr/hdp/current/spark-client/" into my root folder.
>>>>
>>>> When I run spark-shell from the copied folder, it runs as expected on
>>>> Java 7.
>>>>
>>>> When I change $JAVA_HOME to point to Java 8 and run spark-shell, I get
>>>> the following error.
>>>>
>>>> Could you please help me fix this error?
>>>>
>>>> Exception in thread "main" java.security.NoSuchAlgorithmException: Error constructing implementation (algorithm: Default, provider: SunJSSE, class: sun.security.ssl.SSLContextImpl$DefaultSSLContext)
>>>>     at java.security.Provider$Service.newInstance(Provider.java:1617)
>>>>     at sun.security.jca.GetInstance.getInstance(GetInstance.java:236)
>>>>     at sun.security.jca.GetInstance.getInstance(GetInstance.java:164)
>>>>     at javax.net.ssl.SSLContext.getInstance(SSLContext.java:156)
>>>>     at javax.net.ssl.SSLContext.getDefault(SSLContext.java:96)
>>>>     at org.apache.spark.SSLOptions.liftedTree1$1(SSLOptions.scala:122)
>>>>     at org.apache.spark.SSLOptions.<init>(SSLOptions.scala:114)
>>>>     at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:199)
>>>>     at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:243)
>>>>     at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:118)
>>>>     at org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:187)
>>>>     at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:217)
>>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:949)
>>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>>     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>>     at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>>     at org.apache.spark.repl.Main.main(Main.scala)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:497)
>>>>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
>>>>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>> Caused by: java.io.EOFException
>>>>     at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>>>     at sun.security.provider.JavaKeyStore.engineLoad(JavaKeyStore.java:653)
>>>>     at sun.security.provider.JavaKeyStore$JKS.engineLoad(JavaKeyStore.java:56)
>>>>     at sun.security.provider.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:225)
>>>>     at sun.security.provider.JavaKeyStore$DualFormatJKS.engineLoad(JavaKeyStore.java:70)
>>>>     at java.security.KeyStore.load(KeyStore.java:1445)
>>>>     at sun.security.ssl.TrustManagerFactoryImpl.getCacertsKeyStore(TrustManagerFactoryImpl.java:226)
>>>>     at sun.security.ssl.SSLContextImpl$DefaultSSLContext.getDefaultTrustManager(SSLContextImpl.java:767)
>>>>     at sun.security.ssl.SSLContextImpl$DefaultSSLContext.<init>(SSLContextImpl.java:733)
>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>>>     at java.security.Provider$Service.newInstance(Provider.java:1595)
>>>>     ... 28 more
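>>>>
>>>> The "Caused by: java.io.EOFException ... JavaKeyStore.engineLoad" part of
>>>> the trace suggests the Java 8 JRE hit a truncated or empty keystore while
>>>> loading its default trust store (normally $JAVA_HOME/jre/lib/security/cacerts,
>>>> unless javax.net.ssl.trustStore points somewhere else). A small sketch like
>>>> the following (class name again just for illustration), assuming the default
>>>> cacerts location and the conventional "changeit" password, might confirm
>>>> whether that file itself is readable under Java 8:
>>>>
>>>> import java.io.FileInputStream;
>>>> import java.nio.file.Paths;
>>>> import java.security.KeyStore;
>>>>
>>>> public class CheckCacerts {
>>>>     public static void main(String[] args) throws Exception {
>>>>         // java.home points at the running JRE; cacerts lives under lib/security.
>>>>         String path = Paths.get(System.getProperty("java.home"),
>>>>                 "lib", "security", "cacerts").toString();
>>>>         System.out.println("Loading trust store: " + path);
>>>>         KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
>>>>         try (FileInputStream in = new FileInputStream(path)) {
>>>>             // "changeit" is the usual default cacerts password; this is an
>>>>             // assumption and may differ on a locked-down cluster.
>>>>             ks.load(in, "changeit".toCharArray());
>>>>         }
>>>>         System.out.println("Loaded " + ks.size() + " trusted certificates.");
>>>>     }
>>>> }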
>>>>
>>>>
>>>>
>>>
>>
>
