Nikita ROUSSEAU created SPARK-35954:
---------------------------------------

             Summary: Upgrade Apache Curator Dependency to 4.2.0
                 Key: SPARK-35954
                 URL: https://issues.apache.org/jira/browse/SPARK-35954
             Project: Spark
          Issue Type: Dependency upgrade
          Components: Deploy
    Affects Versions: 3.1.2
         Environment: * OS: Linux
 * JAVA: 1.8.0_292
 * {color:#FF0000}*hadoop-3.3.1*{color}

 
            Reporter: Nikita ROUSSEAU


+Abstract:+ As a Spark cluster administrator, I want to connect Spark masters 
deployed in HA mode to ZooKeeper over SSL/TLS, so that the network traffic 
between my components is encrypted. 
([https://spark.apache.org/docs/latest/spark-standalone.html#standby-masters-with-zookeeper])
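
For context, this is the standalone HA setup targeted here. The {{spark.deploy.*}} 
recovery properties below are the documented Spark settings; the {{zookeeper.*}} 
system properties are the ZooKeeper 3.5.x client-side TLS switches that would 
additionally have to reach the Curator-managed client. How they get wired through 
Spark is an assumption, not something Spark exposes today; hosts, paths and 
passwords are placeholders:

{code:bash}
# spark-env.sh: standby masters with ZooKeeper (documented Spark settings)
SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2281,zk2:2281,zk3:2281 \
  -Dspark.deploy.zookeeper.dir=/spark"

# ZooKeeper 3.5.x client TLS system properties (assumed wiring: the Curator /
# ZooKeeper client inside the master would have to honour them)
SPARK_DAEMON_JAVA_OPTS="$SPARK_DAEMON_JAVA_OPTS \
  -Dzookeeper.client.secure=true \
  -Dzookeeper.clientCnxnSocket=org.apache.zookeeper.ClientCnxnSocketNetty \
  -Dzookeeper.ssl.keyStore.location=/etc/spark/tls/keystore.jks \
  -Dzookeeper.ssl.keyStore.password=changeit \
  -Dzookeeper.ssl.trustStore.location=/etc/spark/tls/truststore.jks \
  -Dzookeeper.ssl.trustStore.password=changeit"
{code}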

 

With the release of Hadoop 3.3.1, the ZKFC libraries and their dependencies were 
updated: it is now possible to connect ZKFC to ZooKeeper over TLS.

 

+Note:+ TLS is supported with ZooKeeper server version >= 3.5.6 
([https://docs.confluent.io/platform/current/installation/versions-interoperability.html#zk])

 

Spark 3.2.0 aims to support Hadoop 3.3.1; this Hadoop release bundles the 
following shared libraries:
 * curator-client-4.2.0.jar
 * curator-framework-4.2.0.jar
 * curator-recipes-4.2.0.jar
 * zookeeper-3.5.6.jar
 * zookeeper-jute-3.5.6.jar

 

Currently, the Spark dependency on the Curator framework is pinned to 2.13.0 
([https://github.com/apache/spark/blob/master/pom.xml#L127]).

 

It would be great to update the "curator-*" dependencies to 4.2.0 in order to 
be compatible with the shared jars of the Hadoop stack, as sketched below.
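
Concretely, the bump would land in the root {{pom.xml}}. This is illustrative 
only: {{curator.version}} is the existing property at the line linked above, 
and the {{zookeeper.version}} change is listed on the assumption that the 
client must be on the 3.5.x line for TLS:

{code:xml}
<!-- pom.xml dependency properties (illustrative) -->
<curator.version>4.2.0</curator.version>
<zookeeper.version>3.5.6</zookeeper.version>
{code}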

Moreover, it would allow administrators to connect Spark masters to ZooKeeper 
over TLS.

 

Some patches will be required, such as:
 * 
[https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkCuratorUtil.scala#L51] (see the sketch below)
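
A minimal sketch of how {{SparkCuratorUtil.newClient}} could move to the 
Curator 4.x builder API and inject a TLS-capable ZooKeeper client. The 
{{spark.deploy.zookeeper.ssl.enabled}} key and the keystore handling are 
assumptions, not existing Spark configuration; the timeout/retry constants are 
the ones already defined in {{SparkCuratorUtil}}:

{code:scala}
import org.apache.curator.framework.{CuratorFramework, CuratorFrameworkFactory}
import org.apache.curator.retry.ExponentialBackoffRetry
import org.apache.curator.utils.ZookeeperFactory
import org.apache.spark.SparkConf
import org.apache.zookeeper.{Watcher, ZooKeeper}
import org.apache.zookeeper.client.ZKClientConfig

// Builds ZooKeeper handles with the 3.5.x client-side TLS switches enabled.
private def tlsZooKeeperFactory(): ZookeeperFactory = new ZookeeperFactory {
  override def newZooKeeper(
      connectString: String,
      sessionTimeout: Int,
      watcher: Watcher,
      canBeReadOnly: Boolean): ZooKeeper = {
    val clientConfig = new ZKClientConfig()
    clientConfig.setProperty(ZKClientConfig.SECURE_CLIENT, "true")
    clientConfig.setProperty(ZKClientConfig.ZOOKEEPER_CLIENT_CNXN_SOCKET,
      "org.apache.zookeeper.ClientCnxnSocketNetty")
    // Keystore/truststore locations would be read from SparkConf here (assumed keys).
    new ZooKeeper(connectString, sessionTimeout, watcher, canBeReadOnly, clientConfig)
  }
}

def newClient(conf: SparkConf): CuratorFramework = {
  val builder = CuratorFrameworkFactory.builder()
    .connectString(conf.get("spark.deploy.zookeeper.url"))
    .sessionTimeoutMs(ZK_SESSION_TIMEOUT_MILLIS)
    .connectionTimeoutMs(ZK_CONNECTION_TIMEOUT_MILLIS)
    .retryPolicy(new ExponentialBackoffRetry(RETRY_WAIT_MILLIS, MAX_RECONNECT_ATTEMPTS))
  // Assumed configuration key; administrators opt in to TLS towards ZooKeeper.
  if (conf.getBoolean("spark.deploy.zookeeper.ssl.enabled", false)) {
    builder.zookeeperFactory(tlsZooKeeperFactory())
  }
  val zk = builder.build()
  zk.start()
  zk
}
{code}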

 

I will try to prepare a PR for this.

 

 


