Hi,

Thanks very much for the answers. I'm learning Spark every day!
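
For the archives: here's a quick way to check which jar each Netty
generation is loaded from. The package names follow the thread
(org.jboss.netty for Netty 3, io.netty for Netty 4); this is just a sketch
to paste into spark-shell, not anything from Spark's own code:

    // Print the jar that provides a class from each Netty generation.
    // (getCodeSource can be null for bootstrap classes, but not for these jars.)
    println(Class.forName("org.jboss.netty.channel.Channel")
      .getProtectionDomain.getCodeSource.getLocation)
    println(Class.forName("io.netty.channel.Channel")
      .getProtectionDomain.getCodeSource.getLocation)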

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
The Internals of Spark SQL https://bit.ly/spark-sql-internals
The Internals of Spark Structured Streaming
https://bit.ly/spark-structured-streaming
The Internals of Apache Kafka https://bit.ly/apache-kafka-internals
Follow me at https://twitter.com/jaceklaskowski



On Wed, Sep 4, 2019 at 3:15 PM Sean Owen <sro...@gmail.com> wrote:

> Yes, that's right. I don't think Spark's use of ZooKeeper needs any
> ZooKeeper server, so it's safe to exclude Netty 3 in Spark (at least, so
> far so good!)
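
A minimal sketch of the kind of exclusion Sean describes, written as an
sbt dependency (the ZooKeeper version and the io.netty:netty coordinates
for Netty 3 are my assumptions here, not taken from Spark's actual build,
which uses Maven):

    // build.sbt: pull in the ZooKeeper client but keep its Netty 3
    // artifact (io.netty:netty) off the classpath; Spark itself ships Netty 4.
    libraryDependencies +=
      ("org.apache.zookeeper" % "zookeeper" % "3.4.14")
        .exclude("io.netty", "netty")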
>
> On Wed, Sep 4, 2019 at 8:06 AM Steve Loughran
> <ste...@cloudera.com.invalid> wrote:
> >
> > The ZooKeeper client is (or was) based on Netty 3, AFAIK, so if you
> > want to use it for anything, Netty 3 ends up on the classpath.
> >
> > On Tue, Sep 3, 2019 at 5:18 PM Shixiong(Ryan) Zhu <
> shixi...@databricks.com> wrote:
> >>
> >> Yep, historical reasons. And Netty 4 is under a different namespace
> >> (io.netty.* vs. org.jboss.netty.*), so we can use Netty 3 and Netty 4
> >> in the same JVM.
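
A tiny illustration of Ryan's point: the two generations live under
different root packages, so both can sit on one classpath without their
classes colliding. Assuming both jars are present (as in a stock Spark 2.4
distribution):

    // Netty 3 (org.jboss.netty.*) and Netty 4 (io.netty.*) side by side
    // in one JVM; different packages mean no class-name conflicts.
    object NettyCoexistence {
      def main(args: Array[String]): Unit = {
        val buf3 = org.jboss.netty.buffer.ChannelBuffers.buffer(8) // Netty 3 API
        val buf4 = io.netty.buffer.Unpooled.buffer(8)              // Netty 4 API
        println(s"Netty 3 capacity: ${buf3.capacity}, Netty 4 capacity: ${buf4.capacity}")
      }
    }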
> >>
> >> On Tue, Sep 3, 2019 at 6:15 AM Sean Owen <sro...@gmail.com> wrote:
> >>>
> >>> It was for historical reasons; some other transitive dependencies
> >>> needed it. I was actually able to exclude Netty 3 from master just
> >>> last week. Spark uses Netty 4.
> >>>
> >>> On Tue, Sep 3, 2019 at 6:59 AM Jacek Laskowski <ja...@japila.pl>
> wrote:
> >>> >
> >>> > Hi,
> >>> >
> >>> > I just noticed that Spark 2.4.x ships two Netty dependencies of
> >>> > different major versions. Why?
> >>> >
> >>> > jars/netty-all-4.1.17.Final.jar
> >>> > jars/netty-3.9.9.Final.jar
> >>> >
> >>> > Shouldn't one be excluded or perhaps shaded?
> >>> >
> >>> > Regards,
> >>> > Jacek Laskowski
> >>> > ----
> >>> > https://about.me/JacekLaskowski
> >>> > The Internals of Spark SQL https://bit.ly/spark-sql-internals
> >>> > The Internals of Spark Structured Streaming
> https://bit.ly/spark-structured-streaming
> >>> > The Internals of Apache Kafka https://bit.ly/apache-kafka-internals
> >>> > Follow me at https://twitter.com/jaceklaskowski
> >>> >
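
Re the shading option Jacek raises above: a hypothetical sbt-assembly rule
that would relocate Netty 3 classes instead of excluding them (the
shaded.netty3 prefix is made up for illustration; Spark's build does not
do this):

    // build.sbt with sbt-assembly: rewrite org.jboss.netty.* class names
    // into a private package so they cannot clash with anything else.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("org.jboss.netty.**" -> "shaded.netty3.@1").inAll
    )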
> >>>
> >>>
> >> --
> >>
> >> Best Regards,
> >>
> >> Ryan
>
>
>
