Hi!

Looks like that experience should be improved.

Do you know why you are getting conflicts on the FastHashMap class, even
though the core Flink dependencies are marked as "provided"? Does adding
the Kafka connector pull in all the core Flink dependencies?
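
If you want to check where those classes come from, something like the
sbt-dependency-graph plugin's `whatDependsOn` task should show which module
drags them in (a sketch; it assumes you have that plugin added to your
project):

```shell
# Run from the project directory; requires the sbt-dependency-graph plugin.
# Shows every dependency path leading to commons-collections 3.2.2.
sbt "whatDependsOn commons-collections commons-collections 3.2.2"
```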

Concerning the Kafka connector: we did not include the connectors in the
distribution, because that would bloat the distribution with a huge
number of transitive dependencies from the connectors.

Greetings,
Stephan


On Fri, Feb 12, 2016 at 10:44 PM, shikhar <shik...@schmizz.net> wrote:

> Repro at https://github.com/shikhar/flink-sbt-fatjar-troubles, run `sbt
> assembly`
>
> A fat jar seems like the best way to provide jobs for Flink to execute.
>
> I am declaring deps like:
> {noformat}
> "org.apache.flink" %% "flink-clients" % "1.0-SNAPSHOT" % "provided"
> "org.apache.flink" %% "flink-streaming-scala" % "1.0-SNAPSHOT" % "provided"
> "org.apache.flink" %% "flink-connector-kafka-0.8" % "1.0-SNAPSHOT"
> {noformat}
>
> Connectors aren't included in the distribution so can't mark the Kafka
> connector as 'provided'.
>
> Using sbt-assembly plugin and running the 'assembly' task, I get lots of
> failures because:
>
> ```
> [error] deduplicate: different file contents found in the following:
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-connector-kafka-0.8_2.11/jars/flink-connector-kafka-0.8_2.11-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-connector-kafka-base_2.11/jars/flink-connector-kafka-base_2.11-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-streaming-java_2.11/jars/flink-streaming-java_2.11-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-core/jars/flink-core-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-shaded-hadoop2/jars/flink-shaded-hadoop2-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-runtime_2.11/jars/flink-runtime_2.11-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-java/jars/flink-java-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-clients_2.11/jars/flink-clients_2.11-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> [error] /Users/shikhar/.ivy2/cache/org.apache.flink/flink-optimizer_2.11/jars/flink-optimizer_2.11-1.0-SNAPSHOT.jar:org/apache/flink/shaded/com/google/common/util/concurrent/package-info.class
> ```
>
> I tried declaring a MergeStrategy as per
> https://github.com/shikhar/flink-sbt-fatjar-troubles/blob/master/build.sbt#L13-L18,
> which helps with the shading conflicts, but then I get lots of errors from
> conflicts between `commons-collections`, `commons-beanutils`, and
> `commons-beanutils-core`, which are deps pulled in via Flink:
>
> ```
> [error] deduplicate: different file contents found in the following:
> [error] /Users/shikhar/.ivy2/cache/commons-collections/commons-collections/jars/commons-collections-3.2.2.jar:org/apache/commons/collections/FastHashMap.class
> [error] /Users/shikhar/.ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.7.0.jar:org/apache/commons/collections/FastHashMap.class
> [error] /Users/shikhar/.ivy2/cache/commons-beanutils/commons-beanutils-core/jars/commons-beanutils-core-1.8.0.jar:org/apache/commons/collections/FastHashMap.class
> ```
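>
> For reference, a merge strategy along these lines (an untested sketch using
> sbt-assembly's standard `assemblyMergeStrategy` key) is what I mean; note
> that `MergeStrategy.first` silently keeps whichever copy appears first on
> the classpath, which can mask real version conflicts:
>
> ```scala
> // build.sbt -- sketch only; picks the first copy of each duplicated class
> assemblyMergeStrategy in assembly := {
>   // Flink's relocated Guava classes collide across several flink-* jars
>   case PathList("org", "apache", "flink", "shaded", _*) =>
>     MergeStrategy.first
>   // commons-beanutils(-core) bundles copies of commons-collections classes
>   case PathList("org", "apache", "commons", "collections", _*) =>
>     MergeStrategy.first
>   case x =>
>     val oldStrategy = (assemblyMergeStrategy in assembly).value
>     oldStrategy(x)
> }
> ```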
>
> The best way I have found to work around this for now is to also mark the
> flink-kafka connector as a 'provided' dependency and customize flink-dist
> to include it :( I'd really rather not create a custom distribution.
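>
> Concretely, that workaround amounts to changing the connector line above to:
>
> {noformat}
> "org.apache.flink" %% "flink-connector-kafka-0.8" % "1.0-SNAPSHOT" % "provided"
> {noformat}
>
> and then rebuilding flink-dist so the connector jar ends up on the cluster
> classpath anyway.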
>
>
>
> --
> View this message in context:
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Flink-packaging-makes-life-hard-for-SBT-fat-jar-s-tp4897.html
> Sent from the Apache Flink User Mailing List archive at Nabble.com.
>
