Hi Austin,

I followed your instructions and gave `rules_jvm_external` a try.

Overall, I think I've made some progress, but I'm not quite there yet. I
followed the link [1] given by Matthias and made the necessary changes to my
repo:

https://github.com/salvalcantara/bazel-flink-scala
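
For context, the `@maven//:...` labels referenced in the BUILD file below
come from a `maven_install` in the WORKSPACE (this is just a sketch; the
exact versions and artifact list are the ones in the repo above):

```
# WORKSPACE (sketch): maven_install from rules_jvm_external is what makes
# the @maven//:... labels available. Versions here are placeholders, not
# necessarily what the repo pins.
load("@rules_jvm_external//:defs.bzl", "maven_install")

maven_install(
    artifacts = [
        "org.apache.flink:flink-core:1.12.1",
        "org.apache.flink:flink-clients_2.12:1.12.1",
        "org.apache.flink:flink-scala_2.12:1.12.1",
        "org.apache.flink:flink-streaming-scala_2.12:1.12.1",
        "org.apache.flink:flink-streaming-java_2.12:1.12.1",
    ],
    repositories = ["https://repo1.maven.org/maven2"],
)
```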

In particular, the relevant Bazel BUILD file looks like this:

```
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library", "scala_test")

package(default_visibility = ["//visibility:public"])

filegroup(
    name = "scala-main-srcs",
    srcs = glob(["*.scala"]),
)

scala_library(
    name = "flink_app",
    srcs = [":scala-main-srcs"],
    deps = [
        "@maven//:org_apache_flink_flink_core",
        "@maven//:org_apache_flink_flink_clients_2_12",
        "@maven//:org_apache_flink_flink_scala_2_12",
        "@maven//:org_apache_flink_flink_streaming_scala_2_12",
        "@maven//:org_apache_flink_flink_streaming_java_2_12",
    ],
)

java_binary(
    name = "word_count",
    srcs = ["//tools/flink:noop"],
    deploy_env = ["//:default_flink_deploy_env"],
    main_class = "org.example.WordCount",
    deps = [
        ":flink_app",
    ],
)
```

The idea is to use `deploy_env` within `java_binary` to treat the Flink
dependencies as provided. This causes those dependencies to be excluded from
the final fat jar that one gets by running:

```
bazel build //src/main/scala/org/example:flink_app_deploy.jar
```
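
For reference, `deploy_env` expects another `java_binary` whose runtime
classpath describes what the cluster already provides, so
`//:default_flink_deploy_env` is roughly something like this in the root
BUILD file (sketch; the actual definition is in the repo):

```
# Root BUILD file (sketch): the deploy environment is itself a java_binary.
# Its runtime classpath stands for what the Flink cluster provides; the
# main_class is never run, only the classpath matters.
java_binary(
    name = "default_flink_deploy_env",
    main_class = "NotUsed",
    runtime_deps = [
        "@maven//:org_apache_flink_flink_core",
        "@maven//:org_apache_flink_flink_clients_2_12",
        "@maven//:org_apache_flink_flink_scala_2_12",
        "@maven//:org_apache_flink_flink_streaming_scala_2_12",
        "@maven//:org_apache_flink_flink_streaming_java_2_12",
    ],
)
```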

The problem now is that the jar still includes the Scala library, which
should also be dropped from the jar, since it is part of the dependencies
provided by the Flink cluster. I have been reading the blog post in [2], but
without luck so far...
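
My current guess is that the Scala standard library needs to be part of the
deploy environment as well, something along these lines (untested; the
`@maven` label assumes scala-library is also pinned via `rules_jvm_external`,
which may not match how `rules_scala` actually injects it):

```
# Sketch (untested): add the Scala standard library to the deploy
# environment so it is treated as provided and stripped from the deploy jar
# too. The label below is an assumption; adjust it to however scala-library
# is actually resolved in this workspace.
java_binary(
    name = "default_flink_deploy_env",
    main_class = "NotUsed",
    runtime_deps = [
        "@maven//:org_scala_lang_scala_library",
        # ...plus the Flink artifacts listed above
    ],
)
```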

Regards,

Salva

[1]
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Does-anyone-have-an-example-of-Bazel-working-with-Flink-td35898.html

[2]
https://yishanhe.net/address-dependency-conflict-for-bazel-built-scala-spark/


