Hi Austin,

In the end I added the following target override for Scala:
```
maven_install(
    artifacts = [
        # testing
        maven.artifact(
            group = "com.google.truth",
            artifact = "truth",
            version = "1.0.1",
        ),
    ] + flink_artifacts(
        addons = FLINK_ADDONS,
        scala_version = FLINK_SCALA_VERSION,
        version = FLINK_VERSION,
    ) + flink_testing_artifacts(
        scala_version = FLINK_SCALA_VERSION,
        version = FLINK_VERSION,
    ),
    fetch_sources = True,
    # This override results in Scala-related classes being removed
    # from the deploy jar, as required (?)
    override_targets = {
        "org.scala-lang.scala-library": "@io_bazel_rules_scala_scala_library//:io_bazel_rules_scala_scala_library",
        "org.scala-lang.scala-reflect": "@io_bazel_rules_scala_scala_reflect//:io_bazel_rules_scala_scala_reflect",
        "org.scala-lang.scala-compiler": "@io_bazel_rules_scala_scala_compiler//:io_bazel_rules_scala_scala_compiler",
        "org.scala-lang.modules.scala-parser-combinators_%s" % FLINK_SCALA_VERSION: "@io_bazel_rules_scala_scala_parser_combinators//:io_bazel_rules_scala_scala_parser_combinators",
        "org.scala-lang.modules.scala-xml_%s" % FLINK_SCALA_VERSION: "@io_bazel_rules_scala_scala_xml//:io_bazel_rules_scala_scala_xml",
    },
    repositories = MAVEN_REPOSITORIES,
)
```

and now it works as expected, meaning:

```
bazel build //src/main/scala/org/example:word_count_deploy.jar
```

produces a jar with both the Flink and Scala-related classes removed (since they are provided by the runtime). I did a quick check and the Flink job runs just fine in a local cluster. It would be nice if the community could confirm that this is indeed the way to build Flink-based Scala applications...

BTW, I updated the repo with the above-mentioned override, in case you want to give it a try: https://github.com/salvalcantara/bazel-flink-scala

--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
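For context, a BUILD file consuming these pins might look like the following sketch. This is only a hypothetical outline matching the `word_count` target path above; the source file name, main class, and the exact `@maven//:...` dep labels are assumptions, not taken from the repo:

```
# src/main/scala/org/example/BUILD (hypothetical sketch)
load("@io_bazel_rules_scala//scala:scala.bzl", "scala_binary")

scala_binary(
    name = "word_count",
    srcs = ["WordCount.scala"],          # assumed source file
    main_class = "org.example.WordCount",  # assumed entry point
    deps = [
        # Resolved by the maven_install above; with the override_targets
        # in place, the Scala standard library comes from rules_scala
        # instead of Maven, so it can be stripped from the deploy jar.
        "@maven//:org_apache_flink_flink_streaming_scala_2_12",
    ],
)
```

The implicit `word_count_deploy.jar` output of `scala_binary` is what the `bazel build` command above refers to.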