Hi Spark folks,

With Spark 1.6, the 'assembly' target for sbt built a fat jar with all of the
main Spark dependencies, which I used for building applications. With Spark 2,
that target no longer builds a Spark assembly; it only builds assemblies for
e.g. Flume and Kafka.

I'm not well versed in Maven or sbt, so I'm not sure how to go about figuring
this out myself.

Is this intended? Or am I missing something?
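
For example, is the intended route now to declare the individual Spark modules
as regular dependencies in an application's own build, rather than compiling
against a single spark-assembly jar? My guess at what that would look like in
build.sbt is below (a rough sketch; the module names, versions, and "provided"
scope are just assumptions on my part):

    // build.sbt -- rough sketch of an application build without a Spark assembly jar.
    // The versions and modules here are guesses, not something I've verified.
    name := "my-spark-app"

    scalaVersion := "2.11.8"

    // Declare the individual Spark modules; "provided" keeps them out of the
    // application jar, since the cluster's Spark installation supplies them at runtime.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.0.0" % "provided"
    )

If that is the expected setup now, a pointer to the relevant docs or migration
notes would be much appreciated.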

Thanks.
