How about all dependencies? Presumably they will all go in --jars?
What if I have 10 dependencies? Are there any best practices for packaging
apps for Spark 2.0?
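
For instance, is the idea something like this (a rough sketch; the class
name and jar paths are just placeholders)?

    spark-submit \
      --class com.example.MyApp \
      --master yarn \
      --jars /path/to/dep1.jar,/path/to/dep2.jar \
      target/scala-2.11/my-app_2.11-0.1.0.jar

That seems unwieldy once the dependency list grows, hence the question
about best practices.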
Kind regards

On 10 Aug 2016 6:46 pm, "Nick Pentreath" <nick.pentre...@gmail.com> wrote:

> You're correct - Spark 2.0's packaging has moved away from the assembly
> jar.
>
> To build now, use "build/sbt package"
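>
> Roughly (a sketch; the exact path depends on your Scala version):
>
>     ./build/sbt package
>     # rather than one fat spark-assembly jar, a dev build now produces
>     # a directory of jars, something like assembly/target/scala-2.11/jars/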
>
>
> On Wed, 10 Aug 2016 at 19:40, Efe Selcuk <efema...@gmail.com> wrote:
>
>> Hi Spark folks,
>>
>> With Spark 1.6, the 'assembly' target for sbt would build a fat jar with
>> all of the main Spark dependencies for building an application. Against
>> Spark 2, that target no longer builds a Spark assembly, only ones for
>> external modules such as Flume and Kafka.
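>>
>> (For reference, in 1.6 that was roughly:
>>
>>     build/sbt assembly
>>     # produced a single fat jar, something like
>>     # assembly/target/scala-2.10/spark-assembly-<version>-hadoop<version>.jar
>> )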
>>
>> I'm not well versed in Maven or sbt, so I don't know how to go about
>> figuring this out.
>>
>> Is this intended? Or am I missing something?
>>
>> Thanks.
>>
>
