I just flailed on this a bit before finding this email. Can someone please
update the wiki page below?
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup
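
In the meantime, for anyone else who lands here, a rough sketch of the
commands discussed in the thread below (the build/sbt wrapper path and the
python/run-tests script are assumptions about the usual repo layout, not
something stated in the thread):

    # Plain build; most development no longer needs a fat jar:
    ./build/sbt package

    # Scala/Java tests should run without any packaging step first:
    ./build/sbt test

    # pyspark (and, per the thread, R) tests still need a packaged build
    # first; for the Python side, for example (assumed script name):
    ./build/sbt package
    ./python/run-tests

    # "assembly" is only needed for the remaining assemblies
    # (streaming connectors, YARN shuffle service):
    ./build/sbt assembly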

On Mon, Apr 4, 2016 at 10:01 PM, Reynold Xin <[email protected]> wrote:

> pyspark and R tests still need packaging, though.
>
> On Mon, Apr 4, 2016 at 9:59 PM, Marcelo Vanzin <[email protected]>
> wrote:
>
>> No, tests (except pyspark) should work without having to package anything
>> first.
>>
>> On Mon, Apr 4, 2016 at 9:58 PM, Koert Kuipers <[email protected]> wrote:
>> > Do I need to run "sbt package" before running tests?
>> >
>> > On Mon, Apr 4, 2016 at 11:00 PM, Marcelo Vanzin <[email protected]>
>> wrote:
>> >>
>> >> Hey all,
>> >>
>> >> We merged SPARK-13579 today, and if you're like me and have your
>> >> hands automatically type "sbt assembly" anytime you're building Spark,
>> >> that won't work anymore.
>> >>
>> >> You should now use "sbt package"; you'll still need "sbt assembly" if
>> >> you require one of the remaining assemblies (streaming connectors,
>> >> yarn shuffle service).
>> >>
>> >>
>> >> --
>> >> Marcelo
>> >>
>> >
>>
>> --
>> Marcelo
>>
>


-- 
Michael Gummelt
Software Engineer
Mesosphere
