IMO SparkR support makes sense to merge for 2.4, as long as the release
wranglers agree that local integration testing is sufficiently convincing.
Part of the intent here is to allow this to happen without Shane having to
reorganize his complex upgrade schedule and make it even more complicated.

On Wed, Aug 15, 2018 at 7:08 PM, Wenchen Fan <cloud0...@gmail.com> wrote:

> I'm also happy to see we have R support on k8s for Spark 2.4. I'll do the
> manual testing for it if we don't want to upgrade the OS now. If the Python
> support is also merged in this way, I think we can merge the R support PR
> too?
>
> On Thu, Aug 16, 2018 at 7:23 AM shane knapp <skn...@berkeley.edu> wrote:
>
>>
>>> What is the current purpose of these builds?
>>>
>> to be honest, i have absolutely no idea.  :)
>>
>> these were set up a long time ago, in a galaxy far far away, by someone
>> who is not me.
>>
>>
>>> - spark-docs seems to be building the docs, is that the only place
>>> where the docs build is tested?
>>>
>> i think so...
>>
>>
>>> In the last several releases we've moved away from using Jenkins jobs for
>>> preparing the packages, and the scripts have changed a lot to be
>>> friendlier to people running them locally (they even support docker
>>> now, and have flags to run "test" builds that don't require
>>> credentials such as GPG keys).
>>>
>> awesome++
>>
>>
>>> Perhaps we should think about revamping these jobs instead of keeping
>>> them as is.
>>>
>>
>> i fully support this.  which is exactly why i punted on even trying to
>> get them ported over to the ubuntu nodes.
>>
>> shane
>> --
>> Shane Knapp
>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>> https://rise.cs.berkeley.edu
>>
>
