Nice work, Dongjoon! Thanks also for the huge effort sorting out the
correctness issues.
On Tue, Feb 11, 2020 at 12:40 PM Wenchen Fan wrote:
> Great Job, Dongjoon!
>
> On Mon, Feb 10, 2020 at 4:18 PM Hyukjin Kwon wrote:
>
>> Thanks Dongjoon!
>>
>> On Sun, Feb 9, 2020 at 10:49 AM, Takeshi
Great Job, Dongjoon!
On Mon, Feb 10, 2020 at 4:18 PM Hyukjin Kwon wrote:
> Thanks Dongjoon!
>
> On Sun, Feb 9, 2020 at 10:49 AM, Takeshi Yamamuro wrote:
>
>> Happy to hear the release news!
>>
>> Bests,
>> Takeshi
>>
>> On Sun, Feb 9, 2020 at 10:28 AM Dongjoon Hyun wrote:
>>
>>> There was a typo
Thank you, Hyukjin.
The maintenance overhead only occurs when we add a new release.
And, we can prevent accidental upstream changes by avoiding 'latest' tags.
The overhead will be much smaller than our existing Dockerfile maintenance
(e.g. 'spark-rm')
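To illustrate the point about avoiding 'latest' tags, here is a minimal sketch; the repository and tag names below are made up for illustration and are not an actual Apache Docker repository:

```shell
# Sketch of pinning to an immutable release tag instead of a mutable
# 'latest' tag. The image name is hypothetical, for illustration only.

# Pinned: this reference does not change once the release tag is published.
SPARK_IMAGE="example.org/spark:3.0.0-hadoop2.7"

# Avoided: 'latest' is mutable, so an upstream push silently changes what
# downstream users pull.
# SPARK_IMAGE="example.org/spark:latest"

# docker pull "$SPARK_IMAGE"   # commented out: illustrative only
echo "$SPARK_IMAGE"
```

With pinned tags, a new release only adds a new Dockerfile/tag; published tags are never rewritten, which is why the per-release overhead stays small.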
Also, if we have a docker repository, we
My team and I are attempting to run Spark Standalone on IPv6-first
infrastructure. This requires that all RPC listeners bind IPv6 sockets, e.g.
`:::7077` instead of `127.0.0.1:7077`. Initial experimentation shows
that Spark 2.4.4 doesn't currently handle this scenario. Various host/bind
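For context, this is roughly the kind of configuration involved, a sketch rather than a verified working setup; the settings below are real Spark/JVM knobs, but whether 2.4.4 actually binds IPv6 sockets with them is exactly the open question:

```shell
# conf/spark-env.sh -- sketch of an IPv6-first Standalone configuration.
# SPARK_MASTER_OPTS, SPARK_WORKER_OPTS, SPARK_MASTER_HOST, and
# SPARK_LOCAL_IP are standard Spark settings; ::1 is just an example
# address standing in for a real routable IPv6 address.

# Ask the JVM to prefer IPv6 addresses when resolving hostnames.
export SPARK_MASTER_OPTS="-Djava.net.preferIPv6Addresses=true"
export SPARK_WORKER_OPTS="-Djava.net.preferIPv6Addresses=true"

# Bind the master and local RPC endpoints to an IPv6 address explicitly.
export SPARK_MASTER_HOST="::1"
export SPARK_LOCAL_IP="::1"
```

The symptom being debugged is that, even with settings like these, the RPC listeners end up on IPv4 sockets rather than `:::7077`-style IPv6 binds.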
Quick question: roughly how much overhead is required to maintain the
minimal version?
If that doesn't look like too much, I think it's fine to give it a shot.
On Sat, Feb 8, 2020 at 6:51 AM, Dongjoon Hyun wrote:
> Thank you, Sean, Jiaxin, Shane, and Tom, for the feedback.
>
> 1. For legal questions, please see the
Thanks Dongjoon!
On Sun, Feb 9, 2020 at 10:49 AM, Takeshi Yamamuro wrote:
> Happy to hear the release news!
>
> Bests,
> Takeshi
>
> On Sun, Feb 9, 2020 at 10:28 AM Dongjoon Hyun wrote:
>
>> There was a typo in one URL. The correct release note URL is here.
>>
>>
FWIW, I believe all tests are fixed in PySpark and SparkR with JDK 11. Let
me know if you hit any test failures.
On Sat, Feb 1, 2020 at 10:45 AM, Dongjoon Hyun wrote:
> Oops. I found this flaky test fails even in `Hadoop 2.7 with Hive 1.2`.
>
>
>