+1 from me. There is little change from 2.4.2 anyway, except for the
important change to the build script, which should build PySpark with the
Scala 2.11 jars. I verified that the package contains the _2.11 Spark
jars, but have a look!
I'm still getting this weird error from the Kafka module when
It turned out that I was unintentionally shipping multiple copies of the Hadoop
config to every partition in an RDD. >.< I was able to debug this by setting a
breakpoint on the warning message and inspecting the partition object itself.
Cheers,
Andrew
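The failure mode Andrew describes (a large Hadoop config captured in a task closure, so a full copy ships with every partition) can be sketched without Spark at all. The dict below is a hypothetical stand-in for a Hadoop Configuration, and the "broadcast" handle is simulated with a plain integer; this is an illustration of the serialization cost, not Andrew's actual code:

```python
import pickle

# Hypothetical stand-in for a Hadoop Configuration: a large dict of properties.
hadoop_conf = {f"prop{i}": "value-" + "x" * 100 for i in range(1000)}

num_partitions = 16

# Naive approach: the config is captured in each task's closure, so every
# serialized task payload carries its own full copy of the config.
naive_bytes = sum(
    len(pickle.dumps({"conf": hadoop_conf, "partition": p}))
    for p in range(num_partitions)
)

# Broadcast-style approach: ship the config once per executor; each task
# payload carries only a small handle (an integer id here) referring to it.
broadcast_bytes = len(pickle.dumps(hadoop_conf)) + sum(
    len(pickle.dumps({"conf_id": 0, "partition": p}))
    for p in range(num_partitions)
)

print(f"naive: {naive_bytes} bytes, broadcast-style: {broadcast_bytes} bytes")
```

In real PySpark the fix is the same idea: wrap the config in `SparkContext.broadcast(...)` and reference the broadcast variable inside the closure, rather than the config object itself.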
From: Russell Spitzer
Date: Thursday,
+1 (non-binding)
> On May 1, 2019, at 10:16 AM, Michael Heuer wrote:
>
> +1 (non-binding)
+1 (non-binding)
The binary release files are correctly built with Scala 2.11.12.
Thank you,
michael
> On May 1, 2019, at 9:39 AM, Xiao Li wrote:
>
> Please vote on releasing the following candidate as Apache Spark version
> 2.4.3.
>
> The vote is open until May 5th PST and passes if a majority of +1 PMC votes
> are cast, with a minimum of 3 +1 votes.
Just my 2c
If there is a known security issue, we should fix it rather than waiting for a
black hat, or worse, to discover whether it actually affects Spark.
I don’t think any of us want to see Spark in the news for this reason.
From: Sean Owen
Please vote on releasing the following candidate as Apache Spark version
2.4.3.
The vote is open until May 5th PST and passes if a majority of +1 PMC votes
are cast, with a minimum of 3 +1 votes.
[ ] +1 Release this package as Apache Spark 2.4.3
[ ] -1 Do not release this package because ...