This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/spark-website.git
The following commit(s) were added to refs/heads/asf-site by this push:
     new 09acf60  Retroactively add some JIRAs marked 'release-notes' to 2.4.1 release announcement
09acf60 is described below

commit 09acf602bbde857fb666dcdb63de080d2787eb0d
Author: Sean Owen <sean.o...@databricks.com>
AuthorDate: Wed Apr 24 07:51:44 2019 -0700

    Retroactively add some JIRAs marked 'release-notes' to 2.4.1 release announcement

    Author: Sean Owen <sean.o...@databricks.com>

    Closes #196 from srowen/241ReleaseNotes.
---
 releases/_posts/2019-03-31-spark-release-2-4-1.md | 16 ++++++++++++-
 site/releases/spark-release-2-4-1.html            | 28 ++++++++++++++++++++++-
 2 files changed, 42 insertions(+), 2 deletions(-)

diff --git a/releases/_posts/2019-03-31-spark-release-2-4-1.md b/releases/_posts/2019-03-31-spark-release-2-4-1.md
index da4a015..d3a5112 100644
--- a/releases/_posts/2019-03-31-spark-release-2-4-1.md
+++ b/releases/_posts/2019-03-31-spark-release-2-4-1.md
@@ -17,9 +17,23 @@ In Apache Spark 2.4.1, Scala 2.12 support is GA, and it's no longer experimental
 
 You can consult JIRA for the [detailed changes](https://s.apache.org/spark-2.4.1).
 
+### Core and Spark SQL
+
+ - **Performance and stability**
+    - [[SPARK-26266]](https://issues.apache.org/jira/browse/SPARK-26266) Update to Scala 2.12.8 (requires recent Java 8 versions)
+    - [[SPARK-26188]](https://issues.apache.org/jira/browse/SPARK-26188) Spark 2.4.0 Partitioning behavior breaks backwards compatibility
+
+ - **Other notable changes**
+    - [[SPARK-27198]](https://issues.apache.org/jira/browse/SPARK-27198) Heartbeat interval mismatch in driver and executor
+
+### Windows
+
+ - **Performance and stability**
+    - [[SPARK-26080]](https://issues.apache.org/jira/browse/SPARK-26080) Unable to run worker.py on Windows
+
 ### Known issue
 
 - **CORE**
-    - SPARK-27198: if `spark.executor.heartbeatInterval` is less than one second, it will always be set to zero resulting timeout.
+    - [[SPARK-27419]](https://issues.apache.org/jira/browse/SPARK-27419): if `spark.executor.heartbeatInterval` is less than one second, it will always be set to zero resulting timeout.
 
 We would like to acknowledge all community members for contributing patches to this release.

diff --git a/site/releases/spark-release-2-4-1.html b/site/releases/spark-release-2-4-1.html
index ab89dd1..d281831 100644
--- a/site/releases/spark-release-2-4-1.html
+++ b/site/releases/spark-release-2-4-1.html
@@ -209,12 +209,38 @@
 
 <p>You can consult JIRA for the <a href="https://s.apache.org/spark-2.4.1">detailed changes</a>.</p>
 
+<h3 id="core-and-spark-sql">Core and Spark SQL</h3>
+
+<ul>
+  <li><strong>Performance and stability</strong>
+    <ul>
+      <li><a href="https://issues.apache.org/jira/browse/SPARK-26266">[SPARK-26266]</a> Update to Scala 2.12.8 (requires recent Java 8 versions)</li>
+      <li><a href="https://issues.apache.org/jira/browse/SPARK-26188">[SPARK-26188]</a> Spark 2.4.0 Partitioning behavior breaks backwards compatibility</li>
+    </ul>
+  </li>
+  <li><strong>Other notable changes</strong>
+    <ul>
+      <li><a href="https://issues.apache.org/jira/browse/SPARK-27198">[SPARK-27198]</a> Heartbeat interval mismatch in driver and executor</li>
+    </ul>
+  </li>
+</ul>
+
+<h3 id="windows">Windows</h3>
+
+<ul>
+  <li><strong>Performance and stability</strong>
+    <ul>
+      <li><a href="https://issues.apache.org/jira/browse/SPARK-26080">[SPARK-26080]</a> Unable to run worker.py on Windows</li>
+    </ul>
+  </li>
+</ul>
+
 <h3 id="known-issue">Known issue</h3>
 
 <ul>
   <li><strong>CORE</strong>
     <ul>
-      <li>SPARK-27198: if <code>spark.executor.heartbeatInterval</code> is less than one second, it will always be set to zero resulting timeout.</li>
+      <li><a href="https://issues.apache.org/jira/browse/SPARK-27419">[SPARK-27419]</a>: if <code>spark.executor.heartbeatInterval</code> is less than one second, it will always be set to zero resulting timeout.</li>
     </ul>
   </li>
 </ul>
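For readers hit by the SPARK-27419 known issue documented above, the practical workaround is simply to keep `spark.executor.heartbeatInterval` at one second or more so the sub-second truncation never occurs. A minimal `spark-defaults.conf` sketch (the values shown are Spark's documented defaults, not part of this commit):

```
# Workaround sketch for SPARK-27419 on Spark 2.4.1: intervals under 1s are
# truncated to zero, causing executor heartbeat timeouts. Keep it >= 1s.
spark.executor.heartbeatInterval   10s

# The heartbeat interval should stay well below the network timeout,
# or executors will be marked lost before a heartbeat is even due.
spark.network.timeout              120s
```

The same settings can be passed per job via `spark-submit --conf spark.executor.heartbeatInterval=10s`.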
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org