This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
     new 3ba71e9  [SPARK-27419][FOLLOWUP][DOCS] Add note about spark.executor.heartbeatInterval change to migration guide
3ba71e9 is described below

commit 3ba71e9bbf41ef8b5e5e7ff9c8edf95570dc67fe
Author: Sean Owen <sean.o...@databricks.com>
AuthorDate: Mon Apr 22 12:02:16 2019 +0800

    [SPARK-27419][FOLLOWUP][DOCS] Add note about spark.executor.heartbeatInterval change to migration guide
    
    Add note about spark.executor.heartbeatInterval change to migration guide
    See also https://github.com/apache/spark/pull/24329
    
    N/A
    
    Closes #24432 from srowen/SPARK-27419.2.
    
    Authored-by: Sean Owen <sean.o...@databricks.com>
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
    (cherry picked from commit d4a16f46f71021178bfc7dca511e47390986197d)
    Signed-off-by: Wenchen Fan <wenc...@databricks.com>
---
 docs/sql-migration-guide-upgrade.md | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/docs/sql-migration-guide-upgrade.md b/docs/sql-migration-guide-upgrade.md
index 8dae1b4..b703cb5 100644
--- a/docs/sql-migration-guide-upgrade.md
+++ b/docs/sql-migration-guide-upgrade.md
@@ -7,6 +7,14 @@ displayTitle: Spark SQL Upgrading Guide
 * Table of contents
 {:toc}
 
+## Upgrading from Spark SQL 2.4 to 2.4.1
+
+  - The value of `spark.executor.heartbeatInterval`, when specified without units like "30" rather than "30s", was
+    inconsistently interpreted as both seconds and milliseconds in Spark 2.4.0 in different parts of the code.
+    Unitless values are now consistently interpreted as milliseconds. Applications that set values like "30"
+    need to specify a value with units like "30s" now, to avoid being interpreted as milliseconds; otherwise,
+    the extremely short interval that results will likely cause applications to fail.
+
 ## Upgrading From Spark SQL 2.3 to 2.4
 
   - In Spark version 2.3 and earlier, the second parameter of the `array_contains` function is implicitly promoted to the element type of the first, array-typed parameter. This type promotion can be lossy and may cause `array_contains` to return a wrong result. This problem has been addressed in 2.4 by employing a safer type promotion mechanism. This can cause some changes in behavior, which are illustrated in the table below.
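
As a quick illustration of the heartbeat note added above: a minimal sketch, assuming a Scala application that configures the interval through SparkConf. The application name and the "30s" value are placeholders, not part of the commit.

    import org.apache.spark.SparkConf

    // Sketch only: set the heartbeat interval with an explicit unit so the
    // value cannot be reinterpreted. In 2.4.0 a unitless "30" was read as
    // seconds in some code paths and milliseconds in others; from 2.4.1 a
    // unitless value is consistently read as milliseconds.
    val conf = new SparkConf()
      .setAppName("heartbeat-interval-example") // placeholder name
      .set("spark.executor.heartbeatInterval", "30s") // "30" alone would now mean 30 ms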


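For the array_contains note that appears as context in the diff, here is a minimal sketch of the kind of query the promotion change affects; the exact result values come from the table in the full migration guide and are only paraphrased in the comments.

    import org.apache.spark.sql.SparkSession

    // Sketch only: comparing a double literal against an integer array.
    // In Spark 2.3 the double could be cast down to the int element type
    // (lossy), so a near-miss value might still report a match; in 2.4 the
    // safer promotion compares the values as doubles instead.
    val spark = SparkSession.builder()
      .appName("array-contains-example") // placeholder name
      .master("local[*]")
      .getOrCreate()
    spark.sql("SELECT array_contains(array(1), 1.34D)").show()
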
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
