[spark] branch branch-2.4 updated: [SPARK-32898][2.4][CORE] Fix wrong executorRunTime when task killed before real start

2020-09-21 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new e1e94ed  [SPARK-32898][2.4][CORE] Fix wrong executorRunTime when task 
killed before real start
e1e94ed is described below

commit e1e94ed4ef45ef81814f1b920bac0afa52ae06a2
Author: yi.wu 
AuthorDate: Mon Sep 21 23:20:18 2020 -0700

[SPARK-32898][2.4][CORE] Fix wrong executorRunTime when task killed before 
real start

### What changes were proposed in this pull request?

Only calculate the executorRunTime when taskStartTime > 0. Otherwise, set 
executorRunTime to 0.

### Why are the changes needed?

Bug fix.

It is possible for a task to be killed (e.g., by another successful attempt)
before it reaches "taskStartTime = System.currentTimeMillis()". In that case,
taskStartTime is still 0 because it has never actually been initialized, so
calculating System.currentTimeMillis() - taskStartTime yields a wrong (and
absurdly large) executorRunTime.
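
For illustration, here is a minimal, self-contained sketch (not part of the patch) of the arithmetic behind the fix; the object name and the local `taskStartTime` are made up stand-ins for the field in Executor.scala that a killed task may never get to initialize:

```scala
// Standalone illustration only, not Spark code.
object RunTimeGuard {
  def main(args: Array[String]): Unit = {
    val taskStartTime = 0L  // task killed before `taskStartTime = System.currentTimeMillis()` ran

    // Unguarded subtraction: a start time of 0 makes the "run time" equal to the
    // current epoch millis (roughly 50 years), which is obviously wrong.
    val wrongRunTime = System.currentTimeMillis() - taskStartTime

    // Guarded as in this patch: report 0 when the task never really started.
    val fixedRunTime =
      if (taskStartTime > 0) System.currentTimeMillis() - taskStartTime else 0L

    println(s"unguarded executorRunTime = $wrongRunTime ms, guarded = $fixedRunTime ms")
  }
}
```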

### Does this PR introduce _any_ user-facing change?

Yes, users will see the correct executorRunTime.

### How was this patch tested?

Passes existing tests.

Closes #29832 from Ngone51/backport-spark-3289.

Authored-by: yi.wu 
Signed-off-by: Dongjoon Hyun 
---
 core/src/main/scala/org/apache/spark/executor/Executor.scala | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/core/src/main/scala/org/apache/spark/executor/Executor.scala b/core/src/main/scala/org/apache/spark/executor/Executor.scala
index f7ff0b8..fe57b1c 100644
--- a/core/src/main/scala/org/apache/spark/executor/Executor.scala
+++ b/core/src/main/scala/org/apache/spark/executor/Executor.scala
@@ -337,7 +337,10 @@ private[spark] class Executor(
   private def collectAccumulatorsAndResetStatusOnFailure(taskStartTime: Long) = {
     // Report executor runtime and JVM gc time
     Option(task).foreach(t => {
-      t.metrics.setExecutorRunTime(System.currentTimeMillis() - taskStartTime)
+      t.metrics.setExecutorRunTime(
+        // SPARK-32898: it's possible that a task is killed when taskStartTime has the initial
+        // value(=0) still. In this case, the executorRunTime should be considered as 0.
+        if (taskStartTime > 0) System.currentTimeMillis() - taskStartTime else 0)
       t.metrics.setJvmGCTime(computeTotalGcTime() - startGCTime)
     })
 


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (3118c22 -> 790d9ef2d)

2020-09-21 Thread gurwls223
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 3118c22  [SPARK-32949][R][SQL] Add timestamp_seconds to SparkR
 add 790d9ef2d [SPARK-32955][DOCS] An item in the navigation bar in the 
WebUI has a wrong link

No new revisions were added by this update.

Summary of changes:
 docs/_layouts/global.html |  2 +-
 docs/api.md   | 27 ---
 2 files changed, 1 insertion(+), 28 deletions(-)
 delete mode 100644 docs/api.md


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (f03c035 -> 3118c22)

2020-09-21 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from f03c035  [SPARK-32951][SQL] Foldable propagation from Aggregate
 add 3118c22  [SPARK-32949][R][SQL] Add timestamp_seconds to SparkR

No new revisions were added by this update.

Summary of changes:
 R/pkg/NAMESPACE   |  1 +
 R/pkg/R/functions.R   | 15 +++
 R/pkg/R/generics.R|  4 
 R/pkg/tests/fulltests/test_sparkSQL.R |  1 +
 4 files changed, 21 insertions(+)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (5440ea8 -> f03c035)

2020-09-21 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 5440ea8  [SPARK-32312][DOC][FOLLOWUP] Fix the minimum version of 
PyArrow in the installation guide
 add f03c035  [SPARK-32951][SQL] Foldable propagation from Aggregate

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/catalyst/optimizer/expressions.scala |  19 +-
 .../optimizer/FoldablePropagationSuite.scala   |  12 +
 .../approved-plans-v1_4/q14a.sf100/explain.txt |  50 +-
 .../approved-plans-v1_4/q14a.sf100/simplified.txt  |  12 +-
 .../approved-plans-v1_4/q14a/explain.txt   |  50 +-
 .../approved-plans-v1_4/q14a/simplified.txt|  12 +-
 .../approved-plans-v1_4/q14b.sf100/explain.txt |  30 +-
 .../approved-plans-v1_4/q14b.sf100/simplified.txt  |  10 +-
 .../approved-plans-v1_4/q14b/explain.txt   |  30 +-
 .../approved-plans-v1_4/q14b/simplified.txt|  10 +-
 .../approved-plans-v1_4/q41.sf100/explain.txt  |  12 +-
 .../approved-plans-v1_4/q41.sf100/simplified.txt   |   4 +-
 .../approved-plans-v1_4/q41/explain.txt|  12 +-
 .../approved-plans-v1_4/q41/simplified.txt |   4 +-
 .../approved-plans-v2_7/q14.sf100/explain.txt  |  30 +-
 .../approved-plans-v2_7/q14.sf100/simplified.txt   |  10 +-
 .../approved-plans-v2_7/q14/explain.txt|  30 +-
 .../approved-plans-v2_7/q14/simplified.txt |  10 +-
 .../approved-plans-v2_7/q14a.sf100/explain.txt | 530 ++---
 .../approved-plans-v2_7/q14a.sf100/simplified.txt  |  60 +--
 .../approved-plans-v2_7/q14a/explain.txt   | 530 ++---
 .../approved-plans-v2_7/q14a/simplified.txt|  60 +--
 22 files changed, 775 insertions(+), 752 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (d01594e -> 5440ea8)

2020-09-21 Thread gurwls223
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from d01594e  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
 add 5440ea8  [SPARK-32312][DOC][FOLLOWUP] Fix the minimum version of 
PyArrow in the installation guide

No new revisions were added by this update.

Summary of changes:
 python/docs/source/getting_started/install.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-3.0 updated: [SPARK-32718][SQL][3.0] Remove unnecessary keywords for interval units

2020-09-21 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new b27  [SPARK-32718][SQL][3.0] Remove unnecessary keywords for 
interval units
b27 is described below

commit b27be46572feebf549a2314b6c9ea0d39c8a
Author: Wenchen Fan 
AuthorDate: Mon Sep 21 14:06:54 2020 -0700

[SPARK-32718][SQL][3.0] Remove unnecessary keywords for interval units

Backport https://github.com/apache/spark/pull/29560 to 3.0, as it is effectively
a bug fix for ANSI mode: because the interval-unit names are treated as reserved
keywords there, users cannot call the `year`, `month`, etc. functions under ANSI mode.
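
For illustration, a minimal sketch (not part of the patch) of the kind of query this unblocks, assuming a local SparkSession and the `spark.sql.ansi.enabled` flag; the object and app names are made up for the example:

```scala
import org.apache.spark.sql.SparkSession

object AnsiYearExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("ansi-interval-unit-functions")
      .getOrCreate()

    // Turn on ANSI mode, where the interval-unit names used to be reserved keywords.
    spark.conf.set("spark.sql.ansi.enabled", "true")

    // With this backport applied, the function call parses and returns 2020.
    spark.sql("SELECT year(date'2020-09-21') AS y").show()

    spark.stop()
  }
}
```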

Closes #29823 from cloud-fan/backport.

Authored-by: Wenchen Fan 
Signed-off-by: Dongjoon Hyun 
---
 docs/sql-ref-ansi-compliance.md|  6 --
 .../apache/spark/sql/catalyst/parser/SqlBase.g4| 26 +
 .../spark/sql/catalyst/parser/AstBuilder.scala |  2 +-
 .../test/resources/sql-tests/inputs/interval.sql   |  4 ++
 .../sql-tests/results/ansi/datetime.sql.out| 10 +---
 .../sql-tests/results/ansi/interval.sql.out| 64 +++---
 .../resources/sql-tests/results/interval.sql.out   | 42 +-
 .../org/apache/spark/sql/TPCDSQuerySuite.scala |  6 ++
 8 files changed, 100 insertions(+), 60 deletions(-)

diff --git a/docs/sql-ref-ansi-compliance.md b/docs/sql-ref-ansi-compliance.md
index 1936161..948a36e 100644
--- a/docs/sql-ref-ansi-compliance.md
+++ b/docs/sql-ref-ansi-compliance.md
@@ -181,7 +181,6 @@ Below is a list of all the keywords in Spark SQL.
 |DATA|non-reserved|non-reserved|non-reserved|
 |DATABASE|non-reserved|non-reserved|non-reserved|
 |DATABASES|non-reserved|non-reserved|non-reserved|
-|DAY|reserved|non-reserved|reserved|
 |DBPROPERTIES|non-reserved|non-reserved|non-reserved|
 |DEFINED|non-reserved|non-reserved|non-reserved|
 |DELETE|non-reserved|non-reserved|reserved|
@@ -227,7 +226,6 @@ Below is a list of all the keywords in Spark SQL.
 |GROUP|reserved|non-reserved|reserved|
 |GROUPING|non-reserved|non-reserved|reserved|
 |HAVING|reserved|non-reserved|reserved|
-|HOUR|reserved|non-reserved|reserved|
 |IF|non-reserved|non-reserved|not a keyword|
 |IGNORE|non-reserved|non-reserved|non-reserved|
 |IMPORT|non-reserved|non-reserved|non-reserved|
@@ -265,8 +263,6 @@ Below is a list of all the keywords in Spark SQL.
 |MATCHED|non-reserved|non-reserved|non-reserved|
 |MERGE|non-reserved|non-reserved|non-reserved|
 |MINUS|non-reserved|strict-non-reserved|non-reserved|
-|MINUTE|reserved|non-reserved|reserved|
-|MONTH|reserved|non-reserved|reserved|
 |MSCK|non-reserved|non-reserved|non-reserved|
 |NAMESPACE|non-reserved|non-reserved|non-reserved|
 |NAMESPACES|non-reserved|non-reserved|non-reserved|
@@ -326,7 +322,6 @@ Below is a list of all the keywords in Spark SQL.
 |ROWS|non-reserved|non-reserved|reserved|
 |SCHEMA|non-reserved|non-reserved|non-reserved|
 |SCHEMAS|non-reserved|non-reserved|not a keyword|
-|SECOND|reserved|non-reserved|reserved|
 |SELECT|reserved|non-reserved|reserved|
 |SEMI|non-reserved|strict-non-reserved|non-reserved|
 |SEPARATED|non-reserved|non-reserved|non-reserved|
@@ -384,4 +379,3 @@ Below is a list of all the keywords in Spark SQL.
 |WHERE|reserved|non-reserved|reserved|
 |WINDOW|non-reserved|non-reserved|reserved|
 |WITH|reserved|non-reserved|reserved|
-|YEAR|reserved|non-reserved|reserved|
diff --git a/sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 b/sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4
index df6ff9f..922ff10 100644
--- a/sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4
+++ b/sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4
@@ -818,7 +818,7 @@ errorCapturingMultiUnitsInterval
 ;
 
 multiUnitsInterval
-: (intervalValue intervalUnit)+
+: (intervalValue unit+=identifier)+
 ;
 
 errorCapturingUnitToUnitInterval
@@ -826,7 +826,7 @@ errorCapturingUnitToUnitInterval
 ;
 
 unitToUnitInterval
-: value=intervalValue from=intervalUnit TO to=intervalUnit
+: value=intervalValue from=identifier TO to=identifier
 ;
 
 intervalValue
@@ -834,16 +834,6 @@ intervalValue
 | STRING
 ;
 
-intervalUnit
-: DAY
-| HOUR
-| MINUTE
-| MONTH
-| SECOND
-| YEAR
-| identifier
-;
-
 colPosition
 : position=FIRST | position=AFTER afterCol=errorCapturingIdentifier
 ;
@@ -1251,7 +1241,6 @@ nonReserved
 | DATA
 | DATABASE
 | DATABASES
-| DAY
 | DBPROPERTIES
 | DEFINED
 | DELETE
@@ -1295,7 +1284,6 @@ nonReserved
 | GROUP
 | GROUPING
 | HAVING
-| HOUR
 | IF
 | IGNORE
 | IMPORT
@@ -1328,8 +1316,6 @@ nonReserved
 | MAP
 | MATCHED
 | MERGE
-| MINUTE
-| MONTH
 | MSCK
 | NAMESPACE
 | NAMESPACES
@@ -1384,7 +1370

[spark] branch branch-2.4 updated: [HOTFIX][2.4] Revert SPARK-32886

2020-09-21 Thread sarutak
This is an automated email from the ASF dual-hosted git repository.

sarutak pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new 7d935ff  [HOTFIX][2.4] Revert SPARK-32886
7d935ff is described below

commit 7d935ffb4792e0c72b1ddc564815863341a103be
Author: Kousuke Saruta 
AuthorDate: Tue Sep 22 01:58:33 2020 +0900

[HOTFIX][2.4] Revert SPARK-32886

### What changes were proposed in this pull request?

This PR reverts SPARK-32886 (#29757) for branch-2.4.
That change needs `appBasePath` in `webui.js`, but it's absent in
`branch-2.4`.

Closes #29825 from sarutak/hotfix-for-SPARK-32886-2.4.

Authored-by: Kousuke Saruta 
Signed-off-by: Kousuke Saruta 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 --
 1 file changed, 20 insertions(+), 33 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 220b76a..5be8cff 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,31 +42,26 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
-  function getIdForJobEntry(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-return jobId;
-  }
-
-  function getSelectorForJobEntry(jobId) {
-return "#job-" + jobId;
-  }
-
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
+  var getSelectorForJobEntry = function(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+   return "#job-" + jobId;
+  };
+
   $(this).click(function() {
-var jobId = getIdForJobEntry(this);
-var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
-window.location.href = jobPagePath;
+var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
+  window.location.href = jobPagePath
   });
 
   $(this).hover(
 function() {
-  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
+  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -130,34 +125,26 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
-  function getStageIdAndAttemptForStageEntry(baseElem) {
-var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
-var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
-return stageIdAndAttempt;
-  }
-
-  function getSelectorForStageEntry(stageIdAndAttempt) {
-return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
-  }
-
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline-object").each(function() {
+  var getSelectorForStageEntry = function(baseElem) {
+var stageIdText = 
$($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  };
+
   $(this).click(function() {
-var stageIdAndAttempt = getStageIdAndAttemptForStageEntry(this);
-var stagePagePath = uiRoot + appBasePath +
-  "/stages/stage/?id=" + stageIdAndAttempt[0] + "&attempt=" + 
stageIdAndAttempt[1];
-window.location.href = stagePagePath;
+var stagePagePath = 
$(getSelectorForStageEntry(this)).find("a.name-link").attr("href")
+window.location.href = stagePagePath
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForStageEntry(getStageIdAndAttemptForStageEntry(this)))
-.addClass("corresponding-item-hover");
+  
$(getSelectorForStageEntry(this)).addClass("corresponding-item-hover");
   $($(this).find("div.job-timeline-content")[0]).tooltip("show");
 },

[spark] branch branch-2.4 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new 2516128  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
2516128 is described below

commit 2516128e466c489effc20e72d06c26a2d2f7faed
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of the jobs page. For
example, the first job in the page below is not in the table below, as the job
count (116) exceeds the page size (100). When clicking its item in "Event
Timeline", the page is redirected to ".../jobs/undefined", which is wrong.
Links in "Event Timeline" should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch branch-3.0 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new 0a4b668  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
0a4b668 is described below

commit 0a4b668b62cf198358897a145f7d844614cd1931
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of the jobs page. For
example, the first job in the page below is not in the table below, as the job
count (116) exceeds the page size (100). When clicking its item in "Event
Timeline", the page is redirected to ".../jobs/undefined", which is wrong.
Links in "Event Timeline" should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch master updated (c336ddf -> d01594e)

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from c336ddf  [SPARK-32867][SQL] When explain, HiveTableRelation show 
limited message
 add d01594e  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-2.4 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new 2516128  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
2516128 is described below

commit 2516128e466c489effc20e72d06c26a2d2f7faed
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of jobs page. for 
example, the first job in below page is not in table below, as job count(116) 
exceeds page size(100). When clicking it's item in "Event Timeline", page is 
redirected to ".../jobs/undefined", which is wrong. Links in "Event Timeline" 
should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch master updated (c336ddf -> d01594e)

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from c336ddf  [SPARK-32867][SQL] When explain, HiveTableRelation show 
limited message
 add d01594e  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-2.4 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new 2516128  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
2516128 is described below

commit 2516128e466c489effc20e72d06c26a2d2f7faed
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of jobs page. for 
example, the first job in below page is not in table below, as job count(116) 
exceeds page size(100). When clicking it's item in "Event Timeline", page is 
redirected to ".../jobs/undefined", which is wrong. Links in "Event Timeline" 
should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch branch-3.0 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new 0a4b668  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
0a4b668 is described below

commit 0a4b668b62cf198358897a145f7d844614cd1931
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of jobs page. for 
example, the first job in below page is not in table below, as job count(116) 
exceeds page size(100). When clicking it's item in "Event Timeline", page is 
redirected to ".../jobs/undefined", which is wrong. Links in "Event Timeline" 
should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch master updated (c336ddf -> d01594e)

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from c336ddf  [SPARK-32867][SQL] When explain, HiveTableRelation show 
limited message
 add d01594e  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-2.4 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
 new 2516128  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
2516128 is described below

commit 2516128e466c489effc20e72d06c26a2d2f7faed
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of jobs page. for 
example, the first job in below page is not in table below, as job count(116) 
exceeds page size(100). When clicking it's item in "Event Timeline", page is 
redirected to ".../jobs/undefined", which is wrong. Links in "Event Timeline" 
should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch branch-3.0 updated: [SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new 0a4b668  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view
0a4b668 is described below

commit 0a4b668b62cf198358897a145f7d844614cd1931
Author: Zhen Li 
AuthorDate: Mon Sep 21 09:05:40 2020 -0500

[SPARK-32886][WEBUI] fix 'undefined' link in event timeline view

### What changes were proposed in this pull request?

Fix ".../jobs/undefined" link from "Event Timeline" in jobs page. Job page 
link in "Event Timeline" view is constructed by fetching job page link defined 
in job list below. when job count exceeds page size of job table, only links of 
jobs in job table can be fetched from page. Other jobs' link would be 
'undefined', and links of them in "Event Timeline" are broken, they are 
redirected to some wired URL like ".../jobs/undefined". This PR is fixing this 
wrong link issue. With this PR, jo [...]

### Why are the changes needed?

Wrong link (".../jobs/undefined") in "Event Timeline" of jobs page. for 
example, the first job in below page is not in table below, as job count(116) 
exceeds page size(100). When clicking it's item in "Event Timeline", page is 
redirected to ".../jobs/undefined", which is wrong. Links in "Event Timeline" 
should always be correct.

![undefinedlink](https://user-images.githubusercontent.com/10524738/93184779-83fa6d80-f6f1-11ea-8a80-1a304ca9cbb2.JPG)

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Manually tested.

Closes #29757 from zhli1142015/fix-link-event-timeline-view.

Authored-by: Zhen Li 
Signed-off-by: Sean Owen 
(cherry picked from commit d01594e8d186e63a6c3ce361e756565e830d5237)
Signed-off-by: Sean Owen 
---
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)

diff --git 
a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js 
b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
index 5be8cff..220b76a 100644
--- a/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
+++ b/core/src/main/resources/org/apache/spark/ui/static/timeline-view.js
@@ -42,26 +42,31 @@ function drawApplicationTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#application-timeline-zoom-lock", applicationTimeline);
   setupExecutorEventAction();
 
+  function getIdForJobEntry(baseElem) {
+var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
+var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
+return jobId;
+  }
+
+  function getSelectorForJobEntry(jobId) {
+return "#job-" + jobId;
+  }
+
   function setupJobEventAction() {
 $(".vis-item.vis-range.job.application-timeline-object").each(function() {
-  var getSelectorForJobEntry = function(baseElem) {
-var jobIdText = 
$($(baseElem).find(".application-timeline-content")[0]).text();
-var jobId = jobIdText.match("\\(Job (\\d+)\\)$")[1];
-   return "#job-" + jobId;
-  };
-
   $(this).click(function() {
-var jobPagePath = 
$(getSelectorForJobEntry(this)).find("a.name-link").attr("href");
-  window.location.href = jobPagePath
+var jobId = getIdForJobEntry(this);
+var jobPagePath = uiRoot + appBasePath + "/jobs/job/?id=" + jobId;
+window.location.href = jobPagePath;
   });
 
   $(this).hover(
 function() {
-  $(getSelectorForJobEntry(this)).addClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).addClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("show");
 },
 function() {
-  
$(getSelectorForJobEntry(this)).removeClass("corresponding-item-hover");
+  
$(getSelectorForJobEntry(getIdForJobEntry(this))).removeClass("corresponding-item-hover");
   
$($(this).find("div.application-timeline-content")[0]).tooltip("hide");
 }
   );
@@ -125,26 +130,34 @@ function drawJobTimeline(groupArray, eventObjArray, 
startTime, offset) {
   setupZoomable("#job-timeline-zoom-lock", jobTimeline);
   setupExecutorEventAction();
 
+  function getStageIdAndAttemptForStageEntry(baseElem) {
+var stageIdText = $($(baseElem).find(".job-timeline-content")[0]).text();
+var stageIdAndAttempt = stageIdText.match("\\(Stage 
(\\d+\\.\\d+)\\)$")[1].split(".");
+return stageIdAndAttempt;
+  }
+
+  function getSelectorForStageEntry(stageIdAndAttempt) {
+return "#stage-" + stageIdAndAttempt[0] + "-" + stageIdAndAttempt[1];
+  }
+
   function setupStageEventAction() {
 $(".vis-item.vis-range.stage.job-timeline

[spark] branch master updated (c336ddf -> d01594e)

2020-09-21 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from c336ddf  [SPARK-32867][SQL] When explain, HiveTableRelation show 
limited message
 add d01594e  [SPARK-32886][WEBUI] fix 'undefined' link in event timeline 
view

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/ui/static/timeline-view.js| 53 ++
 1 file changed, 33 insertions(+), 20 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (1ad1f71 -> c336ddf)

2020-09-21 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 1ad1f71  [SPARK-32946][R][SQL] Add withColumn to SparkR
 add c336ddf  [SPARK-32867][SQL] When explain, HiveTableRelation show limited message

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/catalyst/catalog/interface.scala | 41 ++-
 .../sql/hive/execution/HiveTableScanSuite.scala| 61 ++
 2 files changed, 100 insertions(+), 2 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (0c66813 -> 1ad1f71)

2020-09-21 Thread gurwls223
This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 0c66813  Revert "[SPARK-32850][CORE] Simplify the RPC message flow of decommission"
 add 1ad1f71  [SPARK-32946][R][SQL] Add withColumn to SparkR

No new revisions were added by this update.

Summary of changes:
 R/pkg/NAMESPACE   |  1 +
 R/pkg/R/column.R  | 31 +++
 R/pkg/R/generics.R|  3 +++
 R/pkg/tests/fulltests/test_sparkSQL.R | 13 +
 4 files changed, 48 insertions(+)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org


