Repository: spark
Updated Branches:
refs/heads/master 85383d29e -> 6a064ba8f
[SPARK-26141] Enable custom metrics implementation in shuffle write
## What changes were proposed in this pull request?
This is the write-side counterpart to https://github.com/apache/spark/pull/23105
## How was…
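The preview is cut off, but the idea of a pluggable shuffle-write metrics hook can be sketched as a small interface plus a custom implementation. The names below (`ShuffleWriteMetricsReporter`, `inc_bytes_written`, `inc_records_written`, `write_partition`) are illustrative stand-ins, not Spark's actual API:

```python
from abc import ABC, abstractmethod

class ShuffleWriteMetricsReporter(ABC):
    """Toy stand-in for a pluggable shuffle-write metrics interface."""

    @abstractmethod
    def inc_bytes_written(self, n: int) -> None: ...

    @abstractmethod
    def inc_records_written(self, n: int) -> None: ...

class CountingReporter(ShuffleWriteMetricsReporter):
    """A custom implementation that simply accumulates totals."""

    def __init__(self) -> None:
        self.bytes_written = 0
        self.records_written = 0

    def inc_bytes_written(self, n: int) -> None:
        self.bytes_written += n

    def inc_records_written(self, n: int) -> None:
        self.records_written += n

def write_partition(records, reporter):
    """A shuffle writer reports metrics as it serializes each record."""
    for rec in records:
        data = rec.encode("utf-8")
        reporter.inc_bytes_written(len(data))
        reporter.inc_records_written(1)

reporter = CountingReporter()
write_partition(["a", "bb", "ccc"], reporter)
# reporter.records_written == 3, reporter.bytes_written == 6
```

Because the writer only sees the abstract interface, different engines (or tests) can plug in their own reporter without touching the write path.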
Repository: spark
Updated Branches:
refs/heads/master 1c487f7d1 -> 85383d29e
[SPARK-25860][SPARK-26107][FOLLOW-UP] Rule ReplaceNullWithFalseInPredicate
## What changes were proposed in this pull request?
Based on https://github.com/apache/spark/pull/22857 and…
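The rule named in the subject rests on a simple observation: in a filter or join condition, a predicate that evaluates to NULL drops the row exactly as FALSE would, so NULL literals in that context can be rewritten to FALSE. A minimal sketch of such a rewrite over a toy expression tree (the representation here is invented for illustration):

```python
NULL = object()  # stands in for a SQL NULL literal

def replace_null_with_false(expr):
    """Rewrite NULL to False inside a boolean predicate tree.

    Safe only in contexts (WHERE/JOIN conditions) where a NULL result
    discards the row just as FALSE does."""
    if expr is NULL:
        return False
    if isinstance(expr, tuple):  # ("AND" | "OR", left, right)
        op, left, right = expr
        return (op, replace_null_with_false(left), replace_null_with_false(right))
    return expr

pred = ("AND", NULL, ("OR", True, NULL))
rewritten = replace_null_with_false(pred)
# rewritten == ("AND", False, ("OR", True, False))
```

After the rewrite, downstream simplifications can fold the constant FALSE further, which is what makes the rule worthwhile in an optimizer.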
Author: pwendell
Date: Tue Nov 27 04:20:03 2018
New Revision: 31107
Log:
Apache Spark 3.0.0-SNAPSHOT-2018_11_26_20_08-c995e07 docs
[This commit notification would consist of 1756 parts,
which exceeds the limit of 50, so it was shortened to the summary.]
Repository: spark
Updated Branches:
refs/heads/master c995e0737 -> 1c487f7d1
[SPARK-24762][SQL][FOLLOWUP] Enable Option of Product encoders
## What changes were proposed in this pull request?
This is a follow-up of #21732. This patch inlines the `isOptionType` method.
## How was this patch…
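The feature being followed up, encoding an `Option` of a Product type, maps naturally onto a nullable struct: `Some(product)` becomes a struct value and `None` becomes a SQL NULL. A hedged Python analogue (the `Point` class and `encode` helper are hypothetical, standing in for a Scala case class and its encoder):

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class Point:  # analogue of a Product type: a case class with named fields
    x: int
    y: int

def encode(value: Optional[Point]):
    """Encode Some(product) as a struct (here a dict) and None as SQL NULL."""
    return None if value is None else asdict(value)

encode(Point(1, 2))  # {'x': 1, 'y': 2}
encode(None)         # None
```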
Repository: spark
Updated Branches:
refs/heads/master 9deaa726e -> c995e0737
[SPARK-26140] followup: rename ShuffleMetricsReporter
## What changes were proposed in this pull request?
In https://github.com/apache/spark/pull/23105, due to working on two parallel
PRs at once, I made the mistake…
Author: pwendell
Date: Mon Nov 26 23:58:09 2018
New Revision: 31105
Log:
Apache Spark 3.0.0-SNAPSHOT-2018_11_26_15_46-9deaa72 docs
[This commit notification would consist of 1756 parts,
which exceeds the limit of 50, so it was shortened to the summary.]
Repository: spark
Updated Branches:
refs/heads/master 6f1a1c124 -> 9deaa726e
[INFRA] Close stale PR.
Closes #23107
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/9deaa726
Tree:
Repository: spark
Updated Branches:
refs/heads/master fbf62b710 -> 6f1a1c124
[SPARK-25451][HOTFIX] Call stage.attemptNumber instead of attemptId.
Closes #23149 from vanzin/SPARK-25451.hotfix.
Authored-by: Marcelo Vanzin
Signed-off-by: Marcelo Vanzin
Project:
Author: pwendell
Date: Mon Nov 26 21:37:37 2018
New Revision: 31101
Log:
Apache Spark 2.4.1-SNAPSHOT-2018_11_26_13_23-9b2b0cf docs
[This commit notification would consist of 1476 parts,
which exceeds the limit of 50, so it was shortened to the summary.]
Repository: spark
Updated Branches:
refs/heads/branch-2.4 c379611ee -> 9b2b0cf84
[SPARK-25451][SPARK-26100][CORE] Aggregated metrics table doesn't show the
right number of the total tasks
Total tasks in the aggregated table and the tasks table sometimes do not match
in the Web UI.
We…
Repository: spark
Updated Branches:
refs/heads/master 76ef02e49 -> fbf62b710
[SPARK-25451][SPARK-26100][CORE] Aggregated metrics table doesn't show the
right number of the total tasks
Total tasks in the aggregated table and the tasks table sometimes do not match
in the Web UI.
We need…
Repository: spark
Updated Branches:
refs/heads/master 3df307aa5 -> 76ef02e49
[SPARK-21809] Change Stage Page to use datatables to support sorting columns
and searching
Support column sorting, pagination, and search for the Stage Page using jQuery
DataTables and the REST API. Before this commit, the Stage page generated a
hard-coded HTML table that did not support search.
Author: pwendell
Date: Mon Nov 26 19:14:49 2018
New Revision: 31096
Log:
Apache Spark 3.0.0-SNAPSHOT-2018_11_26_11_02-2512a1d docs
[This commit notification would consist of 1756 parts,
which exceeds the limit of 50, so it was shortened to the summary.]
Repository: spark
Updated Branches:
refs/heads/master 2512a1d42 -> 3df307aa5
[SPARK-25960][K8S] Support subpath mounting with Kubernetes
## What changes were proposed in this pull request?
This PR adds configurations to use subpaths with Spark on k8s. Subpaths…
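Based on Spark's existing Kubernetes volume-mount configuration pattern, a subpath option would plausibly slot in alongside the mount path, letting a pod mount only one subdirectory of a volume. The exact key names below (in particular `mount.subPath` and the volume name `data`) are assumptions for illustration; check the Spark on Kubernetes documentation for the released option names:

```properties
# Hypothetical config fragment: mount only the "checkpoints" subdirectory
# of a PersistentVolumeClaim into the driver pod.
spark.kubernetes.driver.volumes.persistentVolumeClaim.data.mount.path=/opt/spark/work
spark.kubernetes.driver.volumes.persistentVolumeClaim.data.mount.subPath=checkpoints
spark.kubernetes.driver.volumes.persistentVolumeClaim.data.options.claimName=spark-pvc
```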
Repository: spark
Updated Branches:
refs/heads/master 1bb60ab83 -> 2512a1d42
[SPARK-26121][STRUCTURED STREAMING] Allow users to define prefix of Kafka's
consumer group (group.id)
## What changes were proposed in this pull request?
Allow the Spark Structured Streaming user to specify the…
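Structured Streaming generates a unique Kafka consumer `group.id` per query; the change described here lets users control its prefix (useful when Kafka ACLs are scoped by group-name prefix). A sketch of the idea, where the function name and the default prefix are assumptions, not Spark's actual internals:

```python
import uuid

DEFAULT_PREFIX = "spark-kafka-source"  # assumed default prefix

def kafka_group_id(prefix: str = DEFAULT_PREFIX) -> str:
    """Combine a user-supplied prefix with a unique suffix, so the
    group.id stays collision-free while matching prefix-based ACLs."""
    return f"{prefix}-{uuid.uuid4()}"

gid = kafka_group_id("my-team")
# gid starts with "my-team-" and is unique per call
```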
Author: pwendell
Date: Mon Nov 26 13:52:00 2018
New Revision: 31091
Log:
Apache Spark 3.0.0-SNAPSHOT-2018_11_26_05_39-1bb60ab docs
[This commit notification would consist of 1756 parts,
which exceeds the limit of 50, so it was shortened to the summary.]
Repository: spark
Updated Branches:
refs/heads/master 6bb60b30f -> 1bb60ab83
[SPARK-26153][ML] GBT & RandomForest avoid unnecessary `first` job to compute
`numFeatures`
## What changes were proposed in this pull request?
Use the base models' `numFeatures` instead of a `first` job.
## How was this…
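The optimization is that an already-fitted base model knows the feature-vector width, so the ensemble can read it from model metadata instead of launching a Spark job just to fetch the first row of the dataset. A toy sketch (class and field names are illustrative, not Spark ML's API):

```python
class DecisionTreeModel:
    """Stand-in for a fitted base learner that records its input width."""
    def __init__(self, num_features: int):
        self.num_features = num_features

class EnsembleModel:
    """Stand-in for a GBT / RandomForest model built from base trees."""
    def __init__(self, trees):
        self.trees = trees

    @property
    def num_features(self) -> int:
        # Read the feature count from an already-fitted base model rather
        # than triggering a job to fetch the first row of the dataset.
        return self.trees[0].num_features

model = EnsembleModel([DecisionTreeModel(10), DecisionTreeModel(10)])
# model.num_features == 10, with no data scan
```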