[spark] branch master updated (8f0fef1 -> b94c67b)

2020-08-16  wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 8f0fef1  [SPARK-32399][SQL] Full outer shuffled hash join
 add b94c67b  Revert "[SPARK-32511][SQL] Add dropFields method to Column class"

No new revisions were added by this update.

Summary of changes:
 .../catalyst/expressions/complexTypeCreator.scala  | 110 ++-
 .../sql/catalyst/optimizer/ComplexTypes.scala  |  10 +-
 .../spark/sql/catalyst/optimizer/Optimizer.scala   |   7 +-
 .../{UpdateFields.scala => WithFields.scala}   |  16 +-
 ...ldsSuite.scala => CombineWithFieldsSuite.scala} |  41 +--
 .../sql/catalyst/optimizer/complexTypesSuite.scala |  81 ++---
 .../main/scala/org/apache/spark/sql/Column.scala   |  86 ++---
 .../apache/spark/sql/ColumnExpressionSuite.scala   | 351 +
 8 files changed, 123 insertions(+), 579 deletions(-)
 rename sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/{UpdateFields.scala => WithFields.scala} (68%)
 rename sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/{CombineUpdateFieldsSuite.scala => CombineWithFieldsSuite.scala} (65%)
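
For context on the reverted SPARK-32511 API: dropFields was intended to remove fields from a StructType column in a single call. The rough Scala sketch below shows how the same effect is commonly achieved with long-standing APIs (struct, col) by rebuilding the struct from the fields to keep; the object name DropStructFieldSketch and the columns a/b/c are illustrative, and this is not the reverted implementation.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, struct}

// Rough sketch: "drop" field b from struct column s by rebuilding the struct
// from the fields we keep. All names are illustrative.
object DropStructFieldSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("drop-struct-field-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val df = Seq((1, 2, 3)).toDF("a", "b", "c")
      .select(struct($"a", $"b", $"c").as("s"))

    // Keep s.a and s.c, effectively dropping s.b.
    val pruned = df.withColumn("s", struct(col("s.a").as("a"), col("s.c").as("c")))
    pruned.printSchema()

    spark.stop()
  }
}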


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (9a79bbc -> 8f0fef1)

2020-08-16  yamamuro
This is an automated email from the ASF dual-hosted git repository.

yamamuro pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 9a79bbc  [SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version
 add 8f0fef1  [SPARK-32399][SQL] Full outer shuffled hash join

No new revisions were added by this update.

Summary of changes:
 .../apache/spark/unsafe/map/BytesToBytesMap.java   |  70 ++
 .../unsafe/map/AbstractBytesToBytesMapSuite.java   |  27 ++-
 .../spark/sql/catalyst/optimizer/joins.scala   |  27 ++-
 .../org/apache/spark/sql/internal/SQLConf.scala|   5 +-
 .../spark/sql/execution/SparkStrategies.scala  |   6 +-
 .../spark/sql/execution/joins/HashJoin.scala   |   4 +-
 .../spark/sql/execution/joins/HashedRelation.scala | 175 ++-
 .../sql/execution/joins/ShuffledHashJoinExec.scala | 239 -
 .../spark/sql/execution/joins/ShuffledJoin.scala   |  23 +-
 .../sql/execution/joins/SortMergeJoinExec.scala|  20 --
 .../scala/org/apache/spark/sql/JoinSuite.scala |  66 ++
 .../sql/execution/joins/HashedRelationSuite.scala  |  79 +++
 12 files changed, 693 insertions(+), 48 deletions(-)
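
SPARK-32399 above lets the planner pick shuffled hash join for FULL OUTER equi-joins, which previously always fell back to sort-merge join. The minimal sketch below assumes a Spark build that contains this change and uses illustrative names; it leans on existing knobs (the shuffle_hash join hint, spark.sql.join.preferSortMergeJoin, spark.sql.autoBroadcastJoinThreshold) to steer the planner away from sort-merge and broadcast joins. Whether ShuffledHashJoin actually appears in the plan still depends on the build-side size estimate.

import org.apache.spark.sql.SparkSession

// Rough sketch: a FULL OUTER equi-join nudged toward shuffled hash join.
object FullOuterShuffledHashJoinSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("full-outer-shj-sketch")
      .master("local[*]")
      // Discourage sort-merge and broadcast so shuffled hash join is considered.
      .config("spark.sql.join.preferSortMergeJoin", "false")
      .config("spark.sql.autoBroadcastJoinThreshold", "-1")
      .getOrCreate()
    import spark.implicits._

    val left  = Seq((1, "a"), (2, "b")).toDF("id", "l")
    val right = Seq((2, "x"), (3, "y")).toDF("id", "r")

    // The shuffle_hash hint (Spark 3.0+) also asks for a shuffled hash join.
    val joined = left.hint("shuffle_hash").join(right, Seq("id"), "full_outer")
    joined.explain()   // check whether ShuffledHashJoin shows up in the plan
    joined.show()

    spark.stop()
  }
}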


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch branch-3.0 updated: [SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version

2020-08-16  srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
 new c4807ce  [SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version
c4807ce is described below

commit c4807ced3913a4d524892dc7bab502250687a43c
Author: Kousuke Saruta 
AuthorDate: Sun Aug 16 12:07:37 2020 -0500

[SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version

### What changes were proposed in this pull request?

This PR fixes the link to metrics.dropwizard.io in monitoring.md to refer the proper version of the library.

### Why are the changes needed?

There are links to metrics.dropwizard.io in monitoring.md but the link targets refer the version 3.1.0, while we use 4.1.1.
Now that users can create their own metrics using the dropwizard library, it's better to fix the links to refer the proper version.

### Does this PR introduce _any_ user-facing change?

Yes. The modified links refer the version 4.1.1.

### How was this patch tested?

Build the docs and visit all the modified links.

Closes #29426 from sarutak/fix-dropwizard-url.

Authored-by: Kousuke Saruta 
Signed-off-by: Sean Owen 
(cherry picked from commit 9a79bbc8b6e426e7b29a9f4867beb396014d8046)
Signed-off-by: Sean Owen 
---
 docs/monitoring.md | 8 
 pom.xml| 4 
 2 files changed, 8 insertions(+), 4 deletions(-)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index 1808167..4608a4e 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -718,7 +718,7 @@ The JSON end point is exposed at: `/applications/[app-id]/executors`, and the Pr
 The Prometheus endpoint is experimental and conditional to a configuration parameter: `spark.ui.prometheus.enabled=true` (the default is `false`).
 In addition, aggregated per-stage peak values of the executor memory metrics are written to the event log if
 `spark.eventLog.logStageExecutorMetrics` is true.  
-Executor memory metrics are also exposed via the Spark metrics system based on the Dropwizard metrics library.
+Executor memory metrics are also exposed via the Spark metrics system based on the [Dropwizard metrics library](http://metrics.dropwizard.io/4.1.1).
 A list of the available metrics, with a short description:
 
 
@@ -922,7 +922,7 @@ keep the paths consistent in both modes.
 # Metrics
 
 Spark has a configurable metrics system based on the
-[Dropwizard Metrics Library](http://metrics.dropwizard.io/).
+[Dropwizard Metrics Library](http://metrics.dropwizard.io/4.1.1).
 This allows users to report Spark metrics to a variety of sinks including HTTP, JMX, and CSV
 files. The metrics are generated by sources embedded in the Spark code base. They
 provide instrumentation for specific activities and Spark components.
@@ -1016,7 +1016,7 @@ activates the JVM source:
 ## List of available metrics providers 
 
 Metrics used by Spark are of multiple types: gauge, counter, histogram, meter and timer, 
-see [Dropwizard library documentation for details](https://metrics.dropwizard.io/3.1.0/getting-started/).
+see [Dropwizard library documentation for details](https://metrics.dropwizard.io/4.1.1/getting-started.html).
 The following list of components and metrics reports the name and some details about the available metrics,
 grouped per component instance and source namespace.
 The most common time of metrics used in Spark instrumentation are gauges and counters. 
@@ -1244,7 +1244,7 @@ Notes:
 `spark.metrics.staticSources.enabled` (default is true)
   - This source is available for driver and executor instances and is also available for other instances.  
   - This source provides information on JVM metrics using the 
-  [Dropwizard/Codahale Metric Sets for JVM instrumentation](https://metrics.dropwizard.io/3.1.0/manual/jvm/)
+  [Dropwizard/Codahale Metric Sets for JVM instrumentation](https://metrics.dropwizard.io/4.1.1/manual/jvm.html)
 and in particular the metric sets BufferPoolMetricSet, GarbageCollectorMetricSet and MemoryUsageGaugeSet. 
 
 ### Component instance = applicationMaster
diff --git a/pom.xml b/pom.xml
index e9ae204..1bf5de0 100644
--- a/pom.xml
+++ b/pom.xml
@@ -145,6 +145,10 @@
 0.9.5
 2.4.0
 2.0.8
+
 4.1.1
 1.8.2
 hadoop2
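
The monitoring.md text quoted above mentions the experimental Prometheus endpoint and the stage-level executor memory metrics. The rough sketch below enables both via configuration; it only sets keys quoted in the diff plus standard event-log settings, and the application name and event-log path are placeholders.

import org.apache.spark.sql.SparkSession

// Rough sketch: enable the experimental Prometheus endpoint and write
// stage-level executor memory metrics to the event log. Paths are placeholders.
object MonitoringConfigSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("monitoring-config-sketch")
      .master("local[*]")
      .config("spark.ui.prometheus.enabled", "true")           // experimental, default false
      .config("spark.eventLog.enabled", "true")                // required for event-log output
      .config("spark.eventLog.dir", "/tmp/spark-events")       // placeholder; directory must exist
      .config("spark.eventLog.logStageExecutorMetrics", "true")
      .getOrCreate()

    spark.range(1000000L).selectExpr("sum(id)").show()
    // While the app runs, executor metrics are scraped from the UI's
    // Prometheus endpoint described in monitoring.md.
    spark.stop()
  }
}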


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (c280c7f -> 9a79bbc)

2020-08-16  srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from c280c7f  [SPARK-32625][SQL] Log error message when falling back to interpreter mode
 add 9a79bbc  [SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version

No new revisions were added by this update.

Summary of changes:
 docs/monitoring.md | 8 
 pom.xml| 4 
 2 files changed, 8 insertions(+), 4 deletions(-)


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated: [SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version

2020-08-16  srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 9a79bbc  [SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version
9a79bbc is described below

commit 9a79bbc8b6e426e7b29a9f4867beb396014d8046
Author: Kousuke Saruta 
AuthorDate: Sun Aug 16 12:07:37 2020 -0500

[SPARK-32610][DOCS] Fix the link to metrics.dropwizard.io in monitoring.md to refer the proper version

### What changes were proposed in this pull request?

This PR fixes the link to metrics.dropwizard.io in monitoring.md to refer the proper version of the library.

### Why are the changes needed?

There are links to metrics.dropwizard.io in monitoring.md but the link targets refer the version 3.1.0, while we use 4.1.1.
Now that users can create their own metrics using the dropwizard library, it's better to fix the links to refer the proper version.

### Does this PR introduce _any_ user-facing change?

Yes. The modified links refer the version 4.1.1.

### How was this patch tested?

Build the docs and visit all the modified links.

Closes #29426 from sarutak/fix-dropwizard-url.

Authored-by: Kousuke Saruta 
Signed-off-by: Sean Owen 
---
 docs/monitoring.md | 8 
 pom.xml| 4 
 2 files changed, 8 insertions(+), 4 deletions(-)

diff --git a/docs/monitoring.md b/docs/monitoring.md
index 5fdf308..31fc160 100644
--- a/docs/monitoring.md
+++ b/docs/monitoring.md
@@ -758,7 +758,7 @@ The JSON end point is exposed at: `/applications/[app-id]/executors`, and the Pr
 The Prometheus endpoint is experimental and conditional to a configuration parameter: `spark.ui.prometheus.enabled=true` (the default is `false`).
 In addition, aggregated per-stage peak values of the executor memory metrics are written to the event log if
 `spark.eventLog.logStageExecutorMetrics` is true.  
-Executor memory metrics are also exposed via the Spark metrics system based on the Dropwizard metrics library.
+Executor memory metrics are also exposed via the Spark metrics system based on the [Dropwizard metrics library](http://metrics.dropwizard.io/4.1.1).
 A list of the available metrics, with a short description:
 
 
@@ -962,7 +962,7 @@ keep the paths consistent in both modes.
 # Metrics
 
 Spark has a configurable metrics system based on the
-[Dropwizard Metrics Library](http://metrics.dropwizard.io/).
+[Dropwizard Metrics Library](http://metrics.dropwizard.io/4.1.1).
 This allows users to report Spark metrics to a variety of sinks including HTTP, JMX, and CSV
 files. The metrics are generated by sources embedded in the Spark code base. They
 provide instrumentation for specific activities and Spark components.
@@ -1056,7 +1056,7 @@ activates the JVM source:
 ## List of available metrics providers 
 
 Metrics used by Spark are of multiple types: gauge, counter, histogram, meter and timer, 
-see [Dropwizard library documentation for details](https://metrics.dropwizard.io/3.1.0/getting-started/).
+see [Dropwizard library documentation for details](https://metrics.dropwizard.io/4.1.1/getting-started.html).
 The following list of components and metrics reports the name and some details about the available metrics,
 grouped per component instance and source namespace.
 The most common time of metrics used in Spark instrumentation are gauges and counters. 
@@ -1284,7 +1284,7 @@ Notes:
 `spark.metrics.staticSources.enabled` (default is true)
   - This source is available for driver and executor instances and is also available for other instances.  
   - This source provides information on JVM metrics using the 
-  [Dropwizard/Codahale Metric Sets for JVM instrumentation](https://metrics.dropwizard.io/3.1.0/manual/jvm/)
+  [Dropwizard/Codahale Metric Sets for JVM instrumentation](https://metrics.dropwizard.io/4.1.1/manual/jvm.html)
 and in particular the metric sets BufferPoolMetricSet, GarbageCollectorMetricSet and MemoryUsageGaugeSet. 
 
 ### Component instance = applicationMaster
diff --git a/pom.xml b/pom.xml
index e414835..23de569 100644
--- a/pom.xml
+++ b/pom.xml
@@ -145,6 +145,10 @@
 0.9.5
 2.4.0
 2.0.8
+
 4.1.1
 1.8.2
 hadoop2
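
The Metrics section quoted above notes that the Dropwizard-based metrics system can report to sinks such as HTTP, JMX, and CSV. As a rough sketch (not part of this commit), the snippet below wires the built-in CSV sink through the spark.metrics.conf.* configuration prefix; the output directory and polling period are placeholders.

import org.apache.spark.sql.SparkSession

// Rough sketch: route Spark's Dropwizard metrics to the built-in CSV sink
// via the spark.metrics.conf.* property prefix. Values are placeholders.
object CsvMetricsSinkSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("csv-metrics-sink-sketch")
      .master("local[*]")
      .config("spark.metrics.conf.*.sink.csv.class", "org.apache.spark.metrics.sink.CsvSink")
      .config("spark.metrics.conf.*.sink.csv.period", "10")
      .config("spark.metrics.conf.*.sink.csv.unit", "seconds")
      .config("spark.metrics.conf.*.sink.csv.directory", "/tmp/spark-metrics")  // must exist
      .getOrCreate()

    spark.range(1000L).count()  // generate some activity so metrics get reported
    Thread.sleep(15000)         // allow at least one CSV polling period to elapse
    spark.stop()
  }
}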


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r40991 - in /dev/spark/v3.0.1-rc1-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/parqu

2020-08-16  ruifengz
Author: ruifengz
Date: Sun Aug 16 07:27:24 2020
New Revision: 40991

Log:
Apache Spark v3.0.1-rc1 docs


[This commit notification would consist of 1923 parts, which exceeds the limit of 50 ones, so it was shortened to the summary.]

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



svn commit: r40990 - /dev/spark/v3.0.1-rc1-bin/

2020-08-16  ruifengz
Author: ruifengz
Date: Sun Aug 16 06:22:52 2020
New Revision: 40990

Log:
Apache Spark v3.0.1-rc1

Added:
dev/spark/v3.0.1-rc1-bin/
dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz   (with props)
dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.asc
dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.sha512
dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz   (with props)
dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz.asc
dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz.sha512
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop2.7-hive1.2.tgz   (with props)
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop2.7-hive1.2.tgz.asc
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop2.7-hive1.2.tgz.sha512
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop2.7.tgz   (with props)
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop2.7.tgz.asc
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop2.7.tgz.sha512
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop3.2.tgz   (with props)
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop3.2.tgz.asc
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-hadoop3.2.tgz.sha512
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-without-hadoop.tgz   (with props)
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-without-hadoop.tgz.asc
dev/spark/v3.0.1-rc1-bin/spark-3.0.1-bin-without-hadoop.tgz.sha512
dev/spark/v3.0.1-rc1-bin/spark-3.0.1.tgz   (with props)
dev/spark/v3.0.1-rc1-bin/spark-3.0.1.tgz.asc
dev/spark/v3.0.1-rc1-bin/spark-3.0.1.tgz.sha512

Added: dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz
==============================================================================
Binary file - no diff available.

Propchange: dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz
------------------------------------------------------------------------------
svn:mime-type = application/octet-stream

Added: dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.asc
==============================================================================
--- dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.asc (added)
+++ dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.asc Sun Aug 16 06:22:52 2020
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+
+iQJIBAABCgAyFiEEUUb73EuQdE6pSANXleDuOM+Y+fQFAl84v78UHHJ1aWZlbmd6
+QGFwYWNoZS5vcmcACgkQleDuOM+Y+fS+ERAAraGrdu/MwQF+Cp8JqK00z0scBgTl
+e9H6eDHMSxphMNRsnzKdhrfInNRcoH0sovksy2CTvNINNBlFnMChXDsLvt+ArBSr
+F+3yei7rxtiTl2EaRw71QDXiXV9Bf2GTENPwO2PWeuO9FiLFmJ/BxM6NjC38oFUZ
+pKy5Wttbhg+xerp6geCe0OuvqspeukkyyTrb6UXoWcVprOxJZBfBBbHiG0dZ0XzU
+IbRlerx0SDbb7fzH1PI2RzSma3r9yYGUZKaleimauHLlSxySA32KrQPZRf/454cZ
+QC4skHpu76rZjQhznaRzSKY6BbqAuCsHogM9O535q0nIUwtbdEJ/09Vr/uNMY6P9
+FOzvkRgV18WGeXaT/qJKwdQJqnFCx5/TCDZxJFlh99UDKHj4vq4J0tt2Iru5X7Eo
+EcuGHkTxWFWZNTvJFodMUQlHzPZoXrmsJYNOh1AkNbsOQ/Gl4DFetoKHUJWgCMqF
+3X97cmNPvJk2SjgMP9U81eeHM3+q7uvAXgvfS/rj/JXt7FEQGBTBpVrf6phYYU8y
+AVyShost5WSIPu5t90xX9Y97MsardX+f1hWBD9JRo/mvxGbnYnB/bxaNtwmW9FqH
+yHpXvs2I1hNXiHd19e+zkdOxtWFbdcZ8CGBq6/VlYirVNyf2gOtmflajUkRAvh2M
+PUf2bs+Jl5eROro=
+=XzIk
+-END PGP SIGNATURE-

Added: dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.sha512
==============================================================================
--- dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.sha512 (added)
+++ dev/spark/v3.0.1-rc1-bin/SparkR_3.0.1.tar.gz.sha512 Sun Aug 16 06:22:52 2020
@@ -0,0 +1,3 @@
+SparkR_3.0.1.tar.gz: 15ADB5A1 1CAAF630 8DB8AC2A 02FC13CD 53E31AE4 4463DDC2
+ 1F49D097 7AB2F06F 9F907B53 7CE47992 586C1852 69EB01A1
+ EE06E194 13A2FED6 416F9A97 48E836C4

Added: dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz
==============================================================================
Binary file - no diff available.

Propchange: dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz
------------------------------------------------------------------------------
svn:mime-type = application/octet-stream

Added: dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz.asc
==============================================================================
--- dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz.asc (added)
+++ dev/spark/v3.0.1-rc1-bin/pyspark-3.0.1.tar.gz.asc Sun Aug 16 06:22:52 2020
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+
+iQJIBAABCgAyFiEEUUb73EuQdE6pSANXleDuOM+Y+fQFAl84v8EUHHJ1aWZlbmd6
+QGFwYWNoZS5vcmcACgkQleDuOM+Y+fShBw/+LrxuLvkXVOH9m5s3P8l7oZn8/W7f
+DhD2j+vG8Fy9GZkfpi5199gpgkaQCRKgaOTZinEG/PD4iITEDMbGj50T5ma//+4c
+Czg/j7Z7NAMb6EqnKQbjtm9YAtX6jaYvckQARj0tLURAQV5BBsExGPbt4rponR92
+BnETSDhKhbe61ThuJ7pU5Gkr5YHZunLG2dMgUpOXzcP4NIDM02LW7Hl27GcCy6JS
+LGomgcxq1YlwcRAHAQU06dnztiIvMwcYVPdloJCQplEFxDk0zRBzYeAT6EVz8DgX
+uMKgIaxOl6v0Vb4Kfz+nrNKB5o5Tbywi+rFyYzshISm+WY+/Pr/3jgFAgLveYa6l
+67h48YTqr8kQHne4LkcLnqXcuTgYAYNbGlabR/xFrfphTPfp8Lv7ZA8/hFcwp9MO
+6Zzz9sthk8+Phs/qFqpUFqkFie8A9nH8UoFwpGeGWXSVTCXrOatZZksiclXHRsiB
+rbwW4Moo887O5m38eDGK60uF/+06q/nF6NZZ0dM1zWbB3ZG47rKHTMB2BHqG/sw5
+sSizbZjB+7lSfFU6HO7u3AuOYfXjQ3Yuxc+J8vLe+Z1GBBsjilLv0bRVQrijS6EQ
+5gF35feXw/kochMncJfKyPJthlfS+4YIbvGHM2G3t7T2cqWNUsa4uH+kxA5OX4w+