Repository: spark
Updated Branches:
refs/heads/master 88661747f -> 22b111ef9
[SPARK-21902][CORE] Print root cause for BlockManager#doPut
## What changes were proposed in this pull request?
As the logging below shows, the actual exception is hidden when removeBlockInternal
throws an exception.
Repository: spark
Updated Branches:
refs/heads/master 352bea545 -> 1da5822e6
[SPARK-21934][CORE] Expose Shuffle Netty memory usage to MetricsSystem
## What changes were proposed in this pull request?
This is a followup work of SPARK-9104 to expose the Netty memory usage to
MetricsSystem.
Repository: spark
Updated Branches:
refs/heads/master 4e6fc6901 -> 4b88393cb
[SPARK-21922] Fix duration always updating when task failed but status is still
RUNNING
## What changes were proposed in this pull request?
When the driver quits abnormally, which causes executor shutdown and task
Repository: spark
Updated Branches:
refs/heads/master c66d64b3d -> 94f7e046a
[SPARK-22030][CORE] GraphiteSink fails to re-connect to Graphite instances
behind an ELB or any other auto-scaled LB
## What changes were proposed in this pull request?
Upgrade codahale metrics library so that
Repository: spark
Updated Branches:
refs/heads/master 313c6ca43 -> cd5d0f337
[SPARK-11574][CORE] Add metrics StatsD sink
This patch adds statsd sink to the current metrics system in spark core.
Author: Xiaofeng Lin
Closes #9518 from xflin/statsd.
Change-Id:
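For context on what a StatsD sink emits: the StatsD wire protocol is one plain-text line per metric sent over UDP. A minimal sketch of the line format (illustrative helper names, not the actual sink code):

```scala
// Minimal sketch of the StatsD line protocol a metrics sink emits.
// Object and method names here are illustrative, not Spark's sink code.
object StatsdFormat {
  // Gauges are "name:value|g"; counters are "name:value|c".
  def gauge(name: String, value: Long): String = s"$name:$value|g"
  def counter(name: String, value: Long): String = s"$name:$value|c"
}
```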
Repository: spark
Updated Branches:
refs/heads/master 9e451bcf3 -> 6a2325448
[SPARK-18061][THRIFTSERVER] Add spnego auth support for ThriftServer
thrift/http protocol
Spark ThriftServer doesn't support SPNEGO auth for the thrift/http protocol, which
is mainly used in the Knox + ThriftServer scenario.
Repository: spark
Updated Branches:
refs/heads/master e00f1a1da -> c26976fe1
[SPARK-21939][TEST] Use TimeLimits instead of Timeouts
Since ScalaTest 3.0.0, `org.scalatest.concurrent.Timeouts` is deprecated.
This PR replaces the deprecated one with `org.scalatest.concurrent.TimeLimits`.
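Both the deprecated trait and its replacement provide `failAfter`-style time limits in tests. As a stdlib-only analogue of that idea (not ScalaTest itself; the PR merely swaps the trait name):

```scala
import java.util.concurrent.TimeoutException
import scala.concurrent.{Await, Future, Promise}
import scala.concurrent.duration._

// Stdlib sketch of a failAfter-style time limit: returns true if the
// future did not complete within `limit`. Illustrative only.
def timesOut[T](f: Future[T], limit: FiniteDuration): Boolean =
  try { Await.result(f, limit); false }
  catch { case _: TimeoutException => true }
```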
Repository: spark
Updated Branches:
refs/heads/branch-2.2 42e172744 -> 12a74e352
[SPARK-22135][MESOS] metrics in spark-dispatcher not being registered properly
## What changes were proposed in this pull request?
Fix a trivial bug with how metrics are registered in the mesos dispatcher. Bug
Repository: spark
Updated Branches:
refs/heads/master 3b117d631 -> f20be4d70
[SPARK-22135][MESOS] metrics in spark-dispatcher not being registered properly
## What changes were proposed in this pull request?
Fix a trivial bug with how metrics are registered in the mesos dispatcher. Bug
Repository: spark
Updated Branches:
refs/heads/master 7bf4da8a3 -> 3b117d631
[SPARK-22123][CORE] Add latest failure reason for task set blacklist
## What changes were proposed in this pull request?
This patch adds the latest failure reason for task set blacklist, which can be
shown on the Spark UI.
Repository: spark
Updated Branches:
refs/heads/master 74daf622d -> d2b8b63b9
[SPARK-20785][WEB-UI][SQL] Spark should provide jump links and add (count) in
the SQL web UI.
## What changes were proposed in this pull request?
Propose:
provide links that jump to Running Queries, Completed
Repository: spark-website
Updated Branches:
refs/heads/asf-site 8f64443a4 -> 434db70b4
Update committer page
Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/434db70b
Tree:
Repository: spark
Updated Branches:
refs/heads/master cd5d0f337 -> 4482ff23a
[SPARK-17321][YARN] Avoid writing shuffle metadata to disk if NM recovery is
disabled
In the current code, if NM recovery is not enabled then `YarnShuffleService`
will write shuffle metadata to NM local dir-1, if
Repository: spark
Updated Branches:
refs/heads/master c998a2ae0 -> fe7b219ae
[SPARK-22074][CORE] Task killed by other attempt task should not be resubmitted
## What changes were proposed in this pull request?
As the detail scenario described in
Repository: spark
Updated Branches:
refs/heads/master 6f1d0dea1 -> dc2714da5
[SPARK-22290][CORE] Avoid creating Hive delegation tokens when not necessary.
Hive delegation tokens are only needed when the Spark driver has no access
to the kerberos TGT. That happens only in two situations:
-
Repository: spark
Updated Branches:
refs/heads/master 556b5d215 -> 96798d14f
[SPARK-22172][CORE] Worker hangs when the external shuffle service port is
already in use
## What changes were proposed in this pull request?
Handling the NonFatal exceptions while starting the external shuffle
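The shape of such NonFatal handling, sketched with a hypothetical helper (not the actual Worker code): catch non-fatal startup errors, e.g. a BindException when the port is taken, and fail loudly instead of leaving the caller hung.

```scala
import scala.util.control.NonFatal

// Illustrative sketch: surface NonFatal startup failures (such as the
// shuffle-service port already being bound) as an explicit error.
def startService[T](name: String)(start: () => T): Either[String, T] =
  try Right(start())
  catch { case NonFatal(e) => Left(s"Failed to start $name: ${e.getMessage}") }
```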
Repository: spark
Updated Branches:
refs/heads/master 592cfeab9 -> 3073344a2
[SPARK-21840][CORE] Add trait that allows conf to be directly set in
application.
Currently SparkSubmit uses system properties to propagate configuration to
applications. This makes it hard to implement features
Repository: spark
Updated Branches:
refs/heads/master 0c23e254c -> 7f1b6b182
[SPARK-24136][SS] Fix MemoryStreamDataReader.next to skip sleeping if record is
available
## What changes were proposed in this pull request?
Avoid unnecessary sleep (10 ms) in each invocation of
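The general shape of that fix, as an assumed sketch (not the actual MemoryStreamDataReader code): sleep only when no record is available, rather than on every invocation.

```scala
import java.util.concurrent.ConcurrentLinkedQueue

// Illustrative poll loop: return immediately when a record is queued;
// sleep only while the queue is empty.
def nextRecord[T <: AnyRef](queue: ConcurrentLinkedQueue[T], sleepMs: Long): T = {
  var rec = queue.poll()
  while (rec == null) {
    Thread.sleep(sleepMs)
    rec = queue.poll()
  }
  rec
}
```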
Repository: spark
Updated Branches:
refs/heads/master 80c6d35a3 -> 4a2b15f0a
[SPARK-24241][SUBMIT] Do not fail fast when dynamic resource allocation enabled
with 0 executor
## What changes were proposed in this pull request?
```
~/spark-2.3.0-bin-hadoop2.7$ bin/spark-sql --num-executors 0
Repository: spark
Updated Branches:
refs/heads/master cd12c5c3e -> 05eb19b6e
[SPARK-24188][CORE] Restore "/version" API endpoint.
It was missing the jax-rs annotation.
Author: Marcelo Vanzin
Closes #21245 from vanzin/SPARK-24188.
Change-Id:
Repository: spark
Updated Branches:
refs/heads/branch-2.3 4dc6719e9 -> aba52f449
[SPARK-24188][CORE] Restore "/version" API endpoint.
It was missing the jax-rs annotation.
Author: Marcelo Vanzin
Closes #21245 from vanzin/SPARK-24188.
Change-Id:
Repository: spark
Updated Branches:
refs/heads/master 8aa1d7b0e -> 109935fc5
[SPARK-23830][YARN] added check to ensure main method is found
## What changes were proposed in this pull request?
When a user specifies the wrong class -- or, in fact, a class instead of an
object -- Spark throws
Repository: spark
Updated Branches:
refs/heads/master 8614edd44 -> 1fb46f30f
[SPARK-23688][SS] Refactor tests away from rate source
## What changes were proposed in this pull request?
Replace rate source with memory source in continuous mode test suite. Keep
using "rate" source if the tests
Repository: spark
Updated Branches:
refs/heads/master ad94e8592 -> 4df51361a
[SPARK-22732][SS][FOLLOW-UP] Fix MemorySinkV2 toString error
## What changes were proposed in this pull request?
Fix `MemorySinkV2` toString() error
## How was this patch tested?
N/A
Author: Yuming Wang
Repository: spark
Updated Branches:
refs/heads/master 23db600c9 -> aca65c63c
[SPARK-23991][DSTREAMS] Fix data loss when WAL write fails in
allocateBlocksToBatch
When blocks are being allocated to a batch and the WAL write fails, the
blocks will be removed from the received block queue.
Repository: spark
Updated Branches:
refs/heads/branch-2.3 fec43fe1b -> 49a6c2b91
[SPARK-23991][DSTREAMS] Fix data loss when WAL write fails in
allocateBlocksToBatch
When blocks are being allocated to a batch and the WAL write fails, the
blocks will be removed from the received block
Repository: spark
Updated Branches:
refs/heads/master 21e1fc7d4 -> 2c9c8629b
[MINOR][YARN] Add YARN-specific credential providers in debug logging message
This PR adds a debug log for YARN-specific credential providers, which are
loaded via the service loader mechanism.
It took me a while to
Author: jshao
Date: Mon Jul 2 12:18:41 2018
New Revision: 27854
Log:
Update KEYS
Modified:
dev/spark/KEYS
Modified: dev/spark/KEYS
==
--- dev/spark/KEYS (original)
+++ dev/spark/KEYS Mon Jul 2 12:18:41 2018
Repository: spark
Updated Branches:
refs/heads/master 4c059ebc6 -> c7967c604
[SPARK-24418][BUILD] Upgrade Scala to 2.11.12 and 2.12.6
## What changes were proposed in this pull request?
Scala is upgraded to `2.11.12` and `2.12.6`.
We used `loadFIles()` in `ILoop` as a hook to initialize the
Repository: spark
Updated Branches:
refs/heads/master e4c91c089 -> bf4352ca6
[SPARK-24110][THRIFT-SERVER] Avoid UGI.loginUserFromKeytab in STS
## What changes were proposed in this pull request?
Spark ThriftServer will call UGI.loginUserFromKeytab twice in initialization.
This is
Repository: spark
Updated Branches:
refs/heads/branch-2.3 096defdd7 -> 07ec75ca0
[SPARK-24062][THRIFT SERVER] Fix SASL encryption cannot be enabled issue in
thrift server
## What changes were proposed in this pull request?
For the details of the exception please see
Repository: spark
Updated Branches:
refs/heads/master cd10f9df8 -> ffaf0f9fd
[SPARK-24062][THRIFT SERVER] Fix SASL encryption cannot be enabled issue in
thrift server
## What changes were proposed in this pull request?
For the details of the exception please see
Repository: spark
Updated Branches:
refs/heads/master ca2a780e7 -> 57accf6e3
[SPARK-22319][CORE] call loginUserFromKeytab before accessing hdfs
In `SparkSubmit`, call `loginUserFromKeytab` before attempting to make RPC
calls to the NameNode.
I manually tested this patch by:
1. Confirming
Repository: spark
Updated Branches:
refs/heads/branch-2.2 f8c83fdc5 -> bf8163f5b
[SPARK-22319][CORE][BACKPORT-2.2] call loginUserFromKeytab before accessing hdfs
In SparkSubmit, call loginUserFromKeytab before attempting to make RPC calls to
the NameNode.
Same as
Repository: spark
Updated Branches:
refs/heads/master 9b33dfc40 -> a6647ffbf
[SPARK-22587] Spark job fails if fs.defaultFS and application jar are different
url
## What changes were proposed in this pull request?
The filesystem comparison does not consider the authority of the URI. This is
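The idea behind the fix, sketched with a hypothetical helper (not Spark's actual code): two URIs refer to the same filesystem only when both scheme and authority match, so `hdfs://nn1/...` is not conflated with `hdfs://nn2/...`.

```scala
import java.net.URI

// Illustrative check: compare scheme AND authority, case-insensitively.
def sameFileSystem(a: URI, b: URI): Boolean = {
  def norm(s: String) = Option(s).map(_.toLowerCase)
  norm(a.getScheme) == norm(b.getScheme) &&
    norm(a.getAuthority) == norm(b.getAuthority)
}
```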
Repository: spark
Updated Branches:
refs/heads/branch-2.3 551ccfba5 -> 317b0aaed
[SPARK-22587] Spark job fails if fs.defaultFS and application jar are different
url
## What changes were proposed in this pull request?
The filesystem comparison does not consider the authority of the URI. This is
Repository: spark
Updated Branches:
refs/heads/branch-2.3 7520491bf -> 5781fa79e
[SPARK-22976][CORE] Cluster mode driver dir removed while running
## What changes were proposed in this pull request?
The cleanup logic on the worker previously determined the liveness of a
particular
Repository: spark
Updated Branches:
refs/heads/master 602c6d82d -> 11daeb833
[SPARK-22976][CORE] Cluster mode driver dir removed while running
## What changes were proposed in this pull request?
The cleanup logic on the worker previously determined the liveness of a
particular application
Repository: spark
Updated Branches:
refs/heads/master ec2289761 -> 60175e959
[MINOR][DOC] Fix the path to the examples jar
## What changes were proposed in this pull request?
The example jar file is now in ./examples/jars directory of Spark distribution.
Author: Arseniy Tashoyan
Repository: spark
Updated Branches:
refs/heads/branch-2.3 57c320a0d -> cf078a205
[MINOR][DOC] Fix the path to the examples jar
## What changes were proposed in this pull request?
The example jar file is now in ./examples/jars directory of Spark distribution.
Author: Arseniy Tashoyan
Repository: spark
Updated Branches:
refs/heads/master 70a68b328 -> d1721816d
[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore
## What changes were proposed in this pull request?
When using the Kubernetes cluster-manager and spawning a Streaming workload, it
is important
Repository: spark
Updated Branches:
refs/heads/master ca04c3ff2 -> 8c6a9c90a
[SPARK-23279][SS] Avoid triggering distributed job for Console sink
## What changes were proposed in this pull request?
Console sink will redistribute collected local data and trigger a distributed
job in each
Repository: spark
Updated Branches:
refs/heads/branch-2.3 b8778321b -> ab5a51055
[SPARK-23279][SS] Avoid triggering distributed job for Console sink
## What changes were proposed in this pull request?
Console sink will redistribute collected local data and trigger a distributed
job in each
Repository: spark
Updated Branches:
refs/heads/master b6b50efc8 -> 4b7cd479a
Revert "[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore"
This reverts commit d1721816d26bedee3c72eeb75db49da500568376.
The patch is not fully tested and out-of-date. So revert it.
Project:
Repository: spark
Updated Branches:
refs/heads/master f235df66a -> 31bd1dab1
[SPARK-23088][CORE] History server not showing incomplete/running applications
## What changes were proposed in this pull request?
History server not showing incomplete/running applications when
Repository: spark
Updated Branches:
refs/heads/branch-2.3 136588e95 -> 9fb70f458
[SPARK-24948][SHS][BACKPORT-2.3] Delegate check access permissions to the file
system
## What changes were proposed in this pull request?
In `SparkHadoopUtil.checkAccessPermission`, we consider only basic
Repository: spark
Updated Branches:
refs/heads/branch-2.2 a5624c7ae -> 53ac8504b
[SPARK-24948][SHS][BACKPORT-2.2] Delegate check access permissions to the file
system
## What changes were proposed in this pull request?
In `SparkHadoopUtil.checkAccessPermission`, we consider only basic
Repository: spark
Updated Branches:
refs/heads/branch-2.3 787790b3c -> 29a040361
Preparing Spark release v2.3.2-rc5
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4dc82259
Tree:
Repository: spark
Updated Tags: refs/tags/v2.3.2-rc5 [created] 4dc82259d
-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
Preparing development version 2.3.3-SNAPSHOT
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/29a04036
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/29a04036
Diff:
Author: jshao
Date: Tue Aug 14 06:54:52 2018
New Revision: 28707
Log:
Apache Spark v2.3.2-rc5 docs
[This commit notification would consist of 1446 parts,
which exceeds the limit of 50 ones, so it was shortened to the summary
Repository: spark
Updated Tags: refs/tags/v2.3.2-rc4 [created] 6930f4885
Preparing development version 2.3.3-SNAPSHOT
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e66f3f9b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e66f3f9b
Diff:
Repository: spark
Updated Branches:
refs/heads/branch-2.3 b426ec583 -> e66f3f9b1
Preparing Spark release v2.3.2-rc4
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6930f488
Tree:
Author: jshao
Date: Fri Aug 10 04:58:55 2018
New Revision: 28647
Log:
Apache Spark v2.3.2-rc4
Added:
dev/spark/v2.3.2-rc4-bin/
dev/spark/v2.3.2-rc4-bin/SparkR_2.3.2.tar.gz (with props)
dev/spark/v2.3.2-rc4-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc4-bin/SparkR_2.3.2
Author: jshao
Date: Fri Aug 10 05:50:52 2018
New Revision: 28649
Log:
Apache Spark v2.3.2-rc4 docs
[This commit notification would consist of 1446 parts,
which exceeds the limit of 50 ones, so it was shortened to the summary
Author: jshao
Date: Tue Aug 14 04:02:50 2018
New Revision: 28702
Log:
Apache Spark v2.3.2-rc5
Added:
dev/spark/v2.3.2-rc5-bin/
dev/spark/v2.3.2-rc5-bin/SparkR_2.3.2.tar.gz (with props)
dev/spark/v2.3.2-rc5-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc5-bin/SparkR_2.3.2
Repository: spark
Updated Branches:
refs/heads/master 67e108daa -> 7e847646d
[SPARK-24307][CORE] Support reading remote cached partitions > 2gb
(1) Netty's ByteBuf cannot support data > 2gb. So to transfer data from a
ChunkedByteBuffer over the network, we use a custom version of
FileRegion
Repository: spark
Updated Branches:
refs/heads/master 7e847646d -> 7db81ac8a
[SPARK-24195][CORE] Ignore the files with "local" scheme in SparkContext.addFile
## What changes were proposed in this pull request?
In Spark, the "local" scheme means resources are already on the driver/executor
nodes,
Author: jshao
Date: Sun Jul 15 07:30:18 2018
New Revision: 28123
Log:
Apache Spark v2.3.2-rc3 docs
[This commit notification would consist of 1446 parts,
which exceeds the limit of 50 ones, so it was shortened to the summary
Author: jshao
Date: Sun Jul 15 03:04:30 2018
New Revision: 28118
Log:
Apache Spark v2.3.2-rc3
Added:
dev/spark/v2.3.2-rc3-bin/
dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz (with props)
dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2
Repository: spark
Updated Branches:
refs/heads/branch-2.3 9cf375f5b -> f9a2b0a87
Preparing Spark release v2.3.2-rc3
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b3726dad
Tree:
Preparing development version 2.3.3-SNAPSHOT
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f9a2b0a8
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f9a2b0a8
Diff:
Repository: spark
Updated Tags: refs/tags/v2.3.2-rc3 [created] b3726dadc
Repository: spark
Updated Branches:
refs/heads/master cfc3e1aaa -> d2436a852
[SPARK-24594][YARN] Introducing metrics for YARN
## What changes were proposed in this pull request?
In this PR metrics are introduced for YARN. As up to now there were no metrics
in the YARN module, a new metric
Repository: spark
Updated Branches:
refs/heads/master 3efdf3532 -> 15fff7903
[SPARK-24297][CORE] Fetch-to-disk by default for > 2gb
Fetch-to-mem is guaranteed to fail if the message is bigger than 2 GB,
so we might as well use fetch-to-disk in that case. The message includes
some metadata in
Repository: spark
Updated Branches:
refs/heads/master aa70a0a1a -> 515708d5f
[SPARK-25183][SQL] Spark HiveServer2 to use Spark ShutdownHookManager
## What changes were proposed in this pull request?
Switch `org.apache.hive.service.server.HiveServer2` to register its shutdown
callback with
Repository: spark
Updated Branches:
refs/heads/master 79c668942 -> e2c7e09f7
[SPARK-24646][CORE] Minor change to spark.yarn.dist.forceDownloadSchemes to
support wildcard '*'
## What changes were proposed in this pull request?
In the case of getting tokens via customized
Repository: spark
Updated Branches:
refs/heads/master a28900956 -> 6fe32869c
[SPARK-24678][SPARK-STREAMING] Give priority in use of 'PROCESS_LOCAL' for
spark-streaming
## What changes were proposed in this pull request?
Currently, `BlockRDD.getPreferredLocations` only gets the host info of
Repository: spark
Updated Branches:
refs/heads/branch-2.3 19542f5de -> 86457a16d
Preparing Spark release v2.3.2-rc2
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/307499e1
Tree:
Repository: spark
Updated Tags: refs/tags/v2.3.2-rc2 [created] 307499e1a
Preparing development version 2.3.3-SNAPSHOT
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/86457a16
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/86457a16
Diff:
Repository: spark
Updated Branches:
refs/heads/branch-2.3 64c72b4de -> 72eb97ce9
Preparing Spark release v2.3.2-rc1
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4df06b45
Tree:
Repository: spark
Updated Tags: refs/tags/v2.3.2-rc1 [created] 4df06b451
Preparing development version 2.3.3-SNAPSHOT
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/72eb97ce
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/72eb97ce
Diff:
Author: jshao
Date: Sun Jul 8 04:41:12 2018
New Revision: 27981
Log:
Apache Spark 2.3.2
Added:
dev/spark/v2.3.2-rc1-bin/
dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz (with props)
dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.asc
dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2
Author: jshao
Date: Sun Jul 8 07:45:04 2018
New Revision: 27983
Log:
Apache Spark v2.3.2-rc1 docs
[This commit notification would consist of 1446 parts,
which exceeds the limit of 50 ones, so it was shortened to the summary
Repository: spark
Updated Branches:
refs/heads/master c2632edeb -> ca83526de
[SPARK-23644][CORE][UI] Use absolute path for REST call in SHS
## What changes were proposed in this pull request?
SHS is using a relative path for the REST API call to get the list of the
applications.
Repository: spark
Updated Branches:
refs/heads/master ca83526de -> c95200048
[SPARK-23635][YARN] AM env variable should not overwrite same name env variable
set through spark.executorEnv.
## What changes were proposed in this pull request?
In the current Spark on YARN code, AM always will
Repository: spark
Updated Branches:
refs/heads/master 4c587eb48 -> 04e71c316
[MINOR][YARN] Add disable yarn.nodemanager.vmem-check-enabled option to
memLimitExceededLogMessage
My Spark application sometimes throws `Container killed by YARN for
exceeding memory limits`.
Even though I increased
Repository: spark
Updated Branches:
refs/heads/master 73f28530d -> c0964935d
[SPARK-23956][YARN] Use effective RPC port in AM registration
## What changes were proposed in this pull request?
We propose not to hard-code the RPC port in the AM registration.
## How was this patch tested?
Repository: spark
Updated Branches:
refs/heads/branch-2.3 130641102 -> 32bec6ca3
[SPARK-24014][PYSPARK] Add onStreamingStarted method to StreamingListener
## What changes were proposed in this pull request?
The `StreamingListener` on the PySpark side seems to lack the
`onStreamingStarted`
Repository: spark
Updated Branches:
refs/heads/master 0c94e48bc -> 8bb0df2c6
[SPARK-24014][PYSPARK] Add onStreamingStarted method to StreamingListener
## What changes were proposed in this pull request?
The `StreamingListener` on the PySpark side seems to lack the
`onStreamingStarted` method.
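The listener pattern being filled in can be sketched minimally (illustrative names only, not the actual PySpark or Scala-side classes): a trait with a no-op default so subclasses override just the callbacks they care about.

```scala
// Illustrative listener sketch: a default no-op callback that concrete
// listeners can override.
trait DemoStreamingListener {
  def onStreamingStarted(timeMs: Long): Unit = ()
}

class RecordingListener extends DemoStreamingListener {
  var startedAt: Option[Long] = None
  override def onStreamingStarted(timeMs: Long): Unit = startedAt = Some(timeMs)
}
```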
Repository: spark
Updated Branches:
refs/heads/master 087fb3142 -> eb48edf9c
[SPARK-23787][TESTS] Fix file download test in SparkSubmitSuite for Hadoop 2.9.
This particular test assumed that Hadoop libraries did not support
http as a file system. Hadoop 2.9 does, so the test failed. The test
Repository: spark
Updated Branches:
refs/heads/master b34890119 -> df05fb63a
[SPARK-23743][SQL] Changed a comparison logic from containing 'slf4j' to
starting with 'org.slf4j'
## What changes were proposed in this pull request?
isSharedClass returns whether some classes can/should be shared or
Repository: spark
Updated Branches:
refs/heads/master b2edc30db -> 5fa438471
[SPARK-23361][YARN] Allow AM to restart after initial tokens expire.
Currently, the Spark AM relies on the initial set of tokens created by
the submission client to be able to talk to HDFS and other services that
Repository: spark
Updated Branches:
refs/heads/branch-2.3 5c1c03d08 -> 2f82c037d
[SPARK-23644][CORE][UI][BACKPORT-2.3] Use absolute path for REST call in SHS
## What changes were proposed in this pull request?
SHS is using a relative path for the REST API call to get the list of the
Repository: spark
Updated Branches:
refs/heads/master 61487b308 -> 745c8c090
[SPARK-23708][CORE] Correct comment for function addShutDownHook in
ShutdownHookManager
## What changes were proposed in this pull request?
Minor modification. The comment below is not right.
```
/**
* Adds a shutdown
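The priority-ordered shutdown-hook idea that comment documents can be sketched as follows (assumed semantics for illustration: higher-priority hooks run first; this is not the actual ShutdownHookManager implementation):

```scala
import scala.collection.mutable

// Illustrative hook registry: runAll() executes hooks in descending
// priority order.
class HookManager {
  private val hooks = mutable.ArrayBuffer.empty[(Int, () => Unit)]
  def addShutDownHook(priority: Int)(body: => Unit): Unit =
    hooks += ((priority, () => body))
  def runAll(): Unit = hooks.sortBy(h => -h._1).foreach(_._2())
}
```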
Repository: spark
Updated Tags: refs/tags/v2.3.2 [created] 02b510728
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/cov.html
--
diff --git a/site/docs/2.3.2/api/R/cov.html b/site/docs/2.3.2/api/R/cov.html
new file mode 100644
index 000..ec96abb
---
Repository: spark-website
Updated Branches:
refs/heads/asf-site 04a27dbf1 -> 546f35143
Empty commit to trigger ASF to GitHub sync
Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/546f3514
Tree:
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/match.html
--
diff --git a/site/docs/2.3.2/api/R/match.html b/site/docs/2.3.2/api/R/match.html
new file mode 100644
index 000..d405b90
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html
Repository: spark-website
Updated Branches:
refs/heads/asf-site 806a1bd52 -> 04a27dbf1
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/r/PairwiseRRDD.html
--
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachFunction.html
--
diff --git
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkStatusTracker.html
--
diff --git
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/write.jdbc.html
--
diff --git a/site/docs/2.3.2/api/R/write.jdbc.html
b/site/docs/2.3.2/api/R/write.jdbc.html
new file mode 100644
index
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/constant-values.html
--
diff --git a/site/docs/2.3.2/api/java/constant-values.html
b/site/docs/2.3.2/api/java/constant-values.html
new
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html
--
diff --git
a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html
--
diff --git a/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html