spark git commit: [SPARK-21902][CORE] Print root cause for BlockManager#doPut

2017-09-15 Thread jshao
Repository: spark Updated Branches: refs/heads/master 88661747f -> 22b111ef9 [SPARK-21902][CORE] Print root cause for BlockManager#doPut ## What changes were proposed in this pull request? As the logging below shows, the actual exception is hidden when removeBlockInternal throws an exception.
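The fix follows a familiar pattern: if the cleanup step itself throws, log that secondary failure but rethrow the original exception so the root cause of the failed put stays visible. A minimal sketch of the pattern (illustrative names, not Spark's actual BlockManager code):

```scala
// Illustrative sketch only: keep the original failure as the thrown exception and
// merely log any secondary failure from the cleanup step.
def putWithVisibleRootCause(put: () => Unit, removeBlockInternal: () => Unit): Unit = {
  try {
    put()
  } catch {
    case original: Throwable =>
      try {
        removeBlockInternal()
      } catch {
        case cleanup: Throwable =>
          // Log the cleanup failure, but do not let it mask the root cause.
          System.err.println(s"Failed to remove block after put error: ${cleanup.getMessage}")
      }
      throw original
  }
}
```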

spark git commit: [SPARK-21934][CORE] Expose Shuffle Netty memory usage to MetricsSystem

2017-09-20 Thread jshao
Repository: spark Updated Branches: refs/heads/master 352bea545 -> 1da5822e6 [SPARK-21934][CORE] Expose Shuffle Netty memory usage to MetricsSystem ## What changes were proposed in this pull request? This is follow-up work for SPARK-9104 to expose the Netty memory usage to the MetricsSystem.
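For context, Spark's MetricsSystem reports Codahale (Dropwizard) metrics through configurable sinks, so exposing a memory number usually amounts to registering a gauge. A hedged sketch of that idea; the `currentUsage` supplier is a placeholder, not the actual shuffle Netty allocator API this commit wires up:

```scala
import com.codahale.metrics.{Gauge, MetricRegistry}

// Register a gauge that reports a memory-usage value each time the sinks poll it.
// `currentUsage` stands in for whatever the shuffle server's Netty allocator exposes.
def registerShuffleMemoryGauge(registry: MetricRegistry, currentUsage: () => Long): Unit = {
  registry.register(MetricRegistry.name("shuffle", "usedMemoryBytes"), new Gauge[Long] {
    override def getValue: Long = currentUsage()
  })
}
```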

spark git commit: [SPARK-21922] Fix duration always updating when task failed but status is still RUN…

2017-09-14 Thread jshao
Repository: spark Updated Branches: refs/heads/master 4e6fc6901 -> 4b88393cb [SPARK-21922] Fix duration always updating when task failed but status is still RUN… …NING ## What changes were proposed in this pull request? When the driver quits abnormally, which causes executor shutdown, and the task

spark git commit: [SPARK-22030][CORE] GraphiteSink fails to re-connect to Graphite instances behind an ELB or any other auto-scaled LB

2017-09-18 Thread jshao
Repository: spark Updated Branches: refs/heads/master c66d64b3d -> 94f7e046a [SPARK-22030][CORE] GraphiteSink fails to re-connect to Graphite instances behind an ELB or any other auto-scaled LB ## What changes were proposed in this pull request? Upgrade codahale metrics library so that

spark git commit: [SPARK-11574][CORE] Add metrics StatsD sink

2017-08-30 Thread jshao
Repository: spark Updated Branches: refs/heads/master 313c6ca43 -> cd5d0f337 [SPARK-11574][CORE] Add metrics StatsD sink This patch adds a StatsD sink to the current metrics system in Spark core. Author: Xiaofeng Lin Closes #9518 from xflin/statsd. Change-Id:
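Like the other sinks, a StatsD sink is enabled through metrics.properties. A hypothetical configuration, assuming the sink follows the standard Spark metrics convention (exact property names may differ by version):

```
*.sink.statsd.class=org.apache.spark.metrics.sink.StatsdSink
*.sink.statsd.host=127.0.0.1
*.sink.statsd.port=8125
*.sink.statsd.period=10
*.sink.statsd.unit=seconds
*.sink.statsd.prefix=spark
```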

spark git commit: [SPARK-18061][THRIFTSERVER] Add spnego auth support for ThriftServer thrift/http protocol

2017-09-05 Thread jshao
Repository: spark Updated Branches: refs/heads/master 9e451bcf3 -> 6a2325448 [SPARK-18061][THRIFTSERVER] Add spnego auth support for ThriftServer thrift/http protocol Spark ThriftServer doesn't support spnego auth for the thrift/http protocol; this is mainly used for the Knox + ThriftServer scenario.

spark git commit: [SPARK-21939][TEST] Use TimeLimits instead of Timeouts

2017-09-07 Thread jshao
Repository: spark Updated Branches: refs/heads/master e00f1a1da -> c26976fe1 [SPARK-21939][TEST] Use TimeLimits instead of Timeouts Since ScalaTest 3.0.0, `org.scalatest.concurrent.Timeouts` is deprecated. This PR replaces the deprecated one with `org.scalatest.concurrent.TimeLimits`.
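A minimal sketch of the migration, assuming ScalaTest 3.0: `TimeLimits` provides the same `failAfter`, but now requires an implicit `Signaler` in scope (suite and test names below are illustrative):

```scala
import org.scalatest.FunSuite
import org.scalatest.concurrent.{Signaler, ThreadSignaler, TimeLimits}
import org.scalatest.time.SpanSugar._

class ExampleSuite extends FunSuite with TimeLimits {
  // Required by failAfter in ScalaTest 3.0; the deprecated Timeouts trait did not need it.
  implicit val signaler: Signaler = ThreadSignaler

  test("finishes within the time limit") {
    failAfter(10.seconds) {
      Thread.sleep(100) // the code under test
    }
  }
}
```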

spark git commit: [SPARK-22135][MESOS] metrics in spark-dispatcher not being registered properly

2017-09-28 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.2 42e172744 -> 12a74e352 [SPARK-22135][MESOS] metrics in spark-dispatcher not being registered properly ## What changes were proposed in this pull request? Fix a trivial bug with how metrics are registered in the mesos dispatcher. Bug

spark git commit: [SPARK-22135][MESOS] metrics in spark-dispatcher not being registered properly

2017-09-28 Thread jshao
Repository: spark Updated Branches: refs/heads/master 3b117d631 -> f20be4d70 [SPARK-22135][MESOS] metrics in spark-dispatcher not being registered properly ## What changes were proposed in this pull request? Fix a trivial bug with how metrics are registered in the mesos dispatcher. Bug

spark git commit: [SPARK-22123][CORE] Add latest failure reason for task set blacklist

2017-09-27 Thread jshao
Repository: spark Updated Branches: refs/heads/master 7bf4da8a3 -> 3b117d631 [SPARK-22123][CORE] Add latest failure reason for task set blacklist ## What changes were proposed in this pull request? This patch adds the latest failure reason for task set blacklist, which can be shown on the Spark UI

spark git commit: [SPARK-20785][WEB-UI][SQL] Spark should provide jump links and add (count) in the SQL web ui.

2017-09-27 Thread jshao
Repository: spark Updated Branches: refs/heads/master 74daf622d -> d2b8b63b9 [SPARK-20785][WEB-UI][SQL] Spark should provide jump links and add (count) in the SQL web ui. ## What changes were proposed in this pull request? Proposal: provide links that jump to Running Queries, Completed

spark-website git commit: Update committer page

2017-08-29 Thread jshao
Repository: spark-website Updated Branches: refs/heads/asf-site 8f64443a4 -> 434db70b4 Update committer page Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/434db70b Tree:

spark git commit: [SPARK-17321][YARN] Avoid writing shuffle metadata to disk if NM recovery is disabled

2017-08-30 Thread jshao
Repository: spark Updated Branches: refs/heads/master cd5d0f337 -> 4482ff23a [SPARK-17321][YARN] Avoid writing shuffle metadata to disk if NM recovery is disabled In the current code, if NM recovery is not enabled then `YarnShuffleService` will write shuffle metadata to NM local dir-1, if

spark git commit: [SPARK-22074][CORE] Task killed by other attempt task should not be resubmitted

2017-10-09 Thread jshao
Repository: spark Updated Branches: refs/heads/master c998a2ae0 -> fe7b219ae [SPARK-22074][CORE] Task killed by other attempt task should not be resubmitted ## What changes were proposed in this pull request? As the detailed scenario described in

spark git commit: [SPARK-22290][CORE] Avoid creating Hive delegation tokens when not necessary.

2017-10-19 Thread jshao
Repository: spark Updated Branches: refs/heads/master 6f1d0dea1 -> dc2714da5 [SPARK-22290][CORE] Avoid creating Hive delegation tokens when not necessary. Hive delegation tokens are only needed when the Spark driver has no access to the kerberos TGT. That happens only in two situations: -

spark git commit: [SPARK-22172][CORE] Worker hangs when the external shuffle service port is already in use

2017-11-01 Thread jshao
Repository: spark Updated Branches: refs/heads/master 556b5d215 -> 96798d14f [SPARK-22172][CORE] Worker hangs when the external shuffle service port is already in use ## What changes were proposed in this pull request? Handling the NonFatal exceptions while starting the external shuffle
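The gist is to treat a startup failure as fatal instead of leaving the worker half-started. A sketch of handling non-fatal startup errors (illustrative, not the actual Worker code):

```scala
import scala.util.control.NonFatal

// If the external shuffle service cannot start (e.g. its port is already in use),
// fail loudly and exit rather than leaving the worker hanging without a shuffle service.
def startShuffleServiceOrExit(start: () => Unit): Unit = {
  try {
    start()
  } catch {
    case NonFatal(e) =>
      System.err.println(s"Failed to start external shuffle service: ${e.getMessage}")
      System.exit(1)
  }
}
```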

spark git commit: [SPARK-21840][CORE] Add trait that allows conf to be directly set in application.

2017-10-26 Thread jshao
Repository: spark Updated Branches: refs/heads/master 592cfeab9 -> 3073344a2 [SPARK-21840][CORE] Add trait that allows conf to be directly set in application. Currently SparkSubmit uses system properties to propagate configuration to applications. This makes it hard to implement features
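Conceptually, the change lets an application receive the resolved SparkConf directly instead of reading JVM system properties. A hedged sketch of such an entry-point trait; the names here are illustrative, not necessarily the ones the commit introduces:

```scala
import org.apache.spark.SparkConf

// Illustrative entry-point trait: spark-submit hands the fully resolved configuration
// to the application instead of propagating it through system properties.
trait ConfAwareApplication {
  def start(args: Array[String], conf: SparkConf): Unit
}

class MyApp extends ConfAwareApplication {
  override def start(args: Array[String], conf: SparkConf): Unit = {
    // No System.getProperty("spark...") lookups needed here.
    println(conf.getOption("spark.app.name").getOrElse("unnamed"))
  }
}
```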

spark git commit: [SPARK-24136][SS] Fix MemoryStreamDataReader.next to skip sleeping if record is available

2018-05-04 Thread jshao
Repository: spark Updated Branches: refs/heads/master 0c23e254c -> 7f1b6b182 [SPARK-24136][SS] Fix MemoryStreamDataReader.next to skip sleeping if record is available ## What changes were proposed in this pull request? Avoid unnecessary sleep (10 ms) in each invocation of

spark git commit: [SPARK-24241][SUBMIT] Do not fail fast when dynamic resource allocation enabled with 0 executor

2018-05-15 Thread jshao
Repository: spark Updated Branches: refs/heads/master 80c6d35a3 -> 4a2b15f0a [SPARK-24241][SUBMIT] Do not fail fast when dynamic resource allocation enabled with 0 executor ## What changes were proposed in this pull request? ``` ~/spark-2.3.0-bin-hadoop2.7$ bin/spark-sql --num-executors 0

spark git commit: [SPARK-24188][CORE] Restore "/version" API endpoint.

2018-05-08 Thread jshao
Repository: spark Updated Branches: refs/heads/master cd12c5c3e -> 05eb19b6e [SPARK-24188][CORE] Restore "/version" API endpoint. It was missing the jax-rs annotation. Author: Marcelo Vanzin Closes #21245 from vanzin/SPARK-24188. Change-Id:
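For reference, JAX-RS only binds a method to an HTTP route when it carries the proper annotations. A hypothetical sketch of an annotated version endpoint (class and field names are made up for illustration):

```scala
import javax.ws.rs.{GET, Path, Produces}
import javax.ws.rs.core.MediaType

// Without @GET the method is never bound to an HTTP verb, so the route silently disappears.
case class VersionInfo(spark: String)

@Path("/version")
@Produces(Array(MediaType.APPLICATION_JSON))
class VersionResource {
  @GET
  def version(): VersionInfo = VersionInfo(org.apache.spark.SPARK_VERSION)
}
```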

spark git commit: [SPARK-24188][CORE] Restore "/version" API endpoint.

2018-05-08 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 4dc6719e9 -> aba52f449 [SPARK-24188][CORE] Restore "/version" API endpoint. It was missing the jax-rs annotation. Author: Marcelo Vanzin Closes #21245 from vanzin/SPARK-24188. Change-Id:

spark git commit: [SPARK-23830][YARN] added check to ensure main method is found

2018-04-27 Thread jshao
Repository: spark Updated Branches: refs/heads/master 8aa1d7b0e -> 109935fc5 [SPARK-23830][YARN] added check to ensure main method is found ## What changes were proposed in this pull request? When a user specifies the wrong class -- or, in fact, a class instead of an object -- Spark throws

spark git commit: [SPARK-23688][SS] Refactor tests away from rate source

2018-04-27 Thread jshao
Repository: spark Updated Branches: refs/heads/master 8614edd44 -> 1fb46f30f [SPARK-23688][SS] Refactor tests away from rate source ## What changes were proposed in this pull request? Replace rate source with memory source in continuous mode test suite. Keep using "rate" source if the tests

spark git commit: [SPARK-22732][SS][FOLLOW-UP] Fix MemorySinkV2 toString error

2018-04-28 Thread jshao
Repository: spark Updated Branches: refs/heads/master ad94e8592 -> 4df51361a [SPARK-22732][SS][FOLLOW-UP] Fix MemorySinkV2 toString error ## What changes were proposed in this pull request? Fix `MemorySinkV2` toString() error ## How was this patch tested? N/A Author: Yuming Wang

spark git commit: [SPARK-23991][DSTREAMS] Fix data loss when WAL write fails in allocateBlocksToBatch

2018-05-29 Thread jshao
Repository: spark Updated Branches: refs/heads/master 23db600c9 -> aca65c63c [SPARK-23991][DSTREAMS] Fix data loss when WAL write fails in allocateBlocksToBatch When blocks tried to get allocated to a batch and WAL write fails then the blocks will be removed from the received block queue.
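The essence of the fix is to treat the WAL write as the commit point: blocks are only considered allocated once the write succeeds, and are put back otherwise. A simplified sketch under that assumption (not the actual ReceivedBlockTracker code):

```scala
import scala.collection.mutable

// Dequeue all received blocks for the batch, but roll them back if the WAL write fails,
// so a failed write no longer drops data silently.
def allocateBlocksToBatch[B](queue: mutable.Queue[B], writeToWal: Seq[B] => Boolean): Seq[B] = {
  val blocks = queue.dequeueAll(_ => true)
  if (writeToWal(blocks)) {
    blocks
  } else {
    queue.enqueue(blocks: _*)
    throw new IllegalStateException("WAL write failed; blocks were returned to the queue")
  }
}
```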

spark git commit: [SPARK-23991][DSTREAMS] Fix data loss when WAL write fails in allocateBlocksToBatch

2018-05-29 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 fec43fe1b -> 49a6c2b91 [SPARK-23991][DSTREAMS] Fix data loss when WAL write fails in allocateBlocksToBatch When blocks tried to get allocated to a batch and WAL write fails then the blocks will be removed from the received block

spark git commit: [MINOR][YARN] Add YARN-specific credential providers in debug logging message

2018-05-31 Thread jshao
Repository: spark Updated Branches: refs/heads/master 21e1fc7d4 -> 2c9c8629b [MINOR][YARN] Add YARN-specific credential providers in debug logging message This PR adds a debugging log for YARN-specific credential providers which is loaded by service loader mechanism. It took me a while to
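Credential providers of this kind are discovered through java.util.ServiceLoader, so the added debug message simply lists what was actually loaded. A hypothetical sketch; the provider trait below is a stand-in, not necessarily Spark's real interface:

```scala
import java.util.ServiceLoader
import scala.collection.JavaConverters._

trait CredentialProviderStub { def serviceName: String }

// Collect the names of all providers the ServiceLoader mechanism picked up,
// e.g. for a log line such as "Loaded credential providers: hadoopfs, hive, hbase".
def loadedProviderNames(loader: ClassLoader): Seq[String] =
  ServiceLoader.load(classOf[CredentialProviderStub], loader).asScala.map(_.serviceName).toSeq
```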

svn commit: r27854 - /dev/spark/KEYS

2018-07-02 Thread jshao
Author: jshao Date: Mon Jul 2 12:18:41 2018 New Revision: 27854 Log: Update KEYS Modified: dev/spark/KEYS Modified: dev/spark/KEYS == --- dev/spark/KEYS (original) +++ dev/spark/KEYS Mon Jul 2 12:18:41 2018

spark git commit: [SPARK-24418][BUILD] Upgrade Scala to 2.11.12 and 2.12.6

2018-06-25 Thread jshao
Repository: spark Updated Branches: refs/heads/master 4c059ebc6 -> c7967c604 [SPARK-24418][BUILD] Upgrade Scala to 2.11.12 and 2.12.6 ## What changes were proposed in this pull request? Scala is upgraded to `2.11.12` and `2.12.6`. We used `loadFIles()` in `ILoop` as a hook to initialize the

spark git commit: [SPARK-24110][THRIFT-SERVER] Avoid UGI.loginUserFromKeytab in STS

2018-05-02 Thread jshao
Repository: spark Updated Branches: refs/heads/master e4c91c089 -> bf4352ca6 [SPARK-24110][THRIFT-SERVER] Avoid UGI.loginUserFromKeytab in STS ## What changes were proposed in this pull request? Spark ThriftServer will call UGI.loginUserFromKeytab twice in initialization. This is

spark git commit: [SPARK-24062][THRIFT SERVER] Fix SASL encryption cannot be enabled issue in thrift server

2018-04-25 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 096defdd7 -> 07ec75ca0 [SPARK-24062][THRIFT SERVER] Fix SASL encryption cannot be enabled issue in thrift server ## What changes were proposed in this pull request? For the details of the exception please see

spark git commit: [SPARK-24062][THRIFT SERVER] Fix SASL encryption cannot be enabled issue in thrift server

2018-04-25 Thread jshao
Repository: spark Updated Branches: refs/heads/master cd10f9df8 -> ffaf0f9fd [SPARK-24062][THRIFT SERVER] Fix SASL encryption cannot be enabled issue in thrift server ## What changes were proposed in this pull request? For the details of the exception please see

spark git commit: [SPARK-22319][CORE] call loginUserFromKeytab before accessing hdfs

2017-10-22 Thread jshao
Repository: spark Updated Branches: refs/heads/master ca2a780e7 -> 57accf6e3 [SPARK-22319][CORE] call loginUserFromKeytab before accessing hdfs In `SparkSubmit`, call `loginUserFromKeytab` before attempting to make RPC calls to the NameNode. I manually tested this patch by: 1. Confirming
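The ordering matters because any HDFS access before the keytab login runs without valid Kerberos credentials. A small sketch of the intended order, using Hadoop's UserGroupInformation (the helper name is illustrative):

```scala
import org.apache.hadoop.security.UserGroupInformation

// Log in from the keytab first, then perform work that issues RPCs to the NameNode
// (e.g. resolving globs or downloading remote dependencies).
def runWithKeytabLogin(principal: String, keytab: String)(body: => Unit): Unit = {
  UserGroupInformation.loginUserFromKeytab(principal, keytab)
  body
}
```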

spark git commit: [SPARK-22319][CORE][BACKPORT-2.2] call loginUserFromKeytab before accessing hdfs

2017-10-23 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.2 f8c83fdc5 -> bf8163f5b [SPARK-22319][CORE][BACKPORT-2.2] call loginUserFromKeytab before accessing hdfs In SparkSubmit, call loginUserFromKeytab before attempting to make RPC calls to the NameNode. Same as

spark git commit: [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url

2018-01-10 Thread jshao
Repository: spark Updated Branches: refs/heads/master 9b33dfc40 -> a6647ffbf [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url ## What changes were proposed in this pull request? The comparison of two filesystems does not consider the authority of the URI. This is
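In other words, two URIs only refer to the same filesystem when both the scheme and the authority (user info, host, port) match, not the scheme alone. A hedged sketch of such a comparison (helper name is illustrative):

```scala
import java.net.URI

// Two filesystems match only if scheme AND authority agree; a scheme-only check
// wrongly treats different clusters or accounts as the same filesystem.
def sameFileSystem(a: URI, b: URI): Boolean = {
  def norm(s: String) = Option(s).map(_.toLowerCase).getOrElse("")
  norm(a.getScheme) == norm(b.getScheme) && norm(a.getAuthority) == norm(b.getAuthority)
}

// e.g. hdfs://nn1:8020 and hdfs://nn2:8020 share a scheme but are different filesystems.
```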

spark git commit: [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url

2018-01-10 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 551ccfba5 -> 317b0aaed [SPARK-22587] Spark job fails if fs.defaultFS and application jar are different url ## What changes were proposed in this pull request? The comparison of two filesystems does not consider the authority of the URI. This is

spark git commit: [SPARK-22976][CORE] Cluster mode driver dir removed while running

2018-01-21 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 7520491bf -> 5781fa79e [SPARK-22976][CORE] Cluster mode driver dir removed while running ## What changes were proposed in this pull request? The clean-up logic on the worker previously determined the liveness of a particular

spark git commit: [SPARK-22976][CORE] Cluster mode driver dir removed while running

2018-01-21 Thread jshao
Repository: spark Updated Branches: refs/heads/master 602c6d82d -> 11daeb833 [SPARK-22976][CORE] Cluster mode driver dir removed while running ## What changes were proposed in this pull request? The clean-up logic on the worker previously determined the liveness of a particular application

spark git commit: [MINOR][DOC] Fix the path to the examples jar

2018-01-22 Thread jshao
Repository: spark Updated Branches: refs/heads/master ec2289761 -> 60175e959 [MINOR][DOC] Fix the path to the examples jar ## What changes were proposed in this pull request? The example jar file is now in ./examples/jars directory of Spark distribution. Author: Arseniy Tashoyan

spark git commit: [MINOR][DOC] Fix the path to the examples jar

2018-01-22 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 57c320a0d -> cf078a205 [MINOR][DOC] Fix the path to the examples jar ## What changes were proposed in this pull request? The example jar file is now in ./examples/jars directory of Spark distribution. Author: Arseniy Tashoyan

spark git commit: [SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore

2018-01-25 Thread jshao
Repository: spark Updated Branches: refs/heads/master 70a68b328 -> d1721816d [SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore ## What changes were proposed in this pull request? When using the Kubernetes cluster-manager and spawning a Streaming workload, it is important

spark git commit: [SPARK-23279][SS] Avoid triggering distributed job for Console sink

2018-01-30 Thread jshao
Repository: spark Updated Branches: refs/heads/master ca04c3ff2 -> 8c6a9c90a [SPARK-23279][SS] Avoid triggering distributed job for Console sink ## What changes were proposed in this pull request? Console sink will redistribute collected local data and trigger a distributed job in each

spark git commit: [SPARK-23279][SS] Avoid triggering distributed job for Console sink

2018-01-30 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 b8778321b -> ab5a51055 [SPARK-23279][SS] Avoid triggering distributed job for Console sink ## What changes were proposed in this pull request? Console sink will redistribute collected local data and trigger a distributed job in each

spark git commit: Revert "[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore"

2018-01-31 Thread jshao
Repository: spark Updated Branches: refs/heads/master b6b50efc8 -> 4b7cd479a Revert "[SPARK-23200] Reset Kubernetes-specific config on Checkpoint restore" This reverts commit d1721816d26bedee3c72eeb75db49da500568376. The patch is not fully tested and out-of-date. So revert it. Project:

spark git commit: [SPARK-23088][CORE] History server not showing incomplete/running applications

2018-01-29 Thread jshao
Repository: spark Updated Branches: refs/heads/master f235df66a -> 31bd1dab1 [SPARK-23088][CORE] History server not showing incomplete/running applications ## What changes were proposed in this pull request? History server not showing incomplete/running applications when

spark git commit: [SPARK-24948][SHS][BACKPORT-2.3] Delegate check access permissions to the file system

2018-08-07 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 136588e95 -> 9fb70f458 [SPARK-24948][SHS][BACKPORT-2.3] Delegate check access permissions to the file system ## What changes were proposed in this pull request? In `SparkHadoopUtil.checkAccessPermission`, we consider only basic

spark git commit: [SPARK-24948][SHS][BACKPORT-2.2] Delegate check access permissions to the file system

2018-08-07 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.2 a5624c7ae -> 53ac8504b [SPARK-24948][SHS][BACKPORT-2.2] Delegate check access permissions to the file system ## What changes were proposed in this pull request? In `SparkHadoopUtil.checkAccessPermission`, we consider only basic

[1/2] spark git commit: Preparing Spark release v2.3.2-rc5

2018-08-13 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 787790b3c -> 29a040361 Preparing Spark release v2.3.2-rc5 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4dc82259 Tree:

[spark] Git Push Summary

2018-08-13 Thread jshao
Repository: spark Updated Tags: refs/tags/v2.3.2-rc5 [created] 4dc82259d - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org

[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

2018-08-13 Thread jshao
Preparing development version 2.3.3-SNAPSHOT Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/29a04036 Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/29a04036 Diff:

svn commit: r28707 - in /dev/spark/v2.3.2-rc5-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2018-08-14 Thread jshao
Author: jshao Date: Tue Aug 14 06:54:52 2018 New Revision: 28707 Log: Apache Spark v2.3.2-rc5 docs [This commit notification would consist of 1446 parts, which exceeds the limit of 50 ones, so it was shortened to the summary

[spark] Git Push Summary

2018-08-09 Thread jshao
Repository: spark Updated Tags: refs/tags/v2.3.2-rc4 [created] 6930f4885 - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org

[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

2018-08-09 Thread jshao
Preparing development version 2.3.3-SNAPSHOT Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e66f3f9b Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e66f3f9b Diff:

[1/2] spark git commit: Preparing Spark release v2.3.2-rc4

2018-08-09 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 b426ec583 -> e66f3f9b1 Preparing Spark release v2.3.2-rc4 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/6930f488 Tree:

svn commit: r28647 - /dev/spark/v2.3.2-rc4-bin/

2018-08-09 Thread jshao
Author: jshao Date: Fri Aug 10 04:58:55 2018 New Revision: 28647 Log: Apache Spark v2.3.2-rc4 Added: dev/spark/v2.3.2-rc4-bin/ dev/spark/v2.3.2-rc4-bin/SparkR_2.3.2.tar.gz (with props) dev/spark/v2.3.2-rc4-bin/SparkR_2.3.2.tar.gz.asc dev/spark/v2.3.2-rc4-bin/SparkR_2.3.2

svn commit: r28649 - in /dev/spark/v2.3.2-rc4-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2018-08-09 Thread jshao
Author: jshao Date: Fri Aug 10 05:50:52 2018 New Revision: 28649 Log: Apache Spark v2.3.2-rc4 docs [This commit notification would consist of 1446 parts, which exceeds the limit of 50 ones, so it was shortened to the summary

svn commit: r28702 - /dev/spark/v2.3.2-rc5-bin/

2018-08-13 Thread jshao
Author: jshao Date: Tue Aug 14 04:02:50 2018 New Revision: 28702 Log: Apache Spark v2.3.2-rc5 Added: dev/spark/v2.3.2-rc5-bin/ dev/spark/v2.3.2-rc5-bin/SparkR_2.3.2.tar.gz (with props) dev/spark/v2.3.2-rc5-bin/SparkR_2.3.2.tar.gz.asc dev/spark/v2.3.2-rc5-bin/SparkR_2.3.2

spark git commit: [SPARK-24307][CORE] Support reading remote cached partitions > 2gb

2018-07-19 Thread jshao
Repository: spark Updated Branches: refs/heads/master 67e108daa -> 7e847646d [SPARK-24307][CORE] Support reading remote cached partitions > 2gb (1) Netty's ByteBuf cannot support data > 2gb. So to transfer data from a ChunkedByteBuffer over the network, we use a custom version of FileRegion

spark git commit: [SPARK-24195][CORE] Ignore the files with "local" scheme in SparkContext.addFile

2018-07-19 Thread jshao
Repository: spark Updated Branches: refs/heads/master 7e847646d -> 7db81ac8a [SPARK-24195][CORE] Ignore the files with "local" scheme in SparkContext.addFile ## What changes were proposed in this pull request? In Spark "local" scheme means resources are already on the driver/executor nodes,
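Conceptually the change is a scheme check in addFile: anything with the "local" scheme is assumed to already exist on every node and is skipped rather than copied to the file server. A tiny sketch of that check (helper name is illustrative):

```scala
import java.net.URI

// Paths with the "local" scheme are already present on the driver/executor nodes,
// so they should not be distributed again by SparkContext.addFile.
def isAlreadyLocal(path: String): Boolean =
  Option(new URI(path).getScheme).contains("local")
```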

svn commit: r28123 - in /dev/spark/v2.3.2-rc3-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2018-07-15 Thread jshao
Author: jshao Date: Sun Jul 15 07:30:18 2018 New Revision: 28123 Log: Apache Spark v2.3.2-rc3 docs [This commit notification would consist of 1446 parts, which exceeds the limit of 50 ones, so it was shortened to the summary

svn commit: r28118 - /dev/spark/v2.3.2-rc3-bin/

2018-07-14 Thread jshao
Author: jshao Date: Sun Jul 15 03:04:30 2018 New Revision: 28118 Log: Apache Spark v2.3.2-rc3 Added: dev/spark/v2.3.2-rc3-bin/ dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz (with props) dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2.tar.gz.asc dev/spark/v2.3.2-rc3-bin/SparkR_2.3.2

[1/2] spark git commit: Preparing Spark release v2.3.2-rc3

2018-07-14 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 9cf375f5b -> f9a2b0a87 Preparing Spark release v2.3.2-rc3 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/b3726dad Tree:

[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

2018-07-14 Thread jshao
Preparing development version 2.3.3-SNAPSHOT Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f9a2b0a8 Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f9a2b0a8 Diff:

[spark] Git Push Summary

2018-07-14 Thread jshao
Repository: spark Updated Tags: refs/tags/v2.3.2-rc3 [created] b3726dadc - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org

spark git commit: [SPARK-24594][YARN] Introducing metrics for YARN

2018-07-23 Thread jshao
Repository: spark Updated Branches: refs/heads/master cfc3e1aaa -> d2436a852 [SPARK-24594][YARN] Introducing metrics for YARN ## What changes were proposed in this pull request? In this PR metrics are introduced for YARN. As up to now there were no metrics in the YARN module, a new metric

spark git commit: [SPARK-24297][CORE] Fetch-to-disk by default for > 2gb

2018-07-24 Thread jshao
Repository: spark Updated Branches: refs/heads/master 3efdf3532 -> 15fff7903 [SPARK-24297][CORE] Fetch-to-disk by default for > 2gb Fetch-to-mem is guaranteed to fail if the message is bigger than 2 GB, so we might as well use fetch-to-disk in that case. The message includes some metadata in

spark git commit: [SPARK-25183][SQL] Spark HiveServer2 to use Spark ShutdownHookManager

2018-08-31 Thread jshao
Repository: spark Updated Branches: refs/heads/master aa70a0a1a -> 515708d5f [SPARK-25183][SQL] Spark HiveServer2 to use Spark ShutdownHookManager ## What changes were proposed in this pull request? Switch `org.apache.hive.service.server.HiveServer2` to register its shutdown callback with

spark git commit: [SPARK-24646][CORE] Minor change to spark.yarn.dist.forceDownloadSchemes to support wildcard '*'

2018-07-08 Thread jshao
Repository: spark Updated Branches: refs/heads/master 79c668942 -> e2c7e09f7 [SPARK-24646][CORE] Minor change to spark.yarn.dist.forceDownloadSchemes to support wildcard '*' ## What changes were proposed in this pull request? In the case of getting tokens via customized
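The wildcard simply means that every remote scheme should be downloaded to local disk before resources are handed to YARN. A hedged sketch of the matching rule (helper name is illustrative):

```scala
// A scheme must be downloaded locally if it is listed explicitly, or if the
// spark.yarn.dist.forceDownloadSchemes setting contains the wildcard '*'.
def mustForceDownload(scheme: String, forceDownloadSchemes: Seq[String]): Boolean =
  forceDownloadSchemes.contains("*") || forceDownloadSchemes.contains(scheme)
```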

spark git commit: [SPARK-24678][SPARK-STREAMING] Give priority in use of 'PROCESS_LOCAL' for spark-streaming

2018-07-10 Thread jshao
Repository: spark Updated Branches: refs/heads/master a28900956 -> 6fe32869c [SPARK-24678][SPARK-STREAMING] Give priority in use of 'PROCESS_LOCAL' for spark-streaming ## What changes were proposed in this pull request? Currently, `BlockRDD.getPreferredLocations` only gets the host info of

[1/2] spark git commit: Preparing Spark release v2.3.2-rc2

2018-07-10 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 19542f5de -> 86457a16d Preparing Spark release v2.3.2-rc2 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/307499e1 Tree:

[spark] Git Push Summary

2018-07-10 Thread jshao
Repository: spark Updated Tags: refs/tags/v2.3.2-rc2 [created] 307499e1a - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org

[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

2018-07-10 Thread jshao
Preparing development version 2.3.3-SNAPSHOT Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/86457a16 Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/86457a16 Diff:

[1/2] spark git commit: Preparing Spark release v2.3.2-rc1

2018-07-07 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 64c72b4de -> 72eb97ce9 Preparing Spark release v2.3.2-rc1 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4df06b45 Tree:

[spark] Git Push Summary

2018-07-07 Thread jshao
Repository: spark Updated Tags: refs/tags/v2.3.2-rc1 [created] 4df06b451 - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org

[2/2] spark git commit: Preparing development version 2.3.3-SNAPSHOT

2018-07-07 Thread jshao
Preparing development version 2.3.3-SNAPSHOT Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/72eb97ce Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/72eb97ce Diff:

svn commit: r27981 - /dev/spark/v2.3.2-rc1-bin/

2018-07-07 Thread jshao
Author: jshao Date: Sun Jul 8 04:41:12 2018 New Revision: 27981 Log: Apache Spark 2.3.2 Added: dev/spark/v2.3.2-rc1-bin/ dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz (with props) dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2.tar.gz.asc dev/spark/v2.3.2-rc1-bin/SparkR_2.3.2

svn commit: r27983 - in /dev/spark/v2.3.2-rc1-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _site/api/java/org/apache/spark

2018-07-08 Thread jshao
Author: jshao Date: Sun Jul 8 07:45:04 2018 New Revision: 27983 Log: Apache Spark v2.3.2-rc1 docs [This commit notification would consist of 1446 parts, which exceeds the limit of 50 ones, so it was shortened to the summary

spark git commit: [SPARK-23644][CORE][UI] Use absolute path for REST call in SHS

2018-03-16 Thread jshao
Repository: spark Updated Branches: refs/heads/master c2632edeb -> ca83526de [SPARK-23644][CORE][UI] Use absolute path for REST call in SHS ## What changes were proposed in this pull request? SHS uses a relative path for the REST API call to get the list of applications, which is a relative

spark git commit: [SPARK-23635][YARN] AM env variable should not overwrite same name env variable set through spark.executorEnv.

2018-03-16 Thread jshao
Repository: spark Updated Branches: refs/heads/master ca83526de -> c95200048 [SPARK-23635][YARN] AM env variable should not overwrite same name env variable set through spark.executorEnv. ## What changes were proposed in this pull request? In the current Spark on YARN code, AM always will

spark git commit: [MINOR][YARN] Add disable yarn.nodemanager.vmem-check-enabled option to memLimitExceededLogMessage

2018-03-07 Thread jshao
Repository: spark Updated Branches: refs/heads/master 4c587eb48 -> 04e71c316 [MINOR][YARN] Add disable yarn.nodemanager.vmem-check-enabled option to memLimitExceededLogMessage My Spark application sometimes throws `Container killed by YARN for exceeding memory limits`. Even though I increased

spark git commit: [SPARK-23956][YARN] Use effective RPC port in AM registration

2018-04-15 Thread jshao
Repository: spark Updated Branches: refs/heads/master 73f28530d -> c0964935d [SPARK-23956][YARN] Use effective RPC port in AM registration ## What changes were proposed in this pull request? We propose not to hard-code the RPC port in the AM registration. ## How was this patch tested?

spark git commit: [SPARK-24014][PYSPARK] Add onStreamingStarted method to StreamingListener

2018-04-18 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 130641102 -> 32bec6ca3 [SPARK-24014][PYSPARK] Add onStreamingStarted method to StreamingListener ## What changes were proposed in this pull request? The `StreamingListener` on the PySpark side seems to lack the `onStreamingStarted`

spark git commit: [SPARK-24014][PYSPARK] Add onStreamingStarted method to StreamingListener

2018-04-18 Thread jshao
Repository: spark Updated Branches: refs/heads/master 0c94e48bc -> 8bb0df2c6 [SPARK-24014][PYSPARK] Add onStreamingStarted method to StreamingListener ## What changes were proposed in this pull request? The `StreamingListener` on the PySpark side seems to lack the `onStreamingStarted` method.

spark git commit: [SPARK-23787][TESTS] Fix file download test in SparkSubmitSuite for Hadoop 2.9.

2018-03-26 Thread jshao
Repository: spark Updated Branches: refs/heads/master 087fb3142 -> eb48edf9c [SPARK-23787][TESTS] Fix file download test in SparkSubmitSuite for Hadoop 2.9. This particular test assumed that Hadoop libraries did not support http as a file system. Hadoop 2.9 does, so the test failed. The test

spark git commit: [SPARK-23743][SQL] Changed a comparison logic from containing 'slf4j' to starting with 'org.slf4j'

2018-03-30 Thread jshao
Repository: spark Updated Branches: refs/heads/master b34890119 -> df05fb63a [SPARK-23743][SQL] Changed a comparison logic from containing 'slf4j' to starting with 'org.slf4j' ## What changes were proposed in this pull request? isSharedClass returns whether some classes can/should be shared or

spark git commit: [SPARK-23361][YARN] Allow AM to restart after initial tokens expire.

2018-03-22 Thread jshao
Repository: spark Updated Branches: refs/heads/master b2edc30db -> 5fa438471 [SPARK-23361][YARN] Allow AM to restart after initial tokens expire. Currently, the Spark AM relies on the initial set of tokens created by the submission client to be able to talk to HDFS and other services that

spark git commit: [SPARK-23644][CORE][UI][BACKPORT-2.3] Use absolute path for REST call in SHS

2018-03-19 Thread jshao
Repository: spark Updated Branches: refs/heads/branch-2.3 5c1c03d08 -> 2f82c037d [SPARK-23644][CORE][UI][BACKPORT-2.3] Use absolute path for REST call in SHS ## What changes were proposed in this pull request? SHS is using a relative path for the REST API call to get the list of the

spark git commit: [SPARK-23708][CORE] Correct comment for function addShutDownHook in ShutdownHookManager

2018-03-18 Thread jshao
Repository: spark Updated Branches: refs/heads/master 61487b308 -> 745c8c090 [SPARK-23708][CORE] Correct comment for function addShutDownHook in ShutdownHookManager ## What changes were proposed in this pull request? Minor modification. The comment below is not right. ``` /** * Adds a shutdown

[spark] Git Push Summary

2018-09-25 Thread jshao
Repository: spark Updated Tags: refs/tags/v2.3.2 [created] 02b510728 - To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org

[45/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/cov.html -- diff --git a/site/docs/2.3.2/api/R/cov.html b/site/docs/2.3.2/api/R/cov.html new file mode 100644 index 000..ec96abb ---

spark-website git commit: Empty commit to trigger asf to github sync

2018-09-26 Thread jshao
Repository: spark-website Updated Branches: refs/heads/asf-site 04a27dbf1 -> 546f35143 Empty commit to trigger asf to github sync Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/546f3514 Tree:

[42/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/match.html -- diff --git a/site/docs/2.3.2/api/R/match.html b/site/docs/2.3.2/api/R/match.html new file mode 100644 index 000..d405b90

[20/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html -- diff --git a/site/docs/2.3.2/api/java/org/apache/spark/SimpleFutureAction.html

[01/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
Repository: spark-website Updated Branches: refs/heads/asf-site 806a1bd52 -> 04a27dbf1 http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/r/PairwiseRRDD.html --

[04/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/function/ForeachFunction.html -- diff --git

[06/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaSparkStatusTracker.html -- diff --git

[36/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/R/write.jdbc.html -- diff --git a/site/docs/2.3.2/api/R/write.jdbc.html b/site/docs/2.3.2/api/R/write.jdbc.html new file mode 100644 index

[33/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/constant-values.html -- diff --git a/site/docs/2.3.2/api/java/constant-values.html b/site/docs/2.3.2/api/java/constant-values.html new

[08/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html -- diff --git a/site/docs/2.3.2/api/java/org/apache/spark/api/java/JavaRDDLike.html

[15/51] [partial] spark-website git commit: Add docs for Spark 2.3.2

2018-09-26 Thread jshao
http://git-wip-us.apache.org/repos/asf/spark-website/blob/04a27dbf/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html -- diff --git a/site/docs/2.3.2/api/java/org/apache/spark/TaskCommitDenied.html
