GitHub user deeppark opened a pull request:
https://github.com/apache/spark/pull/19455
Branch 2.0
## What changes were proposed in this pull request?
(Please fill in changes proposed in this fix)
## How was this patch tested?
(Please explain how this patch was tested. E.g. unit tests, integration
tests, manual tests)
(If this patch involves UI changes, please attach a screenshot; otherwise,
remove this)
Please review http://spark.apache.org/contributing.html before opening a
pull request.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/apache/spark branch-2.0
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/19455.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #19455
commit 5ec3e6680a091883369c002ae599d6b03f38c863
Author: Ergin Seyfe
Date: 2016-10-11T19:51:08Z
[SPARK-17816][CORE][BRANCH-2.0] Fix ConcurrentModificationException issue
in BlockStatusesAccumulator
## What changes were proposed in this pull request?
Replaced `BlockStatusesAccumulator` with `CollectionAccumulator`, which is
thread-safe, along with a few more cleanups.
## How was this patch tested?
Tested in the master branch and cherry-picked.
Author: Ergin Seyfe
Closes #15425 from seyfe/race_cond_jsonprotocal_branch-2.0.
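The race this commit fixes is the classic one of mutating an unsynchronized collection from multiple task threads. A minimal standalone sketch of the remedy, using `Collections.synchronizedList` in the same spirit as Spark's `CollectionAccumulator` (the class names and counts below are illustrative, not Spark code):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SyncListDemo {
    // Runs 4 threads that each append 1000 elements to a shared list.
    static int run() throws InterruptedException {
        // Thread-safe wrapper; a bare ArrayList here could lose updates or
        // throw ConcurrentModificationException under concurrent mutation.
        List<Integer> values = Collections.synchronizedList(new ArrayList<>());

        Thread[] threads = new Thread[4];
        for (int t = 0; t < threads.length; t++) {
            threads[t] = new Thread(() -> {
                for (int i = 0; i < 1000; i++) values.add(i);
            });
            threads[t].start();
        }
        for (Thread th : threads) th.join();
        return values.size();
    }

    public static void main(String[] args) throws InterruptedException {
        // All 4000 adds are retained thanks to the synchronized wrapper.
        System.out.println(run()); // 4000
    }
}
```

The synchronized wrapper serializes every `add`, which is the same guarantee `CollectionAccumulator` provides to concurrently completing tasks.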
commit e68e95e947045704d3e6a36bb31e104a99d3adcc
Author: Alexander Pivovarov
Date: 2016-10-12T05:31:21Z
Fix hadoop.version in building-spark.md
A couple of the mvn build examples use `-Dhadoop.version=VERSION` instead of
an actual version number.
Author: Alexander Pivovarov
Closes #15440 from apivovarov/patch-1.
(cherry picked from commit 299eb04ba05038c7dbb3ecf74a35d4bbfa456643)
Signed-off-by: Reynold Xin
commit f3d82b53c42a971deedc04de6950b9228e5262ea
Author: Kousuke Saruta
Date: 2016-10-12T05:36:57Z
[SPARK-17880][DOC] The url linking to `AccumulatorV2` in the document is
incorrect.
## What changes were proposed in this pull request?
In `programming-guide.md`, the URL that links to `AccumulatorV2` is
`api/scala/index.html#org.apache.spark.AccumulatorV2`, but the correct URL is
`api/scala/index.html#org.apache.spark.util.AccumulatorV2`.
## How was this patch tested?
Manual test.
Author: Kousuke Saruta
Closes #15439 from sarutak/SPARK-17880.
(cherry picked from commit b512f04f8e546843d5a3f35dcc6b675b5f4f5bc0)
Signed-off-by: Reynold Xin
commit f12b74c02eec9e201fec8a16dac1f8e549c1b4f0
Author: cody koeninger
Date: 2016-10-12T07:40:47Z
[SPARK-17853][STREAMING][KAFKA][DOC] make it clear that reusing group.id is
bad
## What changes were proposed in this pull request?
Documentation fix to make it clear that reusing a group.id across different
streams is unsafe, just as it is with the underlying Kafka consumer.
## How was this patch tested?
Built the Jekyll docs and verified the rendered output.
Author: cody koeninger
Closes #15442 from koeninger/SPARK-17853.
(cherry picked from commit c264ef9b1918256a5018c7a42a1a2b42308ea3f7)
Signed-off-by: Reynold Xin
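The guidance in that commit can be sketched as a helper that derives a distinct group.id per stream. This is an illustrative sketch, not Spark code: the param keys match standard Kafka consumer configuration, while the `myapp-` prefix and `paramsFor` helper are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

public class KafkaGroupIds {
    // Base Kafka consumer params shared by all streams (values illustrative).
    static Map<String, String> baseParams() {
        Map<String, String> p = new HashMap<>();
        p.put("bootstrap.servers", "localhost:9092");
        p.put("key.deserializer",
              "org.apache.kafka.common.serialization.StringDeserializer");
        p.put("value.deserializer",
              "org.apache.kafka.common.serialization.StringDeserializer");
        return p;
    }

    // Each stream gets its own group.id; if two streams shared one, the
    // brokers would treat them as a single consumer group and split the
    // topic's records between them instead of delivering all data to each.
    static Map<String, String> paramsFor(String streamName) {
        Map<String, String> p = baseParams();
        p.put("group.id", "myapp-" + streamName);
        return p;
    }

    public static void main(String[] args) {
        System.out.println(paramsFor("clicks").get("group.id")); // myapp-clicks
    }
}
```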
commit 4dcbde48de6c46e2fd8ccfec732b8ff5c24f97a4
Author: Bryan Cutler
Date: 2016-10-11T06:29:52Z
[SPARK-17808][PYSPARK] Upgraded version of Pyrolite to 4.13
## What changes were proposed in this pull request?
Upgraded to a newer version of Pyrolite, which supports serialization of a
BinaryType StructField for PySpark SQL.
## How was this patch tested?
Added a unit test that fails with a ValueError under the previous version,
Pyrolite 4.9, with Python 3.
Author: Bryan Cutler
Closes #15386 from BryanCutler/pyrolite-upgrade-SPARK-17808.
(cherry picked from commit 658c7147f5bf637f36e8c66b9207d94b1e7c74c5)
Signed-off-by: Sean Owen
commit 5451541d1113aa75bab80914ca51a913f6ba4753
Author: prigarg
Date: 2016-10-12T17:14:45Z
[SPARK-17884][SQL] Resolve NullPointerException when casting from an empty
string to the interval type.
## What changes were proposed in this pull request?
This change adds a check to the castToInterval method of the Cast expression
so that, if the converted value is null, the isNull variable is set to true.
Previously, the expression Cast(Literal(), CalendarIntervalType) threw a
NullPointerException for this reason.
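The shape of the guard can be illustrated with a standalone sketch. Here `fromString` is a hypothetical stand-in for Spark's interval parser, which returns null for unparseable input such as the empty string; the method names are not the actual Spark internals.

```java
public class IntervalCastSketch {
    // Stand-in for the interval parser: returns null when the input
    // (such as the empty string) cannot be parsed as an interval.
    static String fromString(String s) {
        return (s == null || s.trim().isEmpty()) ? null : s.trim();
    }

    // Sketch of the guard: when parsing yields null, report isNull = true
    // instead of letting downstream code dereference the null value.
    static boolean isNullAfterCast(String s) {
        String v = fromString(s);
        return v == null;
    }

    public static void main(String[] args) {
        System.out.println(isNullAfterCast(""));      // true
        System.out.println(isNullAfterCast("1 day")); // false
    }
}
```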
## How was