spark git commit: [SPARK-9119] [SPARK-8359] [SQL] match Decimal.precision/scale with DecimalType

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master d34548587 -> 781c8d71a [SPARK-9119] [SPARK-8359] [SQL] match Decimal.precision/scale with DecimalType Let Decimal carry the correct precision and scale with DecimalType. cc rxin yhuai Author: Davies Liu dav...@databricks.com Closes #7925

spark git commit: [SPARK-9119] [SPARK-8359] [SQL] match Decimal.precision/scale with DecimalType

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 28bb97730 -> 864d5de6d [SPARK-9119] [SPARK-8359] [SQL] match Decimal.precision/scale with DecimalType Let Decimal carry the correct precision and scale with DecimalType. cc rxin yhuai Author: Davies Liu dav...@databricks.com Closes
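The commit above ties a Decimal value to the precision and scale declared by its DecimalType. A minimal stand-alone sketch of that idea, using hypothetical simplified classes rather than Spark's actual `Decimal` implementation:

```scala
import java.math.{BigDecimal => JBigDecimal, RoundingMode}

// Hypothetical simplified types, not Spark's actual implementation.
case class DecimalType(precision: Int, scale: Int)

case class Decimal(value: JBigDecimal, precision: Int, scale: Int)

object Decimal {
  // Rescale the value to the type's scale and verify it still fits
  // within the declared precision, so the carried metadata is correct.
  def fromType(v: JBigDecimal, dt: DecimalType): Decimal = {
    val scaled = v.setScale(dt.scale, RoundingMode.HALF_UP)
    require(scaled.precision <= dt.precision,
      s"$v does not fit DecimalType(${dt.precision},${dt.scale})")
    Decimal(scaled, dt.precision, dt.scale)
  }
}

val d = Decimal.fromType(new JBigDecimal("12.3456"), DecimalType(6, 2))
// d carries value 12.35 together with precision 6 and scale 2
```

The point of the change is that the carried precision/scale always match the declared type, instead of whatever the underlying BigDecimal happened to have.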

[1/2] spark git commit: Add a prerequisites section for building docs

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 864d5de6d -> b6e8446a4 Add a prerequisites section for building docs This puts all the install commands that need to be run in one section instead of being spread over many paragraphs cc rxin Author: Shivaram Venkataraman

[2/2] spark git commit: Update docs/README.md to put all prereqs together.

2015-08-05 Thread rxin
Update docs/README.md to put all prereqs together. This pull request groups all the prereq requirements into a single section. cc srowen shivaram Author: Reynold Xin r...@databricks.com Closes #7951 from rxin/readme-docs and squashes the following commits: ab7ded0 [Reynold Xin] Updated

spark git commit: [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/branch-1.5 b6e8446a4 -> ea23e54ff [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers This PR is the second one in the larger issue of making the Kinesis integration reliable and provide WAL-free at-least

spark git commit: [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/master 781c8d71a -> c2a71f071 [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers This PR is the second one in the larger issue of making the Kinesis integration reliable and provide WAL-free at-least once

spark git commit: [SPARK-9581][SQL] Add unit test for JSON UDT

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master c2a71f071 -> 1d1a76c8c [SPARK-9581][SQL] Add unit test for JSON UDT This brings #7416 up-to-date by drubbo. Author: Emiliano Leporati emiliano.lepor...@gmail.com Author: Reynold Xin r...@databricks.com Closes #7917 from rxin/udt-json-test

spark git commit: [SPARK-9581][SQL] Add unit test for JSON UDT

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 ea23e54ff -> 57596fb8c [SPARK-9581][SQL] Add unit test for JSON UDT This brings #7416 up-to-date by drubbo. Author: Emiliano Leporati emiliano.lepor...@gmail.com Author: Reynold Xin r...@databricks.com Closes #7917 from

spark git commit: Closes #7917

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 1d1a76c8c -> d8ef538e5 Closes #7917 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/d8ef538e Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/d8ef538e Diff:

spark git commit: [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master d8ef538e5 -> 6d8a6e416 [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort The current implementation of UnsafeExternalSort uses NoOpPrefixComparator for binary-typed data. So, we need to add

spark git commit: [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 57596fb8c -> 7fa419535 [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort The current implementation of UnsafeExternalSort uses NoOpPrefixComparator for binary-typed data. So, we need to add
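The idea behind a binary prefix comparator can be sketched as follows (an illustrative stand-alone version, not the actual PrefixComparators code): pack the first eight bytes of the value into a long, big-endian, and compare prefixes as unsigned 64-bit integers so the result agrees with lexicographic byte order on those bytes.

```scala
// Derive an 8-byte sort prefix from binary data. Big-endian packing means
// unsigned comparison of the longs matches lexicographic byte order.
def binaryPrefix(bytes: Array[Byte]): Long = {
  var prefix = 0L
  var i = 0
  while (i < 8 && i < bytes.length) {
    prefix |= (bytes(i) & 0xffL) << (56 - 8 * i) // first byte is most significant
    i += 1
  }
  prefix
}

// Compare prefixes as unsigned 64-bit values.
def comparePrefix(a: Long, b: Long): Int =
  java.lang.Long.compareUnsigned(a, b)

val p1 = binaryPrefix(Array[Byte](0x01, 0x02))
val p2 = binaryPrefix(Array[Byte](0xFF.toByte))
// comparePrefix(p1, p2) < 0, matching lexicographic byte order
```

A prefix like this lets the sorter resolve most comparisons on the cheap long prefix and fall back to full record comparison only on ties.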

spark git commit: [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/master 6d8a6e416 -> 1bf608b5e [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc Author: Namit Katariya katariya.na...@gmail.com Closes #7935 from namitk/SPARK-9601 and squashes the

spark git commit: [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/branch-1.5 7fa419535 -> 6306019ff [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc Author: Namit Katariya katariya.na...@gmail.com Closes #7935 from namitk/SPARK-9601 and squashes the

[1/2] spark git commit: [SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 1bf608b5e -> 1b0317f64 http://git-wip-us.apache.org/repos/asf/spark/blob/1b0317f6/sql/core/src/test/scala/org/apache/spark/sql/ui/SQLListenerSuite.scala -- diff --git

[2/2] spark git commit: [SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab

2015-08-05 Thread rxin
[SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab This PR includes the following changes: ### SPARK-8862: Add basic instrumentation to each SparkPlan operator A SparkPlan can override `def accumulators: Map[String, Accumulator[_]]` to
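The instrumentation hook quoted above can be sketched with plain Scala. `Accumulator` here is a toy stand-in for Spark's class, and `FilterOp` is a hypothetical operator, not code from the PR:

```scala
// Toy stand-in for Spark's Accumulator: a named, mutable metric.
class Accumulator[T](var value: T)(add: (T, T) => T) {
  def +=(v: T): Unit = value = add(value, v)
}

trait SparkPlanLike {
  // Operators override this to expose their metrics by name,
  // mirroring `def accumulators: Map[String, Accumulator[_]]`.
  def accumulators: Map[String, Accumulator[Long]] = Map.empty
}

// Hypothetical operator that counts the rows it emits.
class FilterOp extends SparkPlanLike {
  private val numOutputRows = new Accumulator[Long](0L)(_ + _)
  override def accumulators = Map("numOutputRows" -> numOutputRows)

  def execute(rows: Seq[Int], pred: Int => Boolean): Seq[Int] = {
    val out = rows.filter(pred)
    numOutputRows += out.length.toLong
    out
  }
}

val op = new FilterOp
op.execute(1 to 10, _ % 2 == 0)
// op.accumulators("numOutputRows").value is now 5
```

A UI tab (the SQL tab the PR adds) can then render these per-operator metrics after execution.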

spark git commit: [SPARK-9628][SQL] Rename int to SQLDate, long to SQLTimestamp for better readability

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 ebc3aad27 -> f288cca3d [SPARK-9628][SQL] Rename int to SQLDate, long to SQLTimestamp for better readability JIRA: https://issues.apache.org/jira/browse/SPARK-9628 Author: Yijie Shen henry.yijies...@gmail.com Closes #7953 from

spark git commit: [SPARK-9628][SQL] Rename int to SQLDate, long to SQLTimestamp for better readability

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 1b0317f64 -> 84ca3183b [SPARK-9628][SQL] Rename int to SQLDate, long to SQLTimestamp for better readability JIRA: https://issues.apache.org/jira/browse/SPARK-9628 Author: Yijie Shen henry.yijies...@gmail.com Closes #7953 from

spark git commit: [HOTFIX] Add static import to fix build break from #7676.

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/master 84ca3183b -> 26b06f1c4 [HOTFIX] Add static import to fix build break from #7676. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/26b06f1c Tree:

spark git commit: [HOTFIX] Add static import to fix build break from #7676.

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/branch-1.5 f288cca3d -> 93c166a91 [HOTFIX] Add static import to fix build break from #7676. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/93c166a9 Tree:

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/master 26b06f1c4 -> e27a8c4cb [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be shut down

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.5 93c166a91 -> 350006497 [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.3 cd5d1be6e -> 384793dff [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.4 dea04bf84 -> 369510c5a [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be

spark git commit: [SPARK-9593] [SQL] Fixes Hadoop shims loading

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master e27a8c4cb -> 70112ff22 [SPARK-9593] [SQL] Fixes Hadoop shims loading This PR is used to workaround CDH Hadoop versions like 2.0.0-mr1-cdh4.1.1. Internally, Hive `ShimLoader` tries to load different versions of Hadoop shims by checking

spark git commit: [SPARK-9618] [SQL] Use the specified schema when reading Parquet files

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master 70112ff22 -> eb8bfa3ea [SPARK-9618] [SQL] Use the specified schema when reading Parquet files The user specified schema is currently ignored when loading Parquet files. One workaround is to use the `format` and `load` methods instead of

spark git commit: [SPARK-9381] [SQL] Migrate JSON data source to the new partitioning data source

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master eb8bfa3ea -> 519cf6d3f [SPARK-9381] [SQL] Migrate JSON data source to the new partitioning data source Support partitioning for the JSON data source. Still 2 open issues for the `HadoopFsRelation` - `refresh()` will invoke the

spark git commit: [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark.

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.5 350006497 -> eedb996dd [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark. mengxr This adds the `BlockMatrix` to PySpark. I have the conversions to `IndexedRowMatrix` and `CoordinateMatrix` ready as well, so once PR #7554 is

spark git commit: [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark.

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/master 519cf6d3f -> 34dcf1010 [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark. mengxr This adds the `BlockMatrix` to PySpark. I have the conversions to `IndexedRowMatrix` and `CoordinateMatrix` ready as well, so once PR #7554 is

spark git commit: [SPARK-9141] [SQL] Remove project collapsing from DataFrame API

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 34dcf1010 -> 23d982204 [SPARK-9141] [SQL] Remove project collapsing from DataFrame API Currently we collapse successive projections that are added by `withColumn`. However, this optimization violates the constraint that adding nodes to a

spark git commit: [SPARK-9141] [SQL] Remove project collapsing from DataFrame API

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 eedb996dd -> 125827a4f [SPARK-9141] [SQL] Remove project collapsing from DataFrame API Currently we collapse successive projections that are added by `withColumn`. However, this optimization violates the constraint that adding nodes

spark git commit: [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed

2015-08-05 Thread vanzin
Repository: spark Updated Branches: refs/heads/master 23d982204 -> 7a969a696 [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed Currently, when we kill an application on YARN, sc.stop() will be called from the YARN application state monitor thread, then in

spark git commit: [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed

2015-08-05 Thread vanzin
Repository: spark Updated Branches: refs/heads/branch-1.5 125827a4f -> 03bcf627d [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed Currently, when we kill an application on YARN, sc.stop() will be called from the YARN application state monitor thread, then in

spark git commit: [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 7a969a696 -> 1f8c364b9 [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920 This is a follow-up of https://github.com/apache/spark/pull/7920 to fix comments. Author: Yin Huai yh...@databricks.com Closes #7964 from

spark git commit: [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 03bcf627d -> 19018d542 [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920 This is a follow-up of https://github.com/apache/spark/pull/7920 to fix comments. Author: Yin Huai yh...@databricks.com Closes #7964 from

spark git commit: [SPARK-9403] [SQL] Add codegen support in In and InSet

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master 1f8c364b9 -> e1e05873f [SPARK-9403] [SQL] Add codegen support in In and InSet This continues tarekauel's work in #7778. Author: Liang-Chi Hsieh vii...@appier.com Author: Tarek Auel tarek.a...@googlemail.com Closes #7893 from

spark git commit: [SPARK-9403] [SQL] Add codegen support in In and InSet

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 19018d542 -> b8136d7e0 [SPARK-9403] [SQL] Add codegen support in In and InSet This continues tarekauel's work in #7778. Author: Liang-Chi Hsieh vii...@appier.com Author: Tarek Auel tarek.a...@googlemail.com Closes #7893 from

spark git commit: Closes #7778 since it is done as #7893.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master e1e05873f -> eb5b8f4a6 Closes #7778 since it is done as #7893. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/eb5b8f4a Tree:

spark git commit: [SPARK-9649] Fix flaky test MasterSuite - randomize ports

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master eb5b8f4a6 -> 5f0fb6466 [SPARK-9649] Fix flaky test MasterSuite - randomize ports ``` Error Message Failed to bind to: /127.0.0.1:7093: Service 'sparkMaster' failed after 16 retries! Stacktrace java.net.BindException: Failed to bind

spark git commit: [SPARK-9649] Fix flaky test MasterSuite - randomize ports

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 b8136d7e0 -> 05cbf133d [SPARK-9649] Fix flaky test MasterSuite - randomize ports ``` Error Message Failed to bind to: /127.0.0.1:7093: Service 'sparkMaster' failed after 16 retries! Stacktrace java.net.BindException: Failed to
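The flakiness above comes from hard-coding a port such as 7093, which fails with a BindException whenever the port is already in use. A common way to randomize the port (a general sketch of the technique, not the suite's actual code) is to bind to port 0 and let the OS pick a free ephemeral port:

```scala
import java.net.ServerSocket

// Ask the OS for a currently-free ephemeral port by binding to port 0,
// then release it so the test can bind its own service there.
def freePort(): Int = {
  val socket = new ServerSocket(0) // port 0 = OS chooses a free port
  try socket.getLocalPort
  finally socket.close()
}

val port = freePort()
// `port` was free at the time of the call; a small race remains between
// closing the probe socket and the test re-binding the port.
```

Randomizing per test run means parallel builds on the same machine no longer collide on a fixed port.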

spark git commit: Closes #7474 since it's marked as won't fix.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 5f0fb6466 -> f9c2a2af1 Closes #7474 since it's marked as won't fix. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f9c2a2af Tree:

spark git commit: [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 eb2229ac0 -> f24cd8cb9 [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap This PR has the following three small fixes. 1. UnsafeKVExternalSorter does not use 0 as the initialSize to create an

spark git commit: [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 4399b7b09 -> 4581badbc [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap This PR has the following three small fixes. 1. UnsafeKVExternalSorter does not use 0 as the initialSize to create an

spark git commit: [SPARK-9674][SQL] Remove GeneratedAggregate.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 119b59053 -> 9270bd06f [SPARK-9674][SQL] Remove GeneratedAggregate. The new aggregate replaces the old GeneratedAggregate. Author: Reynold Xin r...@databricks.com Closes #7983 from rxin/remove-generated-agg and squashes the following

spark git commit: [SPARK-6923] [SPARK-7550] [SQL] Persists data source relations in Hive compatible format when possible

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master 4581badbc -> 119b59053 [SPARK-6923] [SPARK-7550] [SQL] Persists data source relations in Hive compatible format when possible This PR is a fork of PR #5733 authored by chenghao-intel. For committers who are going to merge this PR, please

spark git commit: [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 9270bd06f -> d5a9af323 [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction. https://issues.apache.org/jira/browse/SPARK-9664 Author: Yin Huai yh...@databricks.com Closes #7982 from yhuai/udafRegister

spark git commit: [SPARK-9674][SQL] Remove GeneratedAggregate.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 f24cd8cb9 -> 252eb6193 [SPARK-9674][SQL] Remove GeneratedAggregate. The new aggregate replaces the old GeneratedAggregate. Author: Reynold Xin r...@databricks.com Closes #7983 from rxin/remove-generated-agg and squashes the following

spark git commit: [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 252eb6193 -> 29ace3bbf [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction. https://issues.apache.org/jira/browse/SPARK-9664 Author: Yin Huai yh...@databricks.com Closes #7982 from

spark git commit: [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/branch-1.5 30e9fcfb3 -> 618dc63e7 [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ This patch renames `RowOrdering` to `InterpretedOrdering` and updates SortMergeJoin to use the `SparkPlan` methods for

spark git commit: [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/master dac090d1e -> 9c878923d [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ This patch renames `RowOrdering` to `InterpretedOrdering` and updates SortMergeJoin to use the `SparkPlan` methods for

spark git commit: [SPARK-5895] [ML] Add VectorSlicer - updated

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/master 9c878923d -> a018b8571 [SPARK-5895] [ML] Add VectorSlicer - updated Add VectorSlicer transformer to spark.ml, with features specified as either indices or names. Transfers feature attributes for selected features. Updated version of
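Selecting features by index, as VectorSlicer does, can be illustrated with a plain-Scala sketch (a hypothetical helper over arrays, not the spark.ml API, which also supports selection by feature name):

```scala
// Select a subset of features from a dense feature vector by index.
def slice(features: Array[Double], indices: Array[Int]): Array[Double] =
  indices.map(features(_))

val v = Array(1.0, 2.0, 3.0, 4.0)
slice(v, Array(0, 2)) // Array(1.0, 3.0)
```

The real transformer additionally carries the selected features' attributes (names, ML metadata) into the output column.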

spark git commit: [SPARK-9657] Fix return type of getMaxPatternLength

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/master f9c2a2af1 -> dac090d1e [SPARK-9657] Fix return type of getMaxPatternLength mengxr Author: Feynman Liang fli...@databricks.com Closes #7974 from feynmanliang/SPARK-9657 and squashes the following commits: 7ca533f [Feynman Liang] Fix

spark git commit: [SPARK-9657] Fix return type of getMaxPatternLength

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.5 05cbf133d -> 30e9fcfb3 [SPARK-9657] Fix return type of getMaxPatternLength mengxr Author: Feynman Liang fli...@databricks.com Closes #7974 from feynmanliang/SPARK-9657 and squashes the following commits: 7ca533f [Feynman Liang] Fix

spark git commit: [SPARK-5895] [ML] Add VectorSlicer - updated

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.5 618dc63e7 -> 3b617e87c [SPARK-5895] [ML] Add VectorSlicer - updated Add VectorSlicer transformer to spark.ml, with features specified as either indices or names. Transfers feature attributes for selected features. Updated version of

spark git commit: [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master a018b8571 -> 8c320e45b [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings JIRA: https://issues.apache.org/jira/browse/SPARK-6591 Author: Yijie Shen henry.yijies...@gmail.com Closes #7926 from

spark git commit: [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 3b617e87c -> 5f037b3dc [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings JIRA: https://issues.apache.org/jira/browse/SPARK-6591 Author: Yijie Shen henry.yijies...@gmail.com Closes #7926

spark git commit: [SPARK-9651] Fix UnsafeExternalSorterSuite.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 8c320e45b -> 4399b7b09 [SPARK-9651] Fix UnsafeExternalSorterSuite. First, it's probably a bad idea to call generated Scala methods from Java. In this case, the method being called wasn't Utils.createTempDir(), but actually the