spark git commit: [SPARK-22417][PYTHON][FOLLOWUP][BRANCH-2.2] Fix for createDataFrame from pandas.DataFrame with timestamp

2017-11-08 Thread ueshin
Repository: spark Updated Branches: refs/heads/branch-2.2 efaf73fcd -> 0e97c8eef [SPARK-22417][PYTHON][FOLLOWUP][BRANCH-2.2] Fix for createDataFrame from pandas.DataFrame with timestamp ## What changes were proposed in this pull request? This is a follow-up of #19646 for branch-2.2. The original …
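For context, a minimal PySpark sketch (not taken from the patch itself; it only exercises the code path the fix targets) of building a DataFrame from a pandas.DataFrame with a timestamp column, assuming a local SparkSession and pandas are available:

```python
# Illustrative only: the conversion path SPARK-22417 touches. Assumes a local
# SparkSession and pandas; column names are made up for the example.
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("pandas-ts-demo").getOrCreate()

pdf = pd.DataFrame({
    "id": [1, 2],
    "ts": pd.to_datetime(["2017-11-08 10:00:00", "2017-11-08 11:30:00"]),
})

# createDataFrame infers TimestampType from the pandas datetime64[ns] column.
sdf = spark.createDataFrame(pdf)
sdf.printSchema()
sdf.show(truncate=False)
```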

spark git commit: [SPARK-22222][CORE][TEST][FOLLOW-UP] Remove redundant and deprecated `Timeouts`

2017-11-08 Thread gurwls223
Repository: spark Updated Branches: refs/heads/master 695647bf2 -> 98be55c0f [SPARK-22222][CORE][TEST][FOLLOW-UP] Remove redundant and deprecated `Timeouts` ## What changes were proposed in this pull request? Since SPARK-21939, Apache Spark uses `TimeLimits` instead of the deprecated `Timeouts` …

spark git commit: [SPARK-22211][SQL][FOLLOWUP] Fix bad merge for tests

2017-11-08 Thread lixiao
Repository: spark Updated Branches: refs/heads/branch-2.2 73a2ca06b -> efaf73fcd [SPARK-22211][SQL][FOLLOWUP] Fix bad merge for tests ## What changes were proposed in this pull request? The merge of SPARK-22211 to branch-2.2 dropped a couple of important lines that made sure the tests that …

spark git commit: [SPARK-21640][SQL][PYTHON][R][FOLLOWUP] Add errorifexists in SparkR and other documentations

2017-11-08 Thread gurwls223
Repository: spark Updated Branches: refs/heads/master d01044233 -> 695647bf2 [SPARK-21640][SQL][PYTHON][R][FOLLOWUP] Add errorifexists in SparkR and other documentations ## What changes were proposed in this pull request? This PR proposes to add `errorifexists` to SparkR API and fix the rest …
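As a hedged sketch of the write mode this follow-up documents (assuming a SparkSession `spark`, a Spark version that accepts the `errorifexists` spelling, and a writable example path):

```python
# Illustrative only: "errorifexists" is the explicit spelling of the default
# save mode, which fails if the target path already exists. Path is an example.
df = spark.range(5)
df.write.mode("errorifexists").parquet("/tmp/errorifexists_demo")
# Writing to the same path again with this mode raises an AnalysisException.
```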

spark git commit: [SPARK-22456][SQL] Add support for dayofweek function

2017-11-08 Thread gurwls223
Repository: spark Updated Branches: refs/heads/master ee571d79e -> d01044233 [SPARK-22456][SQL] Add support for dayofweek function ## What changes were proposed in this pull request? This PR adds support for a new function called `dayofweek` that returns the day of the week of the given argument …
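A hedged usage sketch (assuming a SparkSession `spark` and a Spark version that ships the function): `dayofweek` returns the day of the week as an integer, 1 = Sunday through 7 = Saturday.

```python
# Illustrative only: dayofweek via SQL and via the DataFrame API.
from pyspark.sql import functions as F

spark.sql("SELECT dayofweek('2009-07-30') AS dow").show()  # 2009-07-30 is a Thursday -> 5

df = spark.createDataFrame([("2009-07-30",)], ["d"])
df.select(F.dayofweek(F.to_date("d")).alias("dow")).show()
```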

spark git commit: [SPARK-22466][SPARK SUBMIT] export SPARK_CONF_DIR while conf is default

2017-11-08 Thread gurwls223
Repository: spark Updated Branches: refs/heads/master 6447d7bc1 -> ee571d79e [SPARK-22466][SPARK SUBMIT] export SPARK_CONF_DIR while conf is default ## What changes were proposed in this pull request? We use SPARK_CONF_DIR to switch the Spark conf directory, and it can be visited if we explicitly …

spark git commit: [SPARK-22133][DOCS] Documentation for Mesos Reject Offer Configurations

2017-11-08 Thread srowen
Repository: spark Updated Branches: refs/heads/master 87343e155 -> 6447d7bc1 [SPARK-22133][DOCS] Documentation for Mesos Reject Offer Configurations ## What changes were proposed in this pull request? Documentation about Mesos Reject Offer Configurations ## Related PR https://github.com/apach…
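Since these reject-offer settings are ordinary Spark configuration keys for the Mesos backend, here is a hedged sketch of how they would be supplied (config names as documented for Mesos; the durations are arbitrary example values):

```python
# Illustrative only: the reject-offer settings are plain configs, so they can be
# passed like any other --conf entry or builder.config() call.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("mesos-reject-offer-demo")
         .config("spark.mesos.rejectOfferDuration", "120s")
         .config("spark.mesos.rejectOfferDurationForUnmetConstraints", "120s")
         .config("spark.mesos.rejectOfferDurationForReachedMaxCores", "120s")
         .getOrCreate())
```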

spark git commit: [SPARK-22446][SQL][ML] Declare StringIndexerModel indexer udf as nondeterministic

2017-11-08 Thread wenchen
Repository: spark Updated Branches: refs/heads/master 51debf8b1 -> 87343e155 [SPARK-22446][SQL][ML] Declare StringIndexerModel indexer udf as nondeterministic ## What changes were proposed in this pull request? UDFs that can cause runtime exceptions on invalid data are not safe to push down …
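The general principle, as a hedged PySpark sketch (the commit itself marks the internal Scala UDF; `asNondeterministic` requires a Spark version that includes it, and the lookup UDF below is hypothetical):

```python
# Illustrative only: marking a UDF nondeterministic keeps the optimizer from
# pushing it past scans/filters, so it never sees rows it was not meant to handle.
from pyspark.sql.functions import udf
from pyspark.sql.types import IntegerType

def fragile_lookup(label):
    vocab = {"a": 0, "b": 1}      # raises KeyError on unseen labels
    return vocab[label]

fragile_udf = udf(fragile_lookup, IntegerType()).asNondeterministic()
```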

spark git commit: [SPARK-14540][BUILD] Support Scala 2.12 closures and Java 8 lambdas in ClosureCleaner (step 0)

2017-11-08 Thread srowen
Repository: spark Updated Branches: refs/heads/master 11eea1a4c -> 51debf8b1 [SPARK-14540][BUILD] Support Scala 2.12 closures and Java 8 lambdas in ClosureCleaner (step 0) ## What changes were proposed in this pull request? Preliminary changes to get ClosureCleaner to work with Scala 2.12. …