[spark] branch master updated (3cc55f6 -> eb50996)

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 3cc55f6  [SPARK-29392][CORE][SQL][FOLLOWUP] More removal of 'foo Symbol syntax for Scala 2.13
 add eb50996  [SPARK-30211][INFRA] Use python3 in make-distribution.sh

No new revisions were added by this update.

Summary of changes:
 dev/make-distribution.sh | 2 +-
 docs/building-spark.md   | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)


[spark] branch master updated: [SPARK-29392][CORE][SQL][FOLLOWUP] More removal of 'foo Symbol syntax for Scala 2.13

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new 3cc55f6  [SPARK-29392][CORE][SQL][FOLLOWUP] More removal of 'foo Symbol syntax for Scala 2.13
3cc55f6 is described below

commit 3cc55f6a0a560782f6e20296ac716ef68a412d26
Author: Sean Owen 
AuthorDate: Tue Dec 10 19:41:24 2019 -0800

[SPARK-29392][CORE][SQL][FOLLOWUP] More removal of 'foo Symbol syntax for Scala 2.13

### What changes were proposed in this pull request?

Another continuation of https://github.com/apache/spark/pull/26748

### Why are the changes needed?

To cleanly cross compile with Scala 2.13.

### Does this PR introduce any user-facing change?

None.

### How was this patch tested?

Existing tests

Closes #26842 from srowen/SPARK-29392.4.

Authored-by: Sean Owen 
Signed-off-by: Dongjoon Hyun 
---
 .../spark/sql/catalyst/analysis/DSLHintSuite.scala |   6 +-
 .../analysis/ExpressionTypeCheckingSuite.scala | 193 +++--
 .../spark/sql/DataFrameWindowFramesSuite.scala |   4 +-
 .../spark/sql/DataFrameWindowFunctionsSuite.scala  |  22 +--
 .../org/apache/spark/sql/DatasetCacheSuite.scala   |  18 +-
 .../scala/org/apache/spark/sql/DatasetSuite.scala  |  14 +-
 .../spark/sql/DynamicPartitionPruningSuite.scala   |   5 +-
 .../apache/spark/sql/GeneratorFunctionSuite.scala  |  37 ++--
 .../scala/org/apache/spark/sql/JoinHintSuite.scala |  22 +--
 9 files changed, 168 insertions(+), 153 deletions(-)

diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/DSLHintSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/DSLHintSuite.scala
index 388eb23..c316e04 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/DSLHintSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/DSLHintSuite.scala
@@ -22,9 +22,9 @@ import org.apache.spark.sql.catalyst.dsl.plans._
 import org.apache.spark.sql.catalyst.plans.logical._
 
 class DSLHintSuite extends AnalysisTest {
-  lazy val a = 'a.int
-  lazy val b = 'b.string
-  lazy val c = 'c.string
+  lazy val a = Symbol("a").int
+  lazy val b = Symbol("b").string
+  lazy val c = Symbol("c").string
   lazy val r1 = LocalRelation(a, b, c)
 
   test("various hint parameters") {
diff --git a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExpressionTypeCheckingSuite.scala b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExpressionTypeCheckingSuite.scala
index c83759e..f944b4a 100644
--- a/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExpressionTypeCheckingSuite.scala
+++ b/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ExpressionTypeCheckingSuite.scala
@@ -29,12 +29,12 @@ import org.apache.spark.sql.types._
 class ExpressionTypeCheckingSuite extends SparkFunSuite {
 
   val testRelation = LocalRelation(
-    'intField.int,
-    'stringField.string,
-    'booleanField.boolean,
-    'decimalField.decimal(8, 0),
-    'arrayField.array(StringType),
-    'mapField.map(StringType, LongType))
+    Symbol("intField").int,
+    Symbol("stringField").string,
+    Symbol("booleanField").boolean,
+    Symbol("decimalField").decimal(8, 0),
+    Symbol("arrayField").array(StringType),
+    Symbol("mapField").map(StringType, LongType))
 
   def assertError(expr: Expression, errorMessage: String): Unit = {
     val e = intercept[AnalysisException] {
@@ -56,83 +56,92 @@ class ExpressionTypeCheckingSuite extends SparkFunSuite {
   }
 
   test("check types for unary arithmetic") {
-    assertError(BitwiseNot('stringField), "requires integral type")
+    assertError(BitwiseNot(Symbol("stringField")), "requires integral type")
   }
 
   test("check types for binary arithmetic") {
     // We will cast String to Double for binary arithmetic
-    assertSuccess(Add('intField, 'stringField))
-    assertSuccess(Subtract('intField, 'stringField))
-    assertSuccess(Multiply('intField, 'stringField))
-    assertSuccess(Divide('intField, 'stringField))
-    assertSuccess(Remainder('intField, 'stringField))
-    // checkAnalysis(BitwiseAnd('intField, 'stringField))
-
-    assertErrorForDifferingTypes(Add('intField, 'booleanField))
-    assertErrorForDifferingTypes(Subtract('intField, 'booleanField))
-    assertErrorForDifferingTypes(Multiply('intField, 'booleanField))
-    assertErrorForDifferingTypes(Divide('intField, 'booleanField))
-    assertErrorForDifferingTypes(Remainder('intField, 'booleanField))
-    assertErrorForDifferingTypes(BitwiseAnd('intField, 'booleanField))
-    assertErrorForDifferingTypes(BitwiseOr('intField, 'booleanField))
-    assertErrorForDifferingTypes(BitwiseXor('intField, 'booleanField))
-
-    assertError(Add('booleanField, 

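For context, a minimal sketch (not part of the patch) of the migration the diff above applies: Scala 2.13 deprecates the single-quote symbol-literal syntax, so each `'name` in the test DSL becomes an explicit `Symbol("name")`, which denotes the same interned value:

```scala
object SymbolMigrationSketch extends App {
  // 'intField: symbol-literal syntax, deprecated in Scala 2.13 (and removed in Scala 3)
  val before = 'intField
  // Symbol("intField"): the explicit constructor the patch switches to
  val after = Symbol("intField")

  // Symbols are interned by name, so the two forms are interchangeable in the DSL
  assert(before == after)
  println(after.name) // prints: intField
}
```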
[spark] branch branch-2.4 updated (0884766 -> 74d8cf1)

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 0884766  [SPARK-30163][INFRA][FOLLOWUP] Make `.m2` directory for cold start without cache
 add 74d8cf1  [SPARK-29152][2.4][CORE] Executor Plugin shutdown when dynamic allocation is enabled

No new revisions were added by this update.

Summary of changes:
 .../scala/org/apache/spark/executor/Executor.scala | 39 +-
 1 file changed, 23 insertions(+), 16 deletions(-)





[spark] branch master updated (cfd7ca9 -> d7843dd)

2019-12-10 Thread vanzin
This is an automated email from the ASF dual-hosted git repository.

vanzin pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from cfd7ca9  Revert "[SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer"
 add d7843dd  [SPARK-29152][CORE] Executor Plugin shutdown when dynamic allocation is enabled

No new revisions were added by this update.

Summary of changes:
 .../scala/org/apache/spark/executor/Executor.scala | 47 +-
 1 file changed, 27 insertions(+), 20 deletions(-)


[spark] branch master updated: Revert "[SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer"

2019-12-10 Thread zsxwing
This is an automated email from the ASF dual-hosted git repository.

zsxwing pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new cfd7ca9  Revert "[SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer"
cfd7ca9 is described below

commit cfd7ca9a06161f7622b5179a777f965c11892afa
Author: Shixiong Zhu 
AuthorDate: Tue Dec 10 11:21:46 2019 -0800

Revert "[SPARK-21869][SS] Apply Apache Commons Pool to Kafka producer"

This reverts commit 3641c3dd69b2bd2beae028d52356450cc41f69ed.
---
 .../spark/sql/kafka010/CachedKafkaProducer.scala   | 118 +++-
 .../sql/kafka010/InternalKafkaConnectorPool.scala  | 210 -
 .../sql/kafka010/InternalKafkaConsumerPool.scala   | 210 ++---
 .../sql/kafka010/InternalKafkaProducerPool.scala   |  68 ---
 .../spark/sql/kafka010/KafkaDataConsumer.scala |   7 +-
 .../spark/sql/kafka010/KafkaDataWriter.scala   |  34 +---
 .../apache/spark/sql/kafka010/KafkaWriteTask.scala |  20 +-
 .../org/apache/spark/sql/kafka010/package.scala|  34 +---
 .../sql/kafka010/CachedKafkaProducerSuite.scala| 154 ---
 scala => InternalKafkaConsumerPoolSuite.scala} |   8 +-
 .../sql/kafka010/KafkaDataConsumerSuite.scala  |   6 +-
 .../org/apache/spark/sql/kafka010/KafkaTest.scala  |  10 +-
 .../kafka010/KafkaDataConsumerSuite.scala  |   7 +
 13 files changed, 332 insertions(+), 554 deletions(-)

diff --git 
a/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
 
b/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
index 907440a..fc177cd 100644
--- 
a/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
+++ 
b/external/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/CachedKafkaProducer.scala
@@ -18,68 +18,60 @@
 package org.apache.spark.sql.kafka010
 
 import java.{util => ju}
-import java.io.Closeable
+import java.util.concurrent.{ConcurrentMap, ExecutionException, TimeUnit}
 
+import com.google.common.cache._
+import com.google.common.util.concurrent.{ExecutionError, UncheckedExecutionException}
+import org.apache.kafka.clients.producer.KafkaProducer
 import scala.collection.JavaConverters._
 import scala.util.control.NonFatal
 
-import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord}
-
 import org.apache.spark.SparkEnv
 import org.apache.spark.internal.Logging
 import org.apache.spark.kafka010.{KafkaConfigUpdater, KafkaRedactionUtil}
-import org.apache.spark.sql.kafka010.InternalKafkaProducerPool._
-import org.apache.spark.util.ShutdownHookManager
 
-private[kafka010] class CachedKafkaProducer(val kafkaParams: ju.Map[String, Object])
-  extends Closeable with Logging {
+private[kafka010] object CachedKafkaProducer extends Logging {
 
   private type Producer = KafkaProducer[Array[Byte], Array[Byte]]
 
-  private val producer = createProducer()
+  private val defaultCacheExpireTimeout = TimeUnit.MINUTES.toMillis(10)
 
-  private def createProducer(): Producer = {
-    val producer: Producer = new Producer(kafkaParams)
-    if (log.isDebugEnabled()) {
-      val redactedParamsSeq = KafkaRedactionUtil.redactParams(toCacheKey(kafkaParams))
-      logDebug(s"Created a new instance of kafka producer for $redactedParamsSeq.")
+  private lazy val cacheExpireTimeout: Long = Option(SparkEnv.get)
+    .map(_.conf.get(PRODUCER_CACHE_TIMEOUT))
+    .getOrElse(defaultCacheExpireTimeout)
+
+  private val cacheLoader = new CacheLoader[Seq[(String, Object)], Producer] {
+    override def load(config: Seq[(String, Object)]): Producer = {
+      createKafkaProducer(config)
     }
-    producer
   }
 
-  override def close(): Unit = {
-    try {
-      if (log.isInfoEnabled()) {
-        val redactedParamsSeq = KafkaRedactionUtil.redactParams(toCacheKey(kafkaParams))
-        logInfo(s"Closing the KafkaProducer with params: ${redactedParamsSeq.mkString("\n")}.")
+  private val removalListener = new RemovalListener[Seq[(String, Object)], Producer]() {
+    override def onRemoval(
+        notification: RemovalNotification[Seq[(String, Object)], Producer]): Unit = {
+      val paramsSeq: Seq[(String, Object)] = notification.getKey
+      val producer: Producer = notification.getValue
+      if (log.isDebugEnabled()) {
+        val redactedParamsSeq = KafkaRedactionUtil.redactParams(paramsSeq)
+        logDebug(s"Evicting kafka producer $producer params: $redactedParamsSeq, " +
+          s"due to ${notification.getCause}")
       }
-      producer.close()
-    } catch {
-      case NonFatal(e) => logWarning("Error while closing kafka producer.", e)
+      close(paramsSeq, producer)
     }
   }
 
-  def send(record: ProducerRecord[Array[Byte], Array[Byte]], callback: Callback): Unit = {
-    producer.send(record, callback)
-  }
-

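The restored code above keeps producers in a Guava `LoadingCache` keyed by their Kafka parameters: a `CacheLoader` builds a producer on a cache miss, and a `RemovalListener` closes evicted instances once they have been idle past the timeout. A minimal, self-contained sketch of that caching pattern (string keys and values stand in for the real config and `KafkaProducer`; all names here are illustrative, not from the patch):

```scala
import java.util.concurrent.TimeUnit

import com.google.common.cache.{CacheBuilder, CacheLoader, LoadingCache, RemovalListener, RemovalNotification}

object ProducerCacheSketch extends App {
  // Build a value on a cache miss (the real code constructs a KafkaProducer here).
  val loader = new CacheLoader[String, String] {
    override def load(key: String): String = s"producer-for-$key"
  }

  // Clean up evicted entries (the real code closes the evicted producer).
  val listener = new RemovalListener[String, String] {
    override def onRemoval(n: RemovalNotification[String, String]): Unit =
      println(s"evicting ${n.getValue} due to ${n.getCause}")
  }

  val cache: LoadingCache[String, String] = CacheBuilder.newBuilder()
    .expireAfterAccess(10, TimeUnit.MINUTES) // mirrors the 10-minute default expiry above
    .removalListener(listener)
    .build(loader)

  println(cache.get("brokers=localhost:9092")) // miss: loader runs; later gets hit the cache
}
```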
[spark] branch master updated: [SPARK-29976][CORE] Trigger speculation for stages with too few tasks

2019-12-10 Thread tgraves
This is an automated email from the ASF dual-hosted git repository.

tgraves pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
 new ad238a2  [SPARK-29976][CORE] Trigger speculation for stages with too few tasks
ad238a2 is described below

commit ad238a2238a9d0da89be4424574436cbfaee579d
Author: Yuchen Huo 
AuthorDate: Tue Dec 10 14:43:26 2019 -0600

[SPARK-29976][CORE] Trigger speculation for stages with too few tasks

### What changes were proposed in this pull request?
This PR adds an optional spark conf for speculation to allow speculative runs for stages where there are only a few tasks.
```
spark.speculation.task.duration.threshold
```

If provided, tasks would be speculatively run if the TaskSet contains fewer tasks than the number of slots on a single executor and the task is taking longer than the threshold.

### Why are the changes needed?
This change helps avoid scenarios where a single executor could hang forever due to a disk issue, and we unfortunately assigned the only task in a TaskSet to that executor, causing the whole job to hang forever.

### Does this PR introduce any user-facing change?
Yes. If the new config `spark.speculation.task.duration.threshold` is provided, and the TaskSet contains fewer tasks than the number of slots on a single executor, and a task is taking longer than the threshold, then speculative tasks are submitted for the running tasks in the TaskSet.

### How was this patch tested?
Unit tests are added to TaskSetManagerSuite.

Closes #26614 from yuchenhuo/SPARK-29976.

Authored-by: Yuchen Huo 
Signed-off-by: Thomas Graves 
---
 .../org/apache/spark/internal/config/package.scala | 12 +++
 .../apache/spark/scheduler/TaskSetManager.scala| 60 +
 .../spark/scheduler/TaskSetManagerSuite.scala  | 98 ++
 docs/configuration.md  | 13 +++
 4 files changed, 167 insertions(+), 16 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/internal/config/package.scala b/core/src/main/scala/org/apache/spark/internal/config/package.scala
index 25dc4c6..9d7b31a 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/package.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/package.scala
@@ -1499,6 +1499,18 @@ package object config {
       .doubleConf
       .createWithDefault(0.75)
 
+  private[spark] val SPECULATION_TASK_DURATION_THRESHOLD =
+    ConfigBuilder("spark.speculation.task.duration.threshold")
+      .doc("Task duration after which scheduler would try to speculative run the task. If " +
+        "provided, tasks would be speculatively run if current stage contains less tasks " +
+        "than or equal to the number of slots on a single executor and the task is taking " +
+        "longer time than the threshold. This config helps speculate stage with very few " +
+        "tasks. Regular speculation configs may also apply if the executor slots are " +
+        "large enough. E.g. tasks might be re-launched if there are enough successful runs " +
+        "even though the threshold hasn't been reached.")
+      .timeConf(TimeUnit.MILLISECONDS)
+      .createOptional
+
   private[spark] val STAGING_DIR = ConfigBuilder("spark.yarn.stagingDir")
     .doc("Staging directory used while submitting applications.")
     .stringConf
diff --git a/core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala b/core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
index 5c0bc49..e026e90 100644
--- a/core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
+++ b/core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala
@@ -81,6 +81,13 @@ private[spark] class TaskSetManager(
   val speculationQuantile = conf.get(SPECULATION_QUANTILE)
   val speculationMultiplier = conf.get(SPECULATION_MULTIPLIER)
   val minFinishedForSpeculation = math.max((speculationQuantile * numTasks).floor.toInt, 1)
+  // User provided threshold for speculation regardless of whether the quantile has been reached
+  val speculationTaskDurationThresOpt = conf.get(SPECULATION_TASK_DURATION_THRESHOLD)
+  // SPARK-29976: Only when the total number of tasks in the stage is less than or equal to the
+  // number of slots on a single executor, would the task manager speculative run the tasks if
+  // their duration is longer than the given threshold. In this way, we wouldn't speculate too
+  // aggressively but still handle basic cases.
+  val speculationTasksLessEqToSlots = numTasks <= (conf.get(EXECUTOR_CORES) / sched.CPUS_PER_TASK)
 
   // For each task, tracks whether a copy of the task has succeeded. A task will also be
   // marked as "succeeded" if it failed with a fetch failure, in which case it should not
@@ -958,14 

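Taken together, the new threshold only applies to small stages (where the task count is at most the number of slots on a single executor). A hedged sketch of how a user might enable it, with illustrative values not taken from the patch:

```scala
import org.apache.spark.SparkConf

object SpeculationConfSketch extends App {
  val conf = new SparkConf()
    .setAppName("speculation-demo")
    .set("spark.speculation", "true")                        // duration-based speculation still requires speculation on
    .set("spark.speculation.task.duration.threshold", "45s") // new in SPARK-29976; accepts a time string
    .set("spark.executor.cores", "4")                        // with 1 CPU per task, the threshold applies when numTasks <= 4

  // Print the effective settings for inspection.
  conf.getAll.foreach { case (k, v) => println(s"$k=$v") }
}
```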
[spark] branch master updated (8f0eb7d -> aec1d95)

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 8f0eb7d  [SPARK-29587][SQL] Support SQL Standard type real as float(4) numeric as decimal
 add aec1d95  [SPARK-30205][PYSPARK] Import ABCs from collections.abc to remove deprecation warnings

No new revisions were added by this update.

Summary of changes:
 python/pyspark/resultiterable.py | 8 ++--
 1 file changed, 6 insertions(+), 2 deletions(-)





[spark] branch master updated (24c4ce1 -> 8f0eb7d)

2019-12-10 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 24c4ce1  [SPARK-28351][SQL][FOLLOWUP] Remove 'DELETE FROM' from unsupportedHiveNativeCommands
 add 8f0eb7d  [SPARK-29587][SQL] Support SQL Standard type real as float(4) numeric as decimal

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/catalyst/parser/AstBuilder.scala |  9 
 .../sql-tests/inputs/show-create-table.sql |  5 
 .../sql-tests/results/show-create-table.sql.out| 27 +-
 3 files changed, 36 insertions(+), 5 deletions(-)





[spark] branch master updated (6103cf1 -> 24c4ce1)

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 6103cf1  [SPARK-30200][SQL] Add ExplainMode for Dataset.explain
 add 24c4ce1  [SPARK-28351][SQL][FOLLOWUP] Remove 'DELETE FROM' from unsupportedHiveNativeCommands

No new revisions were added by this update.

Summary of changes:
 .../src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4  | 1 -
 1 file changed, 1 deletion(-)





[spark] branch master updated (d9b3069 -> 6103cf1)

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from d9b3069  [SPARK-30125][SQL] Remove PostgreSQL dialect
 add 6103cf1  [SPARK-30200][SQL] Add ExplainMode for Dataset.explain

No new revisions were added by this update.

Summary of changes:
 .../java/org/apache/spark/sql/ExplainMode.java | 64 +
 .../main/scala/org/apache/spark/sql/Dataset.scala  | 40 ++---
 .../spark/sql/execution/command/commands.scala |  7 ++-
 .../scala/org/apache/spark/sql/ExplainSuite.scala  | 67 ++
 4 files changed, 156 insertions(+), 22 deletions(-)
 create mode 100644 sql/core/src/main/java/org/apache/spark/sql/ExplainMode.java

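This commit adds a public `ExplainMode` and lets `Dataset.explain` take a mode name. A small usage sketch against the resulting Spark 3.0 API, assuming a local session (the DataFrame and mode choices here are illustrative):

```scala
import org.apache.spark.sql.SparkSession

object ExplainModeDemo extends App {
  val spark = SparkSession.builder().master("local[*]").appName("explain-demo").getOrCreate()
  import spark.implicits._

  val df = Seq((1, "a"), (2, "b")).toDF("id", "name").filter($"id" > 1)

  df.explain()            // default: physical plan only ("simple")
  df.explain("extended")  // parsed, analyzed, optimized, and physical plans
  df.explain("formatted") // plan outline plus a per-node details section

  spark.stop()
}
```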




[spark] branch master updated (a9f1809 -> d9b3069)

2019-12-10 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from a9f1809  [SPARK-30206][SQL] Rename normalizeFilters in DataSourceStrategy to be generic
 add d9b3069  [SPARK-30125][SQL] Remove PostgreSQL dialect

No new revisions were added by this update.

Summary of changes:
 docs/sql-keywords.md   |  4 +-
 .../spark/sql/catalyst/analysis/Analyzer.scala |  7 +-
 .../sql/catalyst/analysis/PostgreSQLDialect.scala  | 49 
 .../spark/sql/catalyst/analysis/TypeCoercion.scala | 12 +--
 .../spark/sql/catalyst/expressions/Cast.scala  | 14 ++--
 .../sql/catalyst/expressions/arithmetic.scala  |  2 +-
 .../postgreSQL/PostgreCastToBoolean.scala  | 83 
 .../spark/sql/catalyst/parser/ParseDriver.scala| 12 +--
 .../spark/sql/catalyst/util/StringUtils.scala  |  2 +-
 .../sql/catalyst/util/postgreSQL/StringUtils.scala | 33 
 .../org/apache/spark/sql/internal/SQLConf.scala| 47 +++-
 .../sql/catalyst/analysis/TypeCoercionSuite.scala  | 27 +--
 .../catalyst/encoders/ExpressionEncoderSuite.scala |  2 +-
 .../sql/catalyst/encoders/RowEncoderSuite.scala|  4 +-
 .../expressions/ArithmeticExpressionSuite.scala| 24 +++---
 .../spark/sql/catalyst/expressions/CastSuite.scala | 22 ++
 .../expressions/DecimalExpressionSuite.scala   |  4 +-
 .../sql/catalyst/expressions/ScalaUDFSuite.scala   |  4 +-
 .../expressions/postgreSQL/CastSuite.scala | 73 --
 .../catalyst/parser/ExpressionParserSuite.scala| 10 +--
 .../parser/TableIdentifierParserSuite.scala|  2 +-
 .../sql-tests/inputs/postgreSQL/boolean.sql|  1 -
 .../resources/sql-tests/results/datetime.sql.out   |  0
 .../sql-tests/results/postgreSQL/boolean.sql.out   | 66 +++-
 .../sql-tests/results/postgreSQL/case.sql.out  | 18 ++---
 .../sql-tests/results/postgreSQL/date.sql.out  | 88 +++---
 .../sql-tests/results/postgreSQL/int2.sql.out  | 24 +++---
 .../sql-tests/results/postgreSQL/int4.sql.out  | 32 
 .../sql-tests/results/postgreSQL/int8.sql.out  | 78 +--
 .../results/postgreSQL/select_implicit.sql.out | 55 --
 .../results/postgreSQL/window_part1.sql.out|  2 +-
 .../results/udf/postgreSQL/udf-case.sql.out| 18 ++---
 .../udf/postgreSQL/udf-select_implicit.sql.out | 55 --
 .../org/apache/spark/sql/DataFrameSuite.scala  |  2 +-
 .../spark/sql/PostgreSQLDialectQuerySuite.scala| 42 ---
 .../org/apache/spark/sql/SQLQueryTestSuite.scala   |  4 +-
 .../thriftserver/ThriftServerQueryTestSuite.scala  | 10 +--
 37 files changed, 286 insertions(+), 646 deletions(-)
 delete mode 100644 sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/PostgreSQLDialect.scala
 delete mode 100644 sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/postgreSQL/PostgreCastToBoolean.scala
 delete mode 100644 sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/postgreSQL/StringUtils.scala
 delete mode 100644 sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/postgreSQL/CastSuite.scala
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/datetime.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/postgreSQL/date.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/postgreSQL/int2.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/postgreSQL/int4.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/postgreSQL/int8.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/postgreSQL/select_implicit.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/postgreSQL/window_part1.sql.out
 mode change 100644 => 100755 sql/core/src/test/resources/sql-tests/results/udf/postgreSQL/udf-case.sql.out
 delete mode 100644 sql/core/src/test/scala/org/apache/spark/sql/PostgreSQLDialectQuerySuite.scala


[spark] branch master updated (1cac9b2 -> a9f1809)

2019-12-10 Thread dongjoon
This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from 1cac9b2  [SPARK-29967][ML][PYTHON] KMeans support instance weighting
 add a9f1809  [SPARK-30206][SQL] Rename normalizeFilters in DataSourceStrategy to be generic

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/execution/datasources/DataSourceStrategy.scala | 12 ++--
 .../spark/sql/execution/datasources/FileSourceStrategy.scala |  2 +-
 .../execution/datasources/PruneFileSourcePartitions.scala|  2 +-
 .../sql/execution/datasources/v2/DataSourceV2Strategy.scala  |  2 +-
 .../execution/datasources/v2/V2ScanRelationPushDown.scala|  2 +-
 .../sql/execution/datasources/DataSourceStrategySuite.scala  |  2 +-
 6 files changed, 11 insertions(+), 11 deletions(-)


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



[spark] branch master updated (aa9da93 -> 1cac9b2)

2019-12-10 Thread srowen
This is an automated email from the ASF dual-hosted git repository.

srowen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from aa9da93  [SPARK-30151][SQL] Issue better error message when user-specified schema mismatched
 add 1cac9b2  [SPARK-29967][ML][PYTHON] KMeans support instance weighting

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/ml/clustering/KMeans.scala|  34 +++-
 .../org/apache/spark/ml/util/Instrumentation.scala |   5 +
 .../spark/mllib/clustering/DistanceMeasure.scala   |  20 +-
 .../org/apache/spark/mllib/clustering/KMeans.scala |  65 +++---
 .../apache/spark/ml/clustering/KMeansSuite.scala   | 226 -
 python/pyspark/ml/clustering.py|  27 ++-
 6 files changed, 332 insertions(+), 45 deletions(-)

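SPARK-29967 adds a weight column parameter to `KMeans` in both the Scala and Python APIs, so each row pulls the centroids in proportion to its weight. A hedged usage sketch against the resulting 3.0 API (toy data; column names and values are illustrative):

```scala
import org.apache.spark.ml.clustering.KMeans
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

object WeightedKMeansDemo extends App {
  val spark = SparkSession.builder().master("local[*]").appName("weighted-kmeans").getOrCreate()

  // Toy data: (features, weight); heavier rows pull their cluster's centroid toward them.
  val df = spark.createDataFrame(Seq(
    (Vectors.dense(0.0, 0.0), 1.0),
    (Vectors.dense(0.1, 0.1), 1.0),
    (Vectors.dense(9.0, 9.0), 5.0)
  )).toDF("features", "weight")

  val model = new KMeans()
    .setK(2)
    .setWeightCol("weight") // added by SPARK-29967
    .fit(df)

  model.clusterCenters.foreach(println)
  spark.stop()
}
```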

svn commit: r37163 - /dev/spark/KEYS

2019-12-10 Thread yumwang
Author: yumwang
Date: Tue Dec 10 13:00:01 2019
New Revision: 37163

Log:
Update KEYS

Modified:
dev/spark/KEYS

Modified: dev/spark/KEYS
==
--- dev/spark/KEYS (original)
+++ dev/spark/KEYS Tue Dec 10 13:00:01 2019
@@ -1107,3 +1107,63 @@ fWOysS0CT3qM2hNCa15MH+xsu6Lq56lqzCC5UhR+
 kR7loYvuYi9fxvlaW0kc3Bd10JCHCEmobHno5Gflr9I=
 =l4ux
 -----END PGP PUBLIC KEY BLOCK-----
+
+pub   rsa4096 2019-12-10 [SC] [expires: 2020-12-09]
+  DBD447010C1B4F7DAD3F7DFD6E1B4122F6A3A338
+uid   [ultimate] Yuming Wang 
+sig 3        6E1B4122F6A3A338 2019-12-10  Yuming Wang 
+sub   rsa4096 2019-12-10 [E] [expires: 2020-12-09]
+sig  6E1B4122F6A3A338 2019-12-10  Yuming Wang 
+
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBF3viwgBEADtgThNIck1NzNibLbvIZqlB5EVI/S5CrmPGFA5NftCQJxPerCu
+TDpQoo147FWJkpHU6mMnnfMIT6lkm7hWWhbeQJF+HKCp8Bxdf8IcG/EygnpL47v+
+ih2CMK3CSm0vxpqoO5uZpA++BKTxLyFIx+GaiZvX26+q1b4w/kfVHiIn1a9fM8UU
+Y85D4kEBfM+eFaJLV2nrzEmULwC9sLhTKCih8AzKfdeuXiUnkJcpvuBal0PW5A0x
+tva0vH489iP5fYsr3RJKn3NFbAicbx6ylJji0/IwEcO8/Gcdr7E4Qn/4/gwIdod6
+bnav5nBz6JJ3sHvG4r8TR67qQAPA2XJiUxwaTWLWXEjmYq/boM8ufGvdYJF0JGLc
+RrYJgphsWzNhtBSAS0qkczX3Rz4Fg5ReKHXCm3U1pcjvDqm/1QE2goKicGdm3xxw
+xl4pISddCpuWQzFnm4sugXkTEQWEh4hfLmz/mR6S3cSRRiGNyXPh8Z4iXWYAdpV5
+fWCrt+o7TRUj8htYO1rjInxXgRRyjv18CajyAzXk9L/0tBE0n5pZqLK0d8lmNABE
+QFKSWnyu5iFTbYxWL3jGz1bDry9gSweWCKHM8q8TCPHLLPCn/ImtSNLUm3IoxRcs
+OCX6U9pdH+I4EUP+QJm7BNoXQhPrQV7lXKPYmq/c4JwmQXAi7i4YtuIPBQARAQAB
+tCBZdW1pbmcgV2FuZyA8eXVtd2FuZ0BhcGFjaGUub3JnPokCVAQTAQoAPhYhBNvU
+RwEMG099rT99/W4bQSL2o6M4BQJd74sIAhsDBQkB4TOABQsJCAcCBhUKCQgLAgQW
+AgMBAh4BAheAAAoJEG4bQSL2o6M472sP/0BeMWASPXPwpF1tTnvqWhhR1f/qQLVv
+iQUTFVXaQESeFTyJ9yzXpXkMJKEVVwRoOgfTEeqI2atUXR/wvSfkPHNV7ydaxIxr
+E766rrmrBbeTmpirEmK9+XYoEXbZC1ZgVZE7xIOMp5oI1VARHzLIQGqATnIW2WnR
+aEsV6Kllc+ByFOlrJnLEe8y77BeIkYL3aEkOfQHWuLeK1Ydv+dUw4jWCMxHEEC8p
+qBTsf6eHU9garB8ua8tDsjliFUXd510D9p5fNGCV/qC/QPh6CiXFFBD7O7q0F8sL
+sKyPivbHxwyR/G/sprh+k65lvQaR6CmEcILTDcuoEUjXQwR7IeH8MhmXVUKyvIGP
+rzor1Apmgrf4h6Rzn2TQ96UaoVp/jD8Vramd02U9clORAhXvlZL21FoR8xLv5v5m
+HY8MPuipxq6FYMDrdU4kERK+VQh77YIbfbQYQPzEt7vmQ5m2KsNOho3bCcK6Kgw0
+cesEzBlHp+Bu2E4mgKPjZa5GEHpzUVFMuPC4pllBIE9VCnvM8Y6jJV/+Mg2XToxs
+QqMVYqTmPodB58BIl0l2ff+C9kFc0Mq8H66UsYiUrAcTkRCKVb517prZBW7HUgRt
+gRIf/0lpIJv1Pt6HVSusM/GSspDV+9HesPrGzqAhADC90N4VDokvM1vTslQ5o6TE
+/yUYe1eLiD4nuQINBF3viwgBEADItGF/p66P1GfWTlop1N6ggU2RD68Er1KRnDo+
+6eHhPDYHtUbiIQsftcZd1BOKroUqJATgCxuUzM4n6/Qs6g/fpLsfeU8eIO0CBEky
+1sz/SuNIdgyyUIfDRuyoC983F6l/plgLqfcvJaK5fz3BzmBfxLmIxrLgUHUkb7pH
+HwXTLK5TT6DZ8FV2p5BAZvDHmvyTdReIUEk9QkJMzhXTe1irBjuBZFdyVeVgnOsr
+DuGpRxDvisDKhAKZBA83PwywbVS+aw34l0BphFzspZ3DQsIvIprKdCd5Y8N6nnvb
+PC49Ev4ypHBFw5aMgWo4LGS7qUk12ljuJUV0US5RCAEI8jh8MQbZHge17MHsKdJi
+YsbRkW1bKJDb+k3vkucJSX7Q3iGzMzUY+xhI+TpjBCToz/g1mdKd3OYfw074F8Bc
+K295lj9qeYqHCUG5Y4PAg9NRYfqgZLI37mCGK9+uo8FsTlop3rrUU8WMbQUY+yYH
+Tuvn276+920HahGUyWn9ATCvLTxL9q/Q17KbPnChcJ3CE1gopfXmoOxyhw/tjdDK
+ISc6CLFvjaAFzUAT4eS6+FrXGP2XldjDDfcVtxhNYc+Jmkb1u7LDpQeUDvfB7n8r
+LH04EnH5VKw1QqP6YS056heixsWGznvbXT/MZFFAH2ZsRhX1I/KeP5t+gBkhnkUH
+OvXmwQARAQABiQI8BBgBCgAmFiEE29RHAQwbT32tP339bhtBIvajozgFAl3viwgC
+GwwFCQHhM4AACgkQbhtBIvajoziephAArLP4FcLuijAKMnrJdSko+jImsOa/Iuyb
+ei1Ta63n18keydqf2sbfLzPz2eypk41kAZeRRno1rL8AqiU/JDvSSNkm0mq+Zxik
+hYQamw+YqQl4w9AqeDnNKgNb8YErMBKf9frtkMuSV4C4mUyAZY6n15OAr6wcsAsp
+HdnwOtvbNq0zoaPrvGJPIhb6YUC8TMiaE3ALbNcnMK8gSytbZf0ov4HJZRtgGujT
+ShdkUTaM5m1ZUIf4jnjL20AomlzSePYBw1gMGgtr+O9SzmJrQTkd15Qg8i16rd6H
+hx7j6oOcKQLhzVYE+rqOjcaGupvxMnSyDyPGdkxCG/RTSR1uXzfnUT04BqZD3KQj
+WMdcesSwS9itCcZfEJrTxxEJ9IX7vYaiwXhAneGc1Bv1Mupy+RuMujGNnWkEWG8b
+s/c8UkWhYTkyWnO+gVqqlzV4+6NXYJ9GQyqADmiT831spKyDx+au9FtNMufWYFJE
+coQLhF8xp5GrQfXFiHC2lrtV/6/3wAR55nXw8Dagh+OANDZ8Yk1mCCNXSal6MN25
+rMA+YcuC9o2K7dKjVv3KinQ2Tiv4TVxyTjcyZurgi4W3D6WM++w+xvSeB8BoCRnD
+8ENQ+33/HFsAaMH5SVQCim4ZpjaqiblklGMweH4zxSpsRb3qFzhknVIithDB57fc
+0TbepIdiQlc=
+=wdlY
+-----END PGP PUBLIC KEY BLOCK-----






[spark] branch master updated (be867e8 -> aa9da93)

2019-12-10 Thread wenchen
This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git.


from be867e8  [SPARK-30196][BUILD] Bump lz4-java version to 1.7.0
 add aa9da93  [SPARK-30151][SQL] Issue better error message when user-specified schema mismatched

No new revisions were added by this update.

Summary of changes:
 .../org/apache/spark/sql/execution/datasources/DataSource.scala| 7 ++-
 .../test/scala/org/apache/spark/sql/sources/TableScanSuite.scala   | 5 +++--
 .../org/apache/spark/sql/test/DataFrameReaderWriterSuite.scala | 5 ++---
 3 files changed, 11 insertions(+), 6 deletions(-)

