svn commit: r23752 - in /dev/spark/2.3.0-SNAPSHOT-2017_12_15_20_01-0c8fca4-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2017-12-15 Thread pwendell
Author: pwendell
Date: Sat Dec 16 04:14:41 2017
New Revision: 23752

Log:
Apache Spark 2.3.0-SNAPSHOT-2017_12_15_20_01-0c8fca4 docs


[This commit notification would consist of 1414 parts, which exceeds the limit of 50, so it was shortened to this summary.]

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [SPARK-22811][PYSPARK][ML] Fix pyspark.ml.tests failure when Hive is not available.

2017-12-15 Thread gurwls223
Repository: spark
Updated Branches:
  refs/heads/master 46776234a -> 0c8fca460


[SPARK-22811][PYSPARK][ML] Fix pyspark.ml.tests failure when Hive is not available.

## What changes were proposed in this pull request?

pyspark.ml.tests is missing a py4j import. I've added the import and fixed the 
test that uses it. This test was only failing when testing without Hive.

## How was this patch tested?

Existing tests.

Please review http://spark.apache.org/contributing.html before opening a pull 
request.

Author: Bago Amirbekian 

Closes #19997 from MrBago/fix-ImageReaderTest2.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0c8fca46
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0c8fca46
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0c8fca46

Branch: refs/heads/master
Commit: 0c8fca4608643ed9e1eb3ae8620e6f4f6a017a87
Parents: 4677623
Author: Bago Amirbekian 
Authored: Sat Dec 16 10:57:35 2017 +0900
Committer: hyukjinkwon 
Committed: Sat Dec 16 10:57:35 2017 +0900

--
 python/pyspark/ml/tests.py | 10 +++---
 1 file changed, 7 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/0c8fca46/python/pyspark/ml/tests.py
--
diff --git a/python/pyspark/ml/tests.py b/python/pyspark/ml/tests.py
index 3a0b816..be15211 100755
--- a/python/pyspark/ml/tests.py
+++ b/python/pyspark/ml/tests.py
@@ -44,6 +44,7 @@ import array as pyarray
 import numpy as np
 from numpy import abs, all, arange, array, array_equal, inf, ones, tile, zeros
 import inspect
+import py4j
 
 from pyspark import keyword_only, SparkContext
 from pyspark.ml import Estimator, Model, Pipeline, PipelineModel, Transformer, UnaryTransformer
@@ -1859,8 +1860,9 @@ class ImageReaderTest2(PySparkTestCase):
 
     @classmethod
     def setUpClass(cls):
-        PySparkTestCase.setUpClass()
+        super(ImageReaderTest2, cls).setUpClass()
         # Note that here we enable Hive's support.
+        cls.spark = None
         try:
             cls.sc._jvm.org.apache.hadoop.hive.conf.HiveConf()
         except py4j.protocol.Py4JError:
@@ -1873,8 +1875,10 @@ class ImageReaderTest2(PySparkTestCase):
 
     @classmethod
     def tearDownClass(cls):
-        PySparkTestCase.tearDownClass()
-        cls.spark.sparkSession.stop()
+        super(ImageReaderTest2, cls).tearDownClass()
+        if cls.spark is not None:
+            cls.spark.sparkSession.stop()
+            cls.spark = None
 
     def test_read_images_multiple_times(self):
         # This test case is to check if `ImageSchema.readImages` tries to


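The fix above follows a common unittest idiom: initialize the class-level resource to None before any fallible setup, and make tearDownClass tolerate the resource never having been created. A minimal pure-Python sketch of that pattern (FakeSession is a hypothetical stand-in for the Hive-backed session; this is not Spark code):

```python
import unittest


class FakeSession:
    """Hypothetical stand-in for the Hive-backed Spark session."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True


class GuardedResourceTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        super(GuardedResourceTest, cls).setUpClass()
        # Assign the attribute first, so tearDownClass can always inspect it
        # even if the try block below fails (mirrors `cls.spark = None`).
        cls.session = None
        try:
            cls.session = FakeSession()  # may raise when the backend is absent
        except RuntimeError:
            cls.session = None

    @classmethod
    def tearDownClass(cls):
        super(GuardedResourceTest, cls).tearDownClass()
        # Only stop the session if setUpClass actually created one.
        if cls.session is not None:
            cls.session.stop()
            cls.session = None

    def test_session_available(self):
        if self.session is None:
            self.skipTest("session unavailable")
        self.assertFalse(self.session.stopped)
```

tearDownClass then runs safely whether or not setUpClass succeeded, which is exactly what the `cls.spark = None` guard in the patch buys.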



svn commit: r23747 - in /dev/spark/2.3.0-SNAPSHOT-2017_12_15_12_01-4677623-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2017-12-15 Thread pwendell
Author: pwendell
Date: Fri Dec 15 20:14:43 2017
New Revision: 23747

Log:
Apache Spark 2.3.0-SNAPSHOT-2017_12_15_12_01-4677623 docs


[This commit notification would consist of 1414 parts, which exceeds the limit of 50, so it was shortened to this summary.]




[2/2] spark git commit: [SPARK-22762][TEST] Basic tests for IfCoercion and CaseWhenCoercion

2017-12-15 Thread lixiao
[SPARK-22762][TEST] Basic tests for IfCoercion and CaseWhenCoercion

## What changes were proposed in this pull request?

Basic tests for IfCoercion and CaseWhenCoercion

## How was this patch tested?

N/A

Author: Yuming Wang 

Closes #19949 from wangyum/SPARK-22762.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/46776234
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/46776234
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/46776234

Branch: refs/heads/master
Commit: 46776234a49742e94c64897322500582d7393d35
Parents: 9fafa82
Author: Yuming Wang 
Authored: Fri Dec 15 09:58:31 2017 -0800
Committer: gatorsmile 
Committed: Fri Dec 15 09:58:31 2017 -0800

--
 .../typeCoercion/native/caseWhenCoercion.sql|  174 +++
 .../inputs/typeCoercion/native/ifCoercion.sql   |  174 +++
 .../native/caseWhenCoercion.sql.out | 1232 ++
 .../typeCoercion/native/ifCoercion.sql.out  | 1232 ++
 4 files changed, 2812 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/46776234/sql/core/src/test/resources/sql-tests/inputs/typeCoercion/native/caseWhenCoercion.sql
--
diff --git a/sql/core/src/test/resources/sql-tests/inputs/typeCoercion/native/caseWhenCoercion.sql b/sql/core/src/test/resources/sql-tests/inputs/typeCoercion/native/caseWhenCoercion.sql
new file mode 100644
index 000..a780529
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/inputs/typeCoercion/native/caseWhenCoercion.sql
@@ -0,0 +1,174 @@
+--
+--   Licensed to the Apache Software Foundation (ASF) under one or more
+--   contributor license agreements.  See the NOTICE file distributed with
+--   this work for additional information regarding copyright ownership.
+--   The ASF licenses this file to You under the Apache License, Version 2.0
+--   (the "License"); you may not use this file except in compliance with
+--   the License.  You may obtain a copy of the License at
+--
+--  http://www.apache.org/licenses/LICENSE-2.0
+--
+--   Unless required by applicable law or agreed to in writing, software
+--   distributed under the License is distributed on an "AS IS" BASIS,
+--   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+--   See the License for the specific language governing permissions and
+--   limitations under the License.
+--
+
+CREATE TEMPORARY VIEW t AS SELECT 1;
+
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as tinyint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as smallint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as int) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as bigint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as float) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as double) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as decimal(10, 0)) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as string) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast('2' as binary) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast(2 as boolean) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast('2017-12-11 09:30:00.0' as timestamp) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as tinyint) ELSE cast('2017-12-11 09:30:00' as date) END FROM t;
+
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as tinyint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as smallint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as int) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as bigint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as float) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as double) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as decimal(10, 0)) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as string) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast('2' as binary) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast(2 as boolean) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast('2017-12-11 09:30:00.0' as timestamp) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as smallint) ELSE cast('2017-12-11 09:30:00' as date) END FROM t;
+
+SELECT CASE WHEN true THEN cast(1 as int) ELSE cast(2 as tinyint) END FROM t;
+SELECT CASE WHEN true THEN cast(1 as int) ELSE cast(2 as smallint) END FROM t;
+SELECT CASE 

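The query matrix above probes which branch-type pairs Spark widens to a common type and which it rejects. As a rough illustrative model of the numeric widening only (the precedence order below is inferred from the golden results, not taken from Spark's TypeCoercion code; decimal, string, and datetime cases are deliberately omitted):

```python
# Illustrative model of numeric type widening for CASE/IF branches.
# Assumption: tinyint < smallint < int < bigint < float < double,
# and any non-numeric pairing with a numeric type is rejected.
NUMERIC_WIDENING = ["tinyint", "smallint", "int", "bigint", "float", "double"]


def widen(then_type, else_type):
    """Return the common type for two numeric branch types, or raise
    TypeError to mimic an analysis-time type-mismatch failure."""
    if then_type not in NUMERIC_WIDENING or else_type not in NUMERIC_WIDENING:
        raise TypeError(
            "differing types (%s and %s) cannot be coerced" % (then_type, else_type))
    # The wider of the two positions in the precedence list wins.
    i = NUMERIC_WIDENING.index(then_type)
    j = NUMERIC_WIDENING.index(else_type)
    return NUMERIC_WIDENING[max(i, j)]
```

For example, `widen("tinyint", "smallint")` yields `"smallint"`, matching the `smallint` result schema in the golden files, while `widen("tinyint", "boolean")` raises, matching the AnalysisException cases.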
[1/2] spark git commit: [SPARK-22762][TEST] Basic tests for IfCoercion and CaseWhenCoercion

2017-12-15 Thread lixiao
Repository: spark
Updated Branches:
  refs/heads/master 9fafa8209 -> 46776234a


http://git-wip-us.apache.org/repos/asf/spark/blob/46776234/sql/core/src/test/resources/sql-tests/results/typeCoercion/native/ifCoercion.sql.out
--
diff --git a/sql/core/src/test/resources/sql-tests/results/typeCoercion/native/ifCoercion.sql.out b/sql/core/src/test/resources/sql-tests/results/typeCoercion/native/ifCoercion.sql.out
new file mode 100644
index 000..7097027
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/results/typeCoercion/native/ifCoercion.sql.out
@@ -0,0 +1,1232 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 145
+
+
+-- !query 0
+CREATE TEMPORARY VIEW t AS SELECT 1
+-- !query 0 schema
+struct<>
+-- !query 0 output
+
+
+
+-- !query 1
+SELECT IF(true, cast(1 as tinyint), cast(2 as tinyint)) FROM t
+-- !query 1 schema
+struct<(IF(true, CAST(1 AS TINYINT), CAST(2 AS TINYINT))):tinyint>
+-- !query 1 output
+1
+
+
+-- !query 2
+SELECT IF(true, cast(1 as tinyint), cast(2 as smallint)) FROM t
+-- !query 2 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS SMALLINT), CAST(2 AS SMALLINT))):smallint>
+-- !query 2 output
+1
+
+
+-- !query 3
+SELECT IF(true, cast(1 as tinyint), cast(2 as int)) FROM t
+-- !query 3 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS INT), CAST(2 AS INT))):int>
+-- !query 3 output
+1
+
+
+-- !query 4
+SELECT IF(true, cast(1 as tinyint), cast(2 as bigint)) FROM t
+-- !query 4 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS BIGINT), CAST(2 AS BIGINT))):bigint>
+-- !query 4 output
+1
+
+
+-- !query 5
+SELECT IF(true, cast(1 as tinyint), cast(2 as float)) FROM t
+-- !query 5 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS FLOAT), CAST(2 AS FLOAT))):float>
+-- !query 5 output
+1.0
+
+
+-- !query 6
+SELECT IF(true, cast(1 as tinyint), cast(2 as double)) FROM t
+-- !query 6 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS DOUBLE), CAST(2 AS DOUBLE))):double>
+-- !query 6 output
+1.0
+
+
+-- !query 7
+SELECT IF(true, cast(1 as tinyint), cast(2 as decimal(10, 0))) FROM t
+-- !query 7 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS DECIMAL(10,0)), CAST(2 AS DECIMAL(10,0:decimal(10,0)>
+-- !query 7 output
+1
+
+
+-- !query 8
+SELECT IF(true, cast(1 as tinyint), cast(2 as string)) FROM t
+-- !query 8 schema
+struct<(IF(true, CAST(CAST(1 AS TINYINT) AS STRING), CAST(2 AS STRING))):string>
+-- !query 8 output
+1
+
+
+-- !query 9
+SELECT IF(true, cast(1 as tinyint), cast('2' as binary)) FROM t
+-- !query 9 schema
+struct<>
+-- !query 9 output
+org.apache.spark.sql.AnalysisException
+cannot resolve '(IF(true, CAST(1 AS TINYINT), CAST('2' AS BINARY)))' due to data type mismatch: differing types in '(IF(true, CAST(1 AS TINYINT), CAST('2' AS BINARY)))' (tinyint and binary).; line 1 pos 7
+
+
+-- !query 10
+SELECT IF(true, cast(1 as tinyint), cast(2 as boolean)) FROM t
+-- !query 10 schema
+struct<>
+-- !query 10 output
+org.apache.spark.sql.AnalysisException
+cannot resolve '(IF(true, CAST(1 AS TINYINT), CAST(2 AS BOOLEAN)))' due to data type mismatch: differing types in '(IF(true, CAST(1 AS TINYINT), CAST(2 AS BOOLEAN)))' (tinyint and boolean).; line 1 pos 7
+
+
+-- !query 11
+SELECT IF(true, cast(1 as tinyint), cast('2017-12-11 09:30:00.0' as timestamp)) FROM t
+-- !query 11 schema
+struct<>
+-- !query 11 output
+org.apache.spark.sql.AnalysisException
+cannot resolve '(IF(true, CAST(1 AS TINYINT), CAST('2017-12-11 09:30:00.0' AS TIMESTAMP)))' due to data type mismatch: differing types in '(IF(true, CAST(1 AS TINYINT), CAST('2017-12-11 09:30:00.0' AS TIMESTAMP)))' (tinyint and timestamp).; line 1 pos 7
+
+
+-- !query 12
+SELECT IF(true, cast(1 as tinyint), cast('2017-12-11 09:30:00' as date)) FROM t
+-- !query 12 schema
+struct<>
+-- !query 12 output
+org.apache.spark.sql.AnalysisException
+cannot resolve '(IF(true, CAST(1 AS TINYINT), CAST('2017-12-11 09:30:00' AS DATE)))' due to data type mismatch: differing types in '(IF(true, CAST(1 AS TINYINT), CAST('2017-12-11 09:30:00' AS DATE)))' (tinyint and date).; line 1 pos 7
+
+
+-- !query 13
+SELECT IF(true, cast(1 as smallint), cast(2 as tinyint)) FROM t
+-- !query 13 schema
+struct<(IF(true, CAST(1 AS SMALLINT), CAST(CAST(2 AS TINYINT) AS SMALLINT))):smallint>
+-- !query 13 output
+1
+
+
+-- !query 14
+SELECT IF(true, cast(1 as smallint), cast(2 as smallint)) FROM t
+-- !query 14 schema
+struct<(IF(true, CAST(1 AS SMALLINT), CAST(2 AS SMALLINT))):smallint>
+-- !query 14 output
+1
+
+
+-- !query 15
+SELECT IF(true, cast(1 as smallint), cast(2 as int)) FROM t
+-- !query 15 schema
+struct<(IF(true, CAST(CAST(1 AS SMALLINT) AS INT), CAST(2 AS INT))):int>
+-- !query 15 output
+1
+
+
+-- !query 16
+SELECT IF(true, cast(1 as smallint), cast(2 as bigint)) FROM t
+-- !query 16 schema
+struct<(IF(true, CAST(CAST(1 AS SMALLINT) AS BIGINT), CAST(2 AS BIGINT))):bigint>
+-- !query 16 output
+1
+
+
+-- !query 17
+SELECT 

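The `.sql.out` files above are golden files generated by SQLQueryTestSuite: each `-- !query N` block records the statement, its result schema, and its output. A small illustrative parser for that layout (a demonstration reimplementation in Python, not the Scala code Spark actually uses):

```python
import re

# A two-case sample in the .sql.out layout shown above (abbreviated).
SAMPLE = """\
-- !query 0
CREATE TEMPORARY VIEW t AS SELECT 1
-- !query 0 schema
struct<>
-- !query 0 output


-- !query 1
SELECT IF(true, cast(1 as tinyint), cast(2 as tinyint)) FROM t
-- !query 1 schema
struct<(IF(true, CAST(1 AS TINYINT), CAST(2 AS TINYINT))):tinyint>
-- !query 1 output
1
"""


def parse_golden_file(text):
    """Split a SQLQueryTestSuite-style .sql.out into (query, schema, output)
    triples, keyed by the '-- !query N' markers."""
    pattern = re.compile(
        r"-- !query (\d+)\n(.*?)\n"          # the statement
        r"-- !query \1 schema\n(.*?)\n"       # its result schema
        r"-- !query \1 output\n(.*?)(?=\n\n|\Z)",  # its output (may be empty)
        re.DOTALL)
    return [(m.group(2).strip(), m.group(3).strip(), m.group(4).strip())
            for m in pattern.finditer(text)]
```

A test in this style runs each statement and diffs the actual schema and rows against the stored triple, so a coercion change shows up as a golden-file diff like the one in this commit.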
spark git commit: [SPARK-22800][TEST][SQL] Add a SSB query suite

2017-12-15 Thread lixiao
Repository: spark
Updated Branches:
  refs/heads/master e58f27567 -> 9fafa8209


[SPARK-22800][TEST][SQL] Add a SSB query suite

## What changes were proposed in this pull request?
Add a test suite to ensure all the [SSB (Star Schema Benchmark)](https://www.cs.umb.edu/~poneil/StarSchemaB.PDF) queries can be successfully analyzed, optimized and compiled without hitting the max iteration threshold.

## How was this patch tested?
Added `SSBQuerySuite`.

Author: Takeshi Yamamuro 

Closes #19990 from maropu/SPARK-22800.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/9fafa820
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/9fafa820
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/9fafa820

Branch: refs/heads/master
Commit: 9fafa8209c51adc2a22b89aedf9af7b5e29e0059
Parents: e58f275
Author: Takeshi Yamamuro 
Authored: Fri Dec 15 09:56:22 2017 -0800
Committer: gatorsmile 
Committed: Fri Dec 15 09:56:22 2017 -0800

--
 sql/core/src/test/resources/ssb/1.1.sql |  6 ++
 sql/core/src/test/resources/ssb/1.2.sql |  6 ++
 sql/core/src/test/resources/ssb/1.3.sql |  6 ++
 sql/core/src/test/resources/ssb/2.1.sql |  9 ++
 sql/core/src/test/resources/ssb/2.2.sql |  9 ++
 sql/core/src/test/resources/ssb/2.3.sql |  9 ++
 sql/core/src/test/resources/ssb/3.1.sql | 10 +++
 sql/core/src/test/resources/ssb/3.2.sql | 10 +++
 sql/core/src/test/resources/ssb/3.3.sql | 12 +++
 sql/core/src/test/resources/ssb/3.4.sql | 12 +++
 sql/core/src/test/resources/ssb/4.1.sql | 11 +++
 sql/core/src/test/resources/ssb/4.2.sql | 12 +++
 sql/core/src/test/resources/ssb/4.3.sql | 12 +++
 .../org/apache/spark/sql/SSBQuerySuite.scala| 87 
 .../org/apache/spark/sql/TPCHQuerySuite.scala   |  2 -
 15 files changed, 211 insertions(+), 2 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/9fafa820/sql/core/src/test/resources/ssb/1.1.sql
--
diff --git a/sql/core/src/test/resources/ssb/1.1.sql b/sql/core/src/test/resources/ssb/1.1.sql
new file mode 100644
index 000..62da302
--- /dev/null
+++ b/sql/core/src/test/resources/ssb/1.1.sql
@@ -0,0 +1,6 @@
+select sum(lo_extendedprice*lo_discount) as revenue
+   from lineorder, date
+   where lo_orderdate = d_datekey
+   and d_year = 1993
+   and lo_discount between 1 and 3
+   and lo_quantity < 25

http://git-wip-us.apache.org/repos/asf/spark/blob/9fafa820/sql/core/src/test/resources/ssb/1.2.sql
--
diff --git a/sql/core/src/test/resources/ssb/1.2.sql b/sql/core/src/test/resources/ssb/1.2.sql
new file mode 100644
index 000..1657bfd
--- /dev/null
+++ b/sql/core/src/test/resources/ssb/1.2.sql
@@ -0,0 +1,6 @@
+select sum(lo_extendedprice*lo_discount) as revenue
+   from lineorder, date
+   where lo_orderdate = d_datekey
+   and d_yearmonthnum = 199401
+   and lo_discount between 4 and 6
+   and lo_quantity between 26 and 35

http://git-wip-us.apache.org/repos/asf/spark/blob/9fafa820/sql/core/src/test/resources/ssb/1.3.sql
--
diff --git a/sql/core/src/test/resources/ssb/1.3.sql b/sql/core/src/test/resources/ssb/1.3.sql
new file mode 100644
index 000..e9bbf51
--- /dev/null
+++ b/sql/core/src/test/resources/ssb/1.3.sql
@@ -0,0 +1,6 @@
+select sum(lo_extendedprice*lo_discount) as revenue
+   from lineorder, date
+   where lo_orderdate = d_datekey
+   and d_weeknuminyear = 6 and d_year = 1994
+   and lo_discount between 5 and 7
+   and lo_quantity between 36 and 40

http://git-wip-us.apache.org/repos/asf/spark/blob/9fafa820/sql/core/src/test/resources/ssb/2.1.sql
--
diff --git a/sql/core/src/test/resources/ssb/2.1.sql b/sql/core/src/test/resources/ssb/2.1.sql
new file mode 100644
index 000..00d4027
--- /dev/null
+++ b/sql/core/src/test/resources/ssb/2.1.sql
@@ -0,0 +1,9 @@
+select sum(lo_revenue), d_year, p_brand1
+   from lineorder, date, part, supplier
+   where lo_orderdate = d_datekey
+   and lo_partkey = p_partkey
+   and lo_suppkey = s_suppkey
+   and p_category = 'MFGR#12'
+   and s_region = 'AMERICA'
+   group by d_year, p_brand1
+   order by d_year, p_brand1

http://git-wip-us.apache.org/repos/asf/spark/blob/9fafa820/sql/core/src/test/resources/ssb/2.2.sql

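The suite's goal is that every SSB resource query at least analyzes and compiles against the schemas without blowing the optimizer's iteration limit. That "does it compile" check can be sketched with sqlite3 standing in for Spark's analyzer (the minimal table schemas below are assumptions covering only the columns Q1.1 touches, not the full SSB schema):

```python
import sqlite3

# SSB Q1.1 as shown in 1.1.sql above.
SSB_Q1_1 = """
select sum(lo_extendedprice*lo_discount) as revenue
  from lineorder, date
  where lo_orderdate = d_datekey
    and d_year = 1993
    and lo_discount between 1 and 3
    and lo_quantity < 25
"""


def compiles(query):
    """Return True if the query analyzes against minimal empty schemas.
    sqlite3 is a stand-in for Spark's analyzer/optimizer here; this is an
    illustration of the suite's intent, not its implementation."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE lineorder (lo_extendedprice REAL, lo_discount REAL,"
                " lo_orderdate INT, lo_quantity INT)")
    con.execute("CREATE TABLE date (d_datekey INT, d_year INT)")
    try:
        con.execute(query)  # preparing the statement is the compile check
        return True
    except sqlite3.OperationalError:
        return False
    finally:
        con.close()
```

The real `SSBQuerySuite` loops over the thirteen `ssb/*.sql` resources the same way, asserting each one plans successfully.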
[spark] Git Push Summary

2017-12-15 Thread lixiao
Repository: spark
Updated Branches:
  refs/heads/revert19961 [deleted] e58f27567




spark git commit: Revert "[SPARK-22496][SQL] thrift server adds operation logs"

2017-12-15 Thread lixiao
Repository: spark
Updated Branches:
  refs/heads/master 3775dd31e -> e58f27567


Revert "[SPARK-22496][SQL] thrift server adds operation logs"

This reverts commit 0ea2d8c12e49e30df6bbfa57d74134b25f96a196.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e58f2756
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e58f2756
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/e58f2756

Branch: refs/heads/master
Commit: e58f275678fb4f904124a4a2a1762f04c835eb0e
Parents: 3775dd3
Author: gatorsmile 
Authored: Fri Dec 15 09:46:15 2017 -0800
Committer: gatorsmile 
Committed: Fri Dec 15 09:46:15 2017 -0800

--
 .../cli/operation/ExecuteStatementOperation.java   | 13 -
 .../hive/service/cli/operation/SQLOperation.java   | 12 
 .../thriftserver/SparkExecuteStatementOperation.scala  |  1 -
 3 files changed, 12 insertions(+), 14 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/e58f2756/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
--
diff --git a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
index 6740d3b..3f2de10 100644
--- a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
+++ b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
@@ -23,7 +23,6 @@ import java.util.Map;
 
 import org.apache.hadoop.hive.ql.processors.CommandProcessor;
 import org.apache.hadoop.hive.ql.processors.CommandProcessorFactory;
-import org.apache.hadoop.hive.ql.session.OperationLog;
 import org.apache.hive.service.cli.HiveSQLException;
 import org.apache.hive.service.cli.OperationType;
 import org.apache.hive.service.cli.session.HiveSession;
@@ -68,16 +67,4 @@ public abstract class ExecuteStatementOperation extends Operation {
       this.confOverlay = confOverlay;
     }
   }
-
-  protected void registerCurrentOperationLog() {
-    if (isOperationLogEnabled) {
-      if (operationLog == null) {
-        LOG.warn("Failed to get current OperationLog object of Operation: " +
-          getHandle().getHandleIdentifier());
-        isOperationLogEnabled = false;
-        return;
-      }
-      OperationLog.setCurrentOperationLog(operationLog);
-    }
-  }
 }

http://git-wip-us.apache.org/repos/asf/spark/blob/e58f2756/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
--
diff --git a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
index fd9108e..5014ced 100644
--- a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
+++ b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
@@ -274,6 +274,18 @@ public class SQLOperation extends ExecuteStatementOperation {
     }
   }
 
+  private void registerCurrentOperationLog() {
+    if (isOperationLogEnabled) {
+      if (operationLog == null) {
+        LOG.warn("Failed to get current OperationLog object of Operation: " +
+            getHandle().getHandleIdentifier());
+        isOperationLogEnabled = false;
+        return;
+      }
+      OperationLog.setCurrentOperationLog(operationLog);
+    }
+  }
+
   private void cleanup(OperationState state) throws HiveSQLException {
     setState(state);
     if (shouldRunAsync()) {

http://git-wip-us.apache.org/repos/asf/spark/blob/e58f2756/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
--
diff --git a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
index 664bc20..f5191fa 100644
--- a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
+++ b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
@@ -170,7 +170,6 @@ private[hive] class SparkExecuteStatementOperation(
 override def run(): Unit = {
   val doAsAction = new PrivilegedExceptionAction[Unit]() {
 override def run(): 

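The reverted registerCurrentOperationLog helper attaches the operation's log to the current thread so code deeper in the call stack can retrieve it without passing it around. The idea can be sketched with Python's threading.local (the names mirror the Hive classes above, but the implementation is purely illustrative):

```python
import threading

# Stand-in for OperationLog's per-thread slot.
_current = threading.local()


class OperationLog:
    """Illustrative per-operation log; not Hive's actual class."""
    def __init__(self, handle_id):
        self.handle_id = handle_id
        self.lines = []

    @staticmethod
    def set_current(log):
        _current.log = log

    @staticmethod
    def get_current():
        return getattr(_current, "log", None)


def register_current_operation_log(operation_log, enabled=True):
    """Mirrors the guard in registerCurrentOperationLog(): bail out when
    logging is disabled or no log object exists for this operation.
    Returns True only when the log was actually registered."""
    if not enabled:
        return False
    if operation_log is None:
        # The Java code logs a warning here and flips isOperationLogEnabled.
        return False
    OperationLog.set_current(operation_log)
    return True
```

Once registered, any code running on the same thread can call `OperationLog.get_current()` to append to the right operation's log.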
spark git commit: Revert "[SPARK-22496][SQL] thrift server adds operation logs"

2017-12-15 Thread lixiao
Repository: spark
Updated Branches:
  refs/heads/revert19961 [created] e58f27567


Revert "[SPARK-22496][SQL] thrift server adds operation logs"

This reverts commit 0ea2d8c12e49e30df6bbfa57d74134b25f96a196.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e58f2756
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e58f2756
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/e58f2756

Branch: refs/heads/revert19961
Commit: e58f275678fb4f904124a4a2a1762f04c835eb0e
Parents: 3775dd3
Author: gatorsmile 
Authored: Fri Dec 15 09:46:15 2017 -0800
Committer: gatorsmile 
Committed: Fri Dec 15 09:46:15 2017 -0800

--
 .../cli/operation/ExecuteStatementOperation.java   | 13 -
 .../hive/service/cli/operation/SQLOperation.java   | 12 
 .../thriftserver/SparkExecuteStatementOperation.scala  |  1 -
 3 files changed, 12 insertions(+), 14 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/e58f2756/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
--
diff --git a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
index 6740d3b..3f2de10 100644
--- a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
+++ b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/ExecuteStatementOperation.java
@@ -23,7 +23,6 @@ import java.util.Map;
 
 import org.apache.hadoop.hive.ql.processors.CommandProcessor;
 import org.apache.hadoop.hive.ql.processors.CommandProcessorFactory;
-import org.apache.hadoop.hive.ql.session.OperationLog;
 import org.apache.hive.service.cli.HiveSQLException;
 import org.apache.hive.service.cli.OperationType;
 import org.apache.hive.service.cli.session.HiveSession;
@@ -68,16 +67,4 @@ public abstract class ExecuteStatementOperation extends Operation {
       this.confOverlay = confOverlay;
     }
   }
-
-  protected void registerCurrentOperationLog() {
-    if (isOperationLogEnabled) {
-      if (operationLog == null) {
-        LOG.warn("Failed to get current OperationLog object of Operation: " +
-          getHandle().getHandleIdentifier());
-        isOperationLogEnabled = false;
-        return;
-      }
-      OperationLog.setCurrentOperationLog(operationLog);
-    }
-  }
 }

http://git-wip-us.apache.org/repos/asf/spark/blob/e58f2756/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
--
diff --git a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
index fd9108e..5014ced 100644
--- a/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
+++ b/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/SQLOperation.java
@@ -274,6 +274,18 @@ public class SQLOperation extends ExecuteStatementOperation {
     }
   }
 
+  private void registerCurrentOperationLog() {
+    if (isOperationLogEnabled) {
+      if (operationLog == null) {
+        LOG.warn("Failed to get current OperationLog object of Operation: " +
+            getHandle().getHandleIdentifier());
+        isOperationLogEnabled = false;
+        return;
+      }
+      OperationLog.setCurrentOperationLog(operationLog);
+    }
+  }
+
   private void cleanup(OperationState state) throws HiveSQLException {
     setState(state);
     if (shouldRunAsync()) {

http://git-wip-us.apache.org/repos/asf/spark/blob/e58f2756/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
--
diff --git a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
index 664bc20..f5191fa 100644
--- a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
+++ b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
@@ -170,7 +170,6 @@ private[hive] class SparkExecuteStatementOperation(
 override def run(): Unit = {
   val doAsAction = new PrivilegedExceptionAction[Unit]() {
 override def 

svn commit: r23743 - in /dev/spark/2.3.0-SNAPSHOT-2017_12_15_00_01-3775dd3-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2017-12-15 Thread pwendell
Author: pwendell
Date: Fri Dec 15 08:15:09 2017
New Revision: 23743

Log:
Apache Spark 2.3.0-SNAPSHOT-2017_12_15_00_01-3775dd3 docs


[This commit notification would consist of 1414 parts, which exceeds the limit of 50, so it was shortened to this summary.]
