[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2018-12-27 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244273067
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,100 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import java.util.UUID
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
+    .addStringColumn("TABLE_SCHEM", "Schema name.")
+    .addStringColumn("TABLE_CATALOG", "Catalog name.")
+
+  private val rowSet = RowSetFactory.create(RESULT_SET_SCHEMA, getProtocolVersion)
+
+  private var statementId: String = _
+
+  override def close(): Unit = {
+    logInfo(s"Close get schemas with $statementId")
+    setState(OperationState.CLOSED)
+  }
+
+  override def runInternal(): Unit = {
+    statementId = UUID.randomUUID().toString
+    logInfo(s"Getting schemas with $statementId")
+    setState(OperationState.RUNNING)
+    // Always use the latest class loader provided by executionHive's state.
+    val executionHiveClassLoader = sqlContext.sharedState.jarClassLoader
+    Thread.currentThread().setContextClassLoader(executionHiveClassLoader)
+
+    if (isAuthV2Enabled) {
+      val cmdStr = s"catalog : $catalogName, schemaPattern : $schemaName"
+      authorizeMetaGets(HiveOperationType.GET_TABLES, null, cmdStr)
+    }
+
+    try {
+      catalog.listDatabases(convertSchemaPattern(schemaName)).foreach { dbName =>
+        rowSet.addRow(Array[AnyRef](dbName, ""))
+      }
+      setState(OperationState.FINISHED)
+    } catch {
+      case e: HiveSQLException =>
+        setState(OperationState.ERROR)
+        throw e
+    }
+  }
+
+  override def getNextRowSet(order: FetchOrientation, maxRows: Long): RowSet = {
+    validateDefaultFetchOrientation(order)
+    assertState(OperationState.FINISHED)
 
 Review comment:
   Why do you need to change the order of lines 87 and 88?
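
   For context on the `convertSchemaPattern(schemaName)` call in the diff above: JDBC metadata operations take SQL LIKE-style patterns (`%` matches any substring, `_` any single character), and a helper along these lines typically rewrites the pattern into a regex before filtering database names. A hedged sketch only; the `toRegex`/`filter` helpers below are hypothetical, not Hive's actual implementation:

   ```java
   import java.util.List;
   import java.util.regex.Pattern;
   import java.util.stream.Collectors;

   public class SchemaPatternDemo {
       // Hypothetical re-implementation for illustration only: rewrite a
       // JDBC LIKE-style pattern into a Java regex.
       static String toRegex(String likePattern) {
           if (likePattern == null || likePattern.isEmpty()) {
               likePattern = "%";  // null/empty conventionally means "match everything"
           }
           return likePattern.replace("%", ".*").replace("_", ".");
       }

       // Keep only the schema names that match the converted pattern.
       static List<String> filter(List<String> schemas, String likePattern) {
           Pattern p = Pattern.compile(toRegex(likePattern));
           return schemas.stream()
                         .filter(s -> p.matcher(s).matches())
                         .collect(Collectors.toList());
       }

       public static void main(String[] args) {
           List<String> dbs = List.of("default", "test_db", "tpcds");
           System.out.println(filter(dbs, "t%"));  // [test_db, tpcds]
       }
   }
   ```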


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2018-12-27 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244273331
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,100 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import java.util.UUID
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
+    .addStringColumn("TABLE_SCHEM", "Schema name.")
+    .addStringColumn("TABLE_CATALOG", "Catalog name.")
+
+  private val rowSet = RowSetFactory.create(RESULT_SET_SCHEMA, getProtocolVersion)
+
+  private var statementId: String = _
+
+  override def close(): Unit = {
+    logInfo(s"Close get schemas with $statementId")
+    setState(OperationState.CLOSED)
+  }
+
+  override def runInternal(): Unit = {
+    statementId = UUID.randomUUID().toString
+    logInfo(s"Getting schemas with $statementId")
+    setState(OperationState.RUNNING)
+    // Always use the latest class loader provided by executionHive's state.
+    val executionHiveClassLoader = sqlContext.sharedState.jarClassLoader
+    Thread.currentThread().setContextClassLoader(executionHiveClassLoader)
+
+    if (isAuthV2Enabled) {
+      val cmdStr = s"catalog : $catalogName, schemaPattern : $schemaName"
+      authorizeMetaGets(HiveOperationType.GET_TABLES, null, cmdStr)
+    }
+
+    try {
+      catalog.listDatabases(convertSchemaPattern(schemaName)).foreach { dbName =>
+        rowSet.addRow(Array[AnyRef](dbName, ""))
+      }
+      setState(OperationState.FINISHED)
+    } catch {
+      case e: HiveSQLException =>
+        setState(OperationState.ERROR)
+        throw e
+    }
+  }
+
+  override def getNextRowSet(order: FetchOrientation, maxRows: Long): RowSet = {
+    validateDefaultFetchOrientation(order)
+    assertState(OperationState.FINISHED)
+    setHasResultSet(true)
+    if (order.equals(FetchOrientation.FETCH_FIRST)) {
+      rowSet.setStartOffset(0)
+    }
+    rowSet.extractSubset(maxRows.toInt)
+  }
+
+  override def cancel(): Unit = {
+    logInfo(s"Cancel get schemas with $statementId")
+    setState(OperationState.CANCELED)
 
 Review comment:
   Why not call the default `cancel()`?
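
   For what the question is getting at, a hedged illustration (toy classes, not Hive's actual `Operation` API) of how re-implementing `cancel()` instead of delegating to the default can silently skip base-class bookkeeping:

   ```java
   class BaseOp {
       protected String state = "RUNNING";
       private boolean cleaned = false;

       // Default cancel: state change plus cleanup bookkeeping.
       public void cancel() {
           state = "CANCELED";
           cleaned = true;
       }

       public boolean isCleaned() { return cleaned; }
   }

   class OverridingOp extends BaseOp {
       // Re-implements only the state change; the base-class cleanup never runs.
       @Override
       public void cancel() {
           state = "CANCELED";
       }
   }

   public class CancelDemo {
       public static void main(String[] args) {
           OverridingOp op = new OverridingOp();
           op.cancel();
           System.out.println(op.isCleaned());  // false: cleanup was skipped
       }
   }
   ```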





[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2018-12-27 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244273423
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,100 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import java.util.UUID
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
+    .addStringColumn("TABLE_SCHEM", "Schema name.")
+    .addStringColumn("TABLE_CATALOG", "Catalog name.")
+
+  private val rowSet = RowSetFactory.create(RESULT_SET_SCHEMA, getProtocolVersion)
+
+  private var statementId: String = _
+
+  override def close(): Unit = {
+    logInfo(s"Close get schemas with $statementId")
+    setState(OperationState.CLOSED)
 
 Review comment:
   Same here: why not call the default `close()`?





[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2018-12-29 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244503143
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,84 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
+    .addStringColumn("TABLE_SCHEM", "Schema name.")
+    .addStringColumn("TABLE_CATALOG", "Catalog name.")
+
+  private val rowSet = RowSetFactory.create(RESULT_SET_SCHEMA, getProtocolVersion)
+
+  override def runInternal(): Unit = {
+    setState(OperationState.RUNNING)
+    // Always use the latest class loader provided by executionHive's state.
+    val executionHiveClassLoader = sqlContext.sharedState.jarClassLoader
+    Thread.currentThread().setContextClassLoader(executionHiveClassLoader)
+
+    if (isAuthV2Enabled) {
+      val cmdStr = s"catalog : $catalogName, schemaPattern : $schemaName"
+      authorizeMetaGets(HiveOperationType.GET_TABLES, null, cmdStr)
+    }
+
+    try {
+      catalog.listDatabases(convertSchemaPattern(schemaName)).foreach { dbName =>
+        rowSet.addRow(Array[AnyRef](dbName, ""))
+      }
+      setState(OperationState.FINISHED)
+    } catch {
+      case e: HiveSQLException =>
+        setState(OperationState.ERROR)
+        throw e
+    }
+  }
+
+  override def getNextRowSet(orientation: FetchOrientation, maxRows: Long): RowSet = {
 
 Review comment:
   What happens if we do not override `getNextRowSet`?





[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2019-01-02 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244802192
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,84 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
+    .addStringColumn("TABLE_SCHEM", "Schema name.")
+    .addStringColumn("TABLE_CATALOG", "Catalog name.")
+
+  private val rowSet = RowSetFactory.create(RESULT_SET_SCHEMA, getProtocolVersion)
+
+  override def runInternal(): Unit = {
+    setState(OperationState.RUNNING)
+    // Always use the latest class loader provided by executionHive's state.
+    val executionHiveClassLoader = sqlContext.sharedState.jarClassLoader
+    Thread.currentThread().setContextClassLoader(executionHiveClassLoader)
+
+    if (isAuthV2Enabled) {
+      val cmdStr = s"catalog : $catalogName, schemaPattern : $schemaName"
+      authorizeMetaGets(HiveOperationType.GET_TABLES, null, cmdStr)
+    }
+
+    try {
+      catalog.listDatabases(convertSchemaPattern(schemaName)).foreach { dbName =>
+        rowSet.addRow(Array[AnyRef](dbName, ""))
+      }
+      setState(OperationState.FINISHED)
+    } catch {
+      case e: HiveSQLException =>
+        setState(OperationState.ERROR)
+        throw e
+    }
+  }
+
+  override def getNextRowSet(orientation: FetchOrientation, maxRows: Long): RowSet = {
 
 Review comment:
   Let us remove this. You just need to change this line:
   https://github.com/apache/spark/blob/5264164a67df498b73facae207eda12ee133be7d/sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/GetSchemasOperation.java#L44
   to:
   ```java
   protected RowSet rowSet;
   ```
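
   A hedged sketch of why widening the field to `protected` is enough (toy classes, not the actual Hive classes): the subclass can then populate the inherited `rowSet` directly instead of declaring a duplicate field of its own:

   ```java
   import java.util.ArrayList;
   import java.util.List;

   class BaseSchemasOperation {
       // Was effectively `private` before the change; `protected` makes it
       // writable from subclasses.
       protected List<String> rowSet = new ArrayList<>();

       public List<String> results() { return rowSet; }
   }

   class SparkSchemasOperation extends BaseSchemasOperation {
       void run() {
           rowSet.add("default");   // direct access to the inherited field
           rowSet.add("test_db");
       }
   }

   public class ProtectedFieldDemo {
       public static void main(String[] args) {
           SparkSchemasOperation op = new SparkSchemasOperation();
           op.run();
           System.out.println(op.results());  // [default, test_db]
       }
   }
   ```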
   





[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2019-01-02 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244802314
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,84 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
 
 Review comment:
   This can be removed too.





[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2019-01-02 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r244802343
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkGetSchemasOperation.scala
 ##
 @@ -0,0 +1,84 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.thriftserver
+
+import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveOperationType
+import org.apache.hive.service.cli._
+import org.apache.hive.service.cli.operation.GetSchemasOperation
+import org.apache.hive.service.cli.session.HiveSession
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.sql.SQLContext
+import org.apache.spark.sql.catalyst.catalog.SessionCatalog
+
+/**
+ * Spark's own GetSchemasOperation
+ *
+ * @param sqlContext SQLContext to use
+ * @param parentSession a HiveSession from SessionManager
+ * @param catalogName catalog name. null if not applicable.
+ * @param schemaName database name, null or a concrete database name
+ */
+private[hive] class SparkGetSchemasOperation(
+    sqlContext: SQLContext,
+    parentSession: HiveSession,
+    catalogName: String,
+    schemaName: String)
+  extends GetSchemasOperation(parentSession, catalogName, schemaName) with Logging {
+
+  val catalog: SessionCatalog = sqlContext.sessionState.catalog
+
+  private final val RESULT_SET_SCHEMA = new TableSchema()
+    .addStringColumn("TABLE_SCHEM", "Schema name.")
+    .addStringColumn("TABLE_CATALOG", "Catalog name.")
+
+  private val rowSet = RowSetFactory.create(RESULT_SET_SCHEMA, getProtocolVersion)
 
 Review comment:
   This can be removed. 








[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2019-01-07 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r245821601
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/GetSchemasOperation.java
 ##
 @@ -41,7 +41,7 @@
   .addStringColumn("TABLE_SCHEM", "Schema name.")
   .addStringColumn("TABLE_CATALOG", "Catalog name.");
 
-  private RowSet rowSet;
+  protected RowSet rowSet;
 
 Review comment:
   We do not have a plan to remove the thrift-server and use the Hive jar. 
Instead, I think we need to enhance the current thrift-server implementation. 





[GitHub] gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] Implement Spark's own GetSchemasOperation

2019-01-07 Thread GitBox
gatorsmile commented on a change in pull request #22903: [SPARK-24196][SQL] 
Implement Spark's own GetSchemasOperation
URL: https://github.com/apache/spark/pull/22903#discussion_r245864597
 
 

 ##
 File path: 
sql/hive-thriftserver/src/main/java/org/apache/hive/service/cli/operation/GetSchemasOperation.java
 ##
 @@ -41,7 +41,7 @@
   .addStringColumn("TABLE_SCHEM", "Schema name.")
   .addStringColumn("TABLE_CATALOG", "Catalog name.");
 
-  private RowSet rowSet;
+  protected RowSet rowSet;
 
 Review comment:
   
https://github.com/apache/spark/commit/7feeb82cb7f462e44f7e698c7c3b6ac3a77aade4 
shows we want to further clean up and improve the thrift-server. Even if 
https://issues.apache.org/jira/browse/HIVE-16391 is resolved, we will still 
keep the Hive thrift-server. 

