[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-05 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/12133





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-05 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205675459
  
Merging to master. Thanks!





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-04 Thread gatorsmile
Github user gatorsmile commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205645648
  
cc @yhuai @andrewor14 





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-04 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205478554
  
LGTM





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-04 Thread dilipbiswal
Github user dilipbiswal commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205475281
  
cc @hvanhovell 





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205051147
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205051149
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54808/
Test PASSed.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205050801
  
**[Test build #54808 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54808/consoleFull)** for PR 12133 at commit [`09f3a53`](https://github.com/apache/spark/commit/09f3a538383d85545d0f1a809d07b4a29e133720).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-205029913
  
**[Test build #54808 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54808/consoleFull)** for PR 12133 at commit [`09f3a53`](https://github.com/apache/spark/commit/09f3a538383d85545d0f1a809d07b4a29e133720).





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58315301
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,54 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table. If propertyKey is specified, the value
+ * for the propertyKey is returned. If propertyKey is not specified, all the keys and their
+ * corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
--- End diff --

Thanks! It looks much better now. I have made the change.
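
For readers skimming the thread, a quick usage sketch of the command being added, assuming a SQLContext named sqlContext and the parquet_tab2 table with prop1Key from the HiveCommandSuite setup quoted further down; the output shapes follow the docstring above:

    // All properties: one row per (key, value) pair.
    sqlContext.sql("SHOW TBLPROPERTIES parquet_tab2").show()

    // A single property: one row with only the "value" column.
    sqlContext.sql("SHOW TBLPROPERTIES parquet_tab2('prop1Key')").show()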





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58314666
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,54 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table. If propertyKey is specified, the value
+ * for the propertyKey is returned. If propertyKey is not specified, all the keys and their
+ * corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
--- End diff --

MINOR/NIT: This is still more elaborate than it needs to be. Why not:

    val schema = AttributeReference("value", StringType, nullable = false)() :: Nil
    propertyKey match {
      case None => AttributeReference("key", StringType, nullable = false)() :: schema
      case _ => schema
    }
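
In context, applying that suggestion would reduce the whole member to roughly the following; a sketch only, the merged commit may differ in detail:

    override val output: Seq[Attribute] = {
      val schema = AttributeReference("value", StringType, nullable = false)() :: Nil
      propertyKey match {
        case None => AttributeReference("key", StringType, nullable = false)() :: schema
        case _ => schema
      }
    }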






[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204979320
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204979321
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54805/
Test PASSed.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204979092
  
**[Test build #54805 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54805/consoleFull)** for PR 12133 at commit [`c779c4b`](https://github.com/apache/spark/commit/c779c4b4dae42342e287e15018a488d3ee878310).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58309394
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
--- End diff --

Yeah. Thanks. I think it is better.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204949986
  
**[Test build #54805 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54805/consoleFull)** for PR 12133 at commit [`c779c4b`](https://github.com/apache/spark/commit/c779c4b4dae42342e287e15018a488d3ee878310).





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308812
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
--- End diff --

It's documented [here](https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/interface.scala#L28-L35).

It would be good to also add it to the SessionCatalog.getTable doc. I will add a comment.
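
The note being proposed for SessionCatalog.getTable might read roughly as follows; the wording is hypothetical and is shown only to make the suggestion concrete:

    /**
     * Retrieve the metadata of an existing table. If no database is given, the
     * table is assumed to be in the current database. If the table does not
     * exist, a table-not-found AnalysisException is thrown.
     */
    def getTable(name: TableIdentifier): CatalogTable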





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308802
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
--- End diff --

Yeah you are totally right about that.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308163
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
--- End diff --

I am not sure we can expect all external catalogs to have this behavior. At least I didn't see it in getTable's comment. If it is true, I think it is better to say so.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308134
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
--- End diff --

For example, `DROP TABLE` did not work :-).





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308056
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
--- End diff --

OK.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308055
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
--- End diff --

Just curious, what kind of errors?





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308046
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
+
+propertyKey match {
+  case Some(p) =>
+val errorStr = s"Table ${catalogTable.qualifiedName} does not have 
property: $p"
+val propValue = catalogTable
+  .properties
+  .getOrElse(p, throw new AnalysisException(errorStr))
--- End diff --

There is not a syntactic/semantic/logical problem here. Let's do what you suggest and return a row containing a property-not-found message.
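
Concretely, the propertyKey branch would then look roughly like this; a sketch that reuses the message string from the quoted diff, while the None branch is an assumption about the part of the patch not quoted here:

    propertyKey match {
      case Some(p) =>
        // Return the value, or a property-not-found message, as a single row.
        val propValue = catalogTable.properties.getOrElse(
          p, s"Table ${catalogTable.qualifiedName} does not have property: $p")
        Seq(Row(propValue))
      case None =>
        // Return every (key, value) pair of the table.
        catalogTable.properties.map { case (k, v) => Row(k, v) }.toSeq
    }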





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58308017
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
--- End diff --

Yeah, you are right. We could also return an empty seq.
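
For illustration, the empty-result alternative would look roughly like this; a sketch only, with the propertyKey branch elided, and the thread does not record whether the final patch adopted it:

    override def run(sqlContext: SQLContext): Seq[Row] = {
      val catalog = sqlContext.sessionState.catalog
      if (catalog.isTemporaryTable(table)) {
        // Temporary tables carry no persisted properties, so report none.
        Seq.empty[Row]
      } else {
        val catalogTable = catalog.getTable(table)
        catalogTable.properties.map { case (k, v) => Row(k, v) }.toSeq
      }
    }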





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307650
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
--- End diff --

@hvanhovell Herman, so a session temp table cache entry holds a logical plan (df.registerTempTable). I didn't know how to get the table properties out of it.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307599
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
--- End diff --

Simon, you mean we should call tableExists first before calling getTable? getTable already throws a TableNotFound exception if the table is not found.
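
For context, the two call patterns being weighed, as hedged sketches; SessionCatalog.tableExists is assumed to take a TableIdentifier and return a Boolean:

    // Check-then-fetch, as the question suggests:
    if (!catalog.tableExists(table)) {
      throw new AnalysisException(s"Table $table not found")
    }
    val catalogTable = catalog.getTable(table)

    // Or rely on getTable itself, which already fails with a
    // table-not-found AnalysisException when the table is absent:
    val catalogTable2 = catalog.getTable(table)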





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307554
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
--- End diff --

Yeah... I actually started writing the tests in sql/core, but ran into errors, which is why I moved them back to hive.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307532
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
+
+propertyKey match {
+  case Some(p) =>
+val errorStr = s"Table ${catalogTable.qualifiedName} does not have 
property: $p"
+val propValue = catalogTable
+  .properties
+  .getOrElse(p, throw new AnalysisException(errorStr))
--- End diff --

Yeah, I did think about it. In fact, Hive does not throw an error in this case. Would you prefer the resulting column to carry the exception string instead?





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307516
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for 
temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
+
+propertyKey match {
+  case Some(p) =>
+val errorStr = s"Table ${catalogTable.qualifiedName} does not have 
property: $p"
--- End diff --

OK. I did this to make the code more readable. I will change it.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307464
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
--- End diff --

Sure. Will make the change.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307452
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -238,16 +238,20 @@ class SparkSqlAstBuilder extends AstBuilder {
 ctx.tableProperty.asScala.map { property =>
   // A key can either be a String or a collection of dot separated 
elements. We need to treat
   // these differently.
-  val key = if (property.key.STRING != null) {
-string(property.key.STRING)
-  } else {
-property.key.getText
-  }
+  val key = visitTablePropertyKey(property.key)
   val value = Option(property.value).map(string).orNull
   key -> value
 }.toMap
   }
 
+  override def visitTablePropertyKey(key: TablePropertyKeyContext): String 
= {
--- End diff --

Sure. Will do.
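
The body of the new helper is cut off in the quoted diff above; a plausible sketch that simply factors out the inlined logic removed in that hunk, shown as an illustration rather than the merged code:

    override def visitTablePropertyKey(key: TablePropertyKeyContext): String = {
      if (key.STRING != null) {
        // Quoted keys are unescaped like ordinary string literals.
        string(key.STRING)
      } else {
        // Dotted identifier keys are taken verbatim from the source text.
        key.getText
      }
    }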





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307447
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -93,6 +93,23 @@ class SparkSqlAstBuilder extends AstBuilder {
   }
 
   /**
+   * A command for users to list the properties for a table.
--- End diff --

Will change it. Thanks!





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307367
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
--- End diff --

Shouldn't we test most of this stuff in `sql/core` or is this hard to do 
because of the current lack of `CREATE TABLE` support?





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307357
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
+   protected override def beforeAll(): Unit = {
+super.beforeAll()
+sql(
+  """
+|CREATE EXTERNAL TABLE parquet_tab1 (c1 INT, c2 STRING)
+|USING org.apache.spark.sql.parquet.DefaultSource
+  """.stripMargin)
+
+ sql(
+  """
+|CREATE EXTERNAL TABLE parquet_tab2 (c1 INT, c2 STRING)
+|STORED AS PARQUET
+|TBLPROPERTIES('prop1Key'="prop1Val", '`prop2Key`'="prop2Val")
+  """.stripMargin)
+  }
+
+  override protected def afterAll(): Unit = {
+try {
+  sql("DROP TABLE IF EXISTS parquet_tab1")
+  sql("DROP TABLE IF EXISTS parquet_tab2")
+} finally {
+  super.afterAll()
+}
+  }
+
+  test("show tables") {
--- End diff --

Nevermind - I got it.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307336
  
--- Diff: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
 ---
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.hive.execution
+
+import org.apache.spark.sql.{AnalysisException, QueryTest, Row}
+import org.apache.spark.sql.hive.test.TestHiveSingleton
+import org.apache.spark.sql.test.SQLTestUtils
+
+class HiveCommandSuite extends QueryTest with SQLTestUtils with 
TestHiveSingleton {
+   protected override def beforeAll(): Unit = {
+super.beforeAll()
+sql(
+  """
+|CREATE EXTERNAL TABLE parquet_tab1 (c1 INT, c2 STRING)
+|USING org.apache.spark.sql.parquet.DefaultSource
+  """.stripMargin)
+
+ sql(
+  """
+|CREATE EXTERNAL TABLE parquet_tab2 (c1 INT, c2 STRING)
+|STORED AS PARQUET
+|TBLPROPERTIES('prop1Key'="prop1Val", '`prop2Key`'="prop2Val")
+  """.stripMargin)
+  }
+
+  override protected def afterAll(): Unit = {
+try {
+  sql("DROP TABLE IF EXISTS parquet_tab1")
+  sql("DROP TABLE IF EXISTS parquet_tab2")
+} finally {
+  super.afterAll()
+}
+  }
+
+  test("show tables") {
--- End diff --

Why is this test here? You are testing `SHOW TABLES` commands.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204923676
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204923679
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54798/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204923627
  
**[Test build #54798 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54798/consoleFull)**
 for PR 12133 at commit 
[`8ab51ea`](https://github.com/apache/spark/commit/8ab51ea3152b945f84b59f36013931da0f78f5fb).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307310
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
+
+propertyKey match {
+  case Some(p) =>
+val errorStr = s"Table ${catalogTable.qualifiedName} does not have 
property: $p"
+val propValue = catalogTable
+  .properties
+  .getOrElse(p, throw new AnalysisException(errorStr))
--- End diff --

Do we want to throw an exception here? It seems harsh. We could also just 
return a row with the error in it.
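
For illustration, a minimal sketch of that alternative, reusing the `catalogTable` and `propertyKey` names from the diff above (a sketch only, not necessarily the code that was merged):

```scala
// Hypothetical sketch: report a missing key as a result row instead of throwing.
propertyKey match {
  case Some(p) =>
    catalogTable.properties.get(p) match {
      case Some(value) => Seq(Row(value))
      case None =>
        Seq(Row(s"Table ${catalogTable.qualifiedName} does not have property: $p"))
    }
  case None =>
    catalogTable.properties.map { case (k, v) => Row(k, v) }.toSeq
}
```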


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307277
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
+
+propertyKey match {
+  case Some(p) =>
+val errorStr = s"Table ${catalogTable.qualifiedName} does not have 
property: $p"
--- End diff --

This is a tiny bit more expensive than it needs to be, since we are 
creating the error message before we even know things will go wrong.
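
For illustration, one hedged way to address this (same names as the diff above): because `getOrElse` takes its default argument by name, the message can be built inside the call so it is only evaluated on the failure path:

```scala
// Hypothetical sketch: the error string is only constructed if the key is missing,
// since getOrElse evaluates its default argument lazily (by name).
val propValue = catalogTable.properties.getOrElse(p,
  throw new AnalysisException(
    s"Table ${catalogTable.qualifiedName} does not have property: $p"))
```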


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307261
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for temporary tables")
--- End diff --

Why not? Can't temp tables have properties?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307250
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
--- End diff --

I think this is easier to understand by doing a pattern match on `propertyKey`.
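
For illustration, a minimal sketch of the pattern-match form, using the same attribute names as the diff above (a sketch, not necessarily the merged wording):

```scala
// Hypothetical sketch: derive the output schema with a pattern match on propertyKey.
override val output: Seq[Attribute] = propertyKey match {
  case Some(_) =>
    AttributeReference("value", StringType, nullable = false)() :: Nil
  case None =>
    AttributeReference("key", StringType, nullable = false)() ::
      AttributeReference("value", StringType, nullable = false)() :: Nil
}
```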


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307226
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -93,6 +93,23 @@ class SparkSqlAstBuilder extends AstBuilder {
   }
 
   /**
+   * A command for users to list the properties for a table.
--- End diff --

NIT: width 60 instead of 100?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307196
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -238,16 +238,20 @@ class SparkSqlAstBuilder extends AstBuilder {
 ctx.tableProperty.asScala.map { property =>
   // A key can either be a String or a collection of dot separated elements. We need to treat
   // these differently.
-  val key = if (property.key.STRING != null) {
-string(property.key.STRING)
-  } else {
-property.key.getText
-  }
+  val key = visitTablePropertyKey(property.key)
   val value = Option(property.value).map(string).orNull
   key -> value
 }.toMap
   }
 
+  override def visitTablePropertyKey(key: TablePropertyKeyContext): String = {
--- End diff --

Could you add a little bit of documentation for this method? The comment on 
line 239 is a good starting point.
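
For illustration, a hedged sketch of the method with a doc comment along those lines (the wording mirrors the comment quoted in the diff; it is not necessarily what was merged):

```scala
/**
 * A table property key can either be a quoted STRING or a collection of
 * dot-separated identifiers, and the two cases need to be treated differently.
 * Returns the unquoted STRING value when present, otherwise the raw text of
 * the identifier parts.
 */
override def visitTablePropertyKey(key: TablePropertyKeyContext): String = {
  if (key.STRING != null) {
    string(key.STRING)
  } else {
    key.getText
  }
}
```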


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58307086
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/command/commands.scala 
---
@@ -374,6 +374,52 @@ case class ShowDatabasesCommand(databasePattern: 
Option[String]) extends Runnabl
 }
 
 /**
+ * A command for users to list the properties for a table.
+ * If propertyKey is specified, the value for the propertyKey
+ * is returned. If propertyKey is not specified, all the keys
+ * and their corresponding values are returned.
+ * The syntax of using this command in SQL is:
+ * {{{
+ *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+ * }}}
+ */
+case class ShowTablePropertiesCommand(
+table: TableIdentifier,
+propertyKey: Option[String]) extends RunnableCommand {
+
+  override val output: Seq[Attribute] = {
+val withKeySchema: Seq[Attribute] = {
+  AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+val noKeySchema: Seq[Attribute] = {
+  AttributeReference("key", StringType, nullable = false)() ::
+AttributeReference("value", StringType, nullable = false)() :: Nil
+}
+propertyKey.map(p => withKeySchema).getOrElse(noKeySchema)
+  }
+
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+val catalog = sqlContext.sessionState.catalog
+
+if (catalog.isTemporaryTable(table)) {
+  throw new AnalysisException("This operation is unsupported for temporary tables")
+}
+val catalogTable = sqlContext.sessionState.catalog.getTable(table)
--- End diff --

I'd use `tableExists` to check if the table exists. If not, throw an 
analysis exception.
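
For illustration, a minimal sketch of that check (the exact message and the use of `quotedString` are assumptions):

```scala
// Hypothetical sketch: fail fast with an AnalysisException if the table is missing.
if (!catalog.tableExists(table)) {
  throw new AnalysisException(s"Table ${table.quotedString} does not exist")
}
val catalogTable = catalog.getTable(table)
```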


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-03 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204898219
  
**[Test build #54798 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54798/consoleFull)**
 for PR 12133 at commit 
[`8ab51ea`](https://github.com/apache/spark/commit/8ab51ea3152b945f84b59f36013931da0f78f5fb).


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204860955
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54789/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204860953
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204860912
  
**[Test build #54789 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54789/consoleFull)**
 for PR 12133 at commit 
[`6b7bf59`](https://github.com/apache/spark/commit/6b7bf59ab6164a6d9021aeb318baa13fb0d32612).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204847481
  
**[Test build #54789 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54789/consoleFull)**
 for PR 12133 at commit 
[`6b7bf59`](https://github.com/apache/spark/commit/6b7bf59ab6164a6d9021aeb318baa13fb0d32612).


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204813187
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204813188
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54783/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204812704
  
**[Test build #54783 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54783/consoleFull)**
 for PR 12133 at commit 
[`386f492`](https://github.com/apache/spark/commit/386f492533199a4ed35d873d24438a9e83299160).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds the following public classes _(experimental)_:
  * `case class ShowTablePropertiesCommand(`


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58300131
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -93,6 +93,23 @@ class SparkSqlAstBuilder extends AstBuilder {
   }
 
   /**
+   * A command for users to list the properties for a table.
+   * If propertyKey is specified, the value for the propertyKey
+   * is returned. If propertyKey is not specified, all the keys
+   * and their corresponding values are returned.
+   * The syntax of using this command in SQL is:
+   * {{{
+   *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+   * }}}
+   */
+  override def visitShowTblProperties(
+  ctx: ShowTblPropertiesContext): LogicalPlan = withOrigin(ctx) {
+ShowTablePropertiesCommand(
+  visitTableIdentifier(ctx.tableIdentifier),
+  Option(ctx.key).map(_.STRING).map(string))
--- End diff --

Oh.. missed it. Thank you Herman. I will fix as per your suggestion.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58299861
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
@@ -93,6 +93,23 @@ class SparkSqlAstBuilder extends AstBuilder {
   }
 
   /**
+   * A command for users to list the properties for a table.
+   * If propertyKey is specified, the value for the propertyKey
+   * is returned. If propertyKey is not specified, all the keys
+   * and their corresponding values are returned.
+   * The syntax of using this command in SQL is:
+   * {{{
+   *   SHOW TBLPROPERTIES table_name[('propertyKey')];
+   * }}}
+   */
+  override def visitShowTblProperties(
+  ctx: ShowTblPropertiesContext): LogicalPlan = withOrigin(ctx) {
+ShowTablePropertiesCommand(
+  visitTableIdentifier(ctx.tableIdentifier),
+  Option(ctx.key).map(_.STRING).map(string))
--- End diff --

This will only work for STRING keys, whereas a property key can also be an identifier. See: 
https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala#L223-L229

Let's move parsing a `tablePropertyKey` into a separate method (override 
`visitTablePropertyKey`) and use it for both cases.
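
For illustration, a hedged sketch of how the visitor could reuse such a shared helper (names follow the diffs in this thread; this is a sketch, not the exact merged code):

```scala
// Hypothetical sketch: reuse a shared visitTablePropertyKey override so that
// SHOW TBLPROPERTIES accepts both quoted STRING keys and dot-separated
// identifier keys, just like the TBLPROPERTIES clause of CREATE TABLE.
override def visitShowTblProperties(
    ctx: ShowTblPropertiesContext): LogicalPlan = withOrigin(ctx) {
  ShowTablePropertiesCommand(
    visitTableIdentifier(ctx.tableIdentifier),
    Option(ctx.key).map(visitTablePropertyKey))
}
```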


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread dilipbiswal
Github user dilipbiswal commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58299837
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
 ---
@@ -272,6 +272,22 @@ class SessionCatalog(
   }
 
   /**
+   * Return whether a table with the specified name is a temporary table.
+   *
+   * Note: The temporary table cache is checked only when database is not
+   * explicitly specified.
+   */
+  def isTemporaryTable(name: TableIdentifier): Boolean = {
+val db = name.database.getOrElse(currentDb)
+val table = formatTableName(name.table)
+if (!name.database.isDefined && tempTables.contains(table)) {
--- End diff --

Thanks.. will make the change.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread gatorsmile
Github user gatorsmile commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58299804
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
 ---
@@ -272,6 +272,22 @@ class SessionCatalog(
   }
 
   /**
+   * Return whether a table with the specified name is a temporary table.
+   *
+   * Note: The temporary table cache is checked only when database is not
+   * explicitly specified.
+   */
+  def isTemporaryTable(name: TableIdentifier): Boolean = {
+val db = name.database.getOrElse(currentDb)
+val table = formatTableName(name.table)
+if (!name.database.isDefined && tempTables.contains(table)) {
--- End diff --

yeah, please do `!name.database.isDefined && tempTables.contains(table)`
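
For illustration, the simplified method might then read as follows (a sketch, not necessarily the merged wording):

```scala
// Hypothetical sketch: the temporary-table check collapses to one expression.
def isTemporaryTable(name: TableIdentifier): Boolean = {
  !name.database.isDefined && tempTables.contains(formatTableName(name.table))
}
```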


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/12133#discussion_r58299767
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala
 ---
@@ -272,6 +272,22 @@ class SessionCatalog(
   }
 
   /**
+   * Return whether a table with the specified name is a temporary table.
+   *
+   * Note: The temporary table cache is checked only when database is not
+   * explicitly specified.
+   */
+  def isTemporaryTable(name: TableIdentifier): Boolean = {
+val db = name.database.getOrElse(currentDb)
+val table = formatTableName(name.table)
+if (!name.database.isDefined && tempTables.contains(table)) {
--- End diff --

The `if/else` here is trivial; it can be collapsed into a single boolean expression.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12133#issuecomment-204802313
  
**[Test build #54783 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54783/consoleFull)**
 for PR 12133 at commit 
[`386f492`](https://github.com/apache/spark/commit/386f492533199a4ed35d873d24438a9e83299160).


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14348][SQL] Support native execution of...

2016-04-02 Thread dilipbiswal
GitHub user dilipbiswal opened a pull request:

https://github.com/apache/spark/pull/12133

[SPARK-14348][SQL] Support native execution of SHOW TBLPROPERTIES command

## What changes were proposed in this pull request?

This PR adds native execution of the SHOW TBLPROPERTIES command.

Command Syntax:
``` SQL
SHOW TBLPROPERTIES table_name[(property_key_literal)]
```
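
For illustration, hypothetical usage from a `SQLContext` might look like this (the table name and property key below are placeholders, not part of the PR):

```scala
// Hypothetical usage sketch; table and key names are placeholders.
sqlContext.sql("SHOW TBLPROPERTIES some_table").show()              // all key/value pairs
sqlContext.sql("SHOW TBLPROPERTIES some_table('some.key')").show()  // value for one key
```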
## How was this patch tested?

Tests added in HiveCommandSuite and DDLCommandSuite.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/dilipbiswal/spark dkb_show_tblproperties

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/12133.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #12133


commit 386f492533199a4ed35d873d24438a9e83299160
Author: Dilip Biswal 
Date:   2016-04-01T06:56:09Z

[SPARK-14348] Support native execution of SHOW TBLPROPERTIES command




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org