[GitHub] spark pull request #18994: [SPARK-21784][SQL] Adds support for defining info...

2017-08-24 Thread sureshthalamati
Github user sureshthalamati commented on a diff in the pull request:

https://github.com/apache/spark/pull/18994#discussion_r134944522
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1214,6 +1246,11 @@ object HiveExternalCatalog {
 
   val CREATED_SPARK_VERSION = SPARK_SQL_PREFIX + "create.version"
 
+  val TABLE_CONSTRAINT_PREFIX = SPARK_SQL_PREFIX + "constraint."
+  val TABLE_CONSTRAINT_PRIMARY_KEY = SPARK_SQL_PREFIX + TABLE_CONSTRAINT_PREFIX + "pk"
+  val TABLE_NUM_FK_CONSTRAINTS = SPARK_SQL_PREFIX + "numFkConstraints"
+  val TABLE_CONSTRAINT_FOREIGNKEY_PREFIX = SPARK_SQL_PREFIX + TABLE_CONSTRAINT_PREFIX + "fk."
--- End diff --

Good catch. I will fix it. 


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #18994: [SPARK-21784][SQL] Adds support for defining info...

2017-08-24 Thread sureshthalamati
Github user sureshthalamati commented on a diff in the pull request:

https://github.com/apache/spark/pull/18994#discussion_r134944438
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/TableConstraints.scala ---
@@ -0,0 +1,323 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.catalog
+
+import java.util.UUID
+
+import org.json4s._
+import org.json4s.JsonAST.JValue
+import org.json4s.JsonDSL._
+import org.json4s.jackson.JsonMethods._
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.analysis.Resolver
+import org.apache.spark.sql.types._
+import org.apache.spark.sql.util.SchemaUtils
+
+/**
+ * A container class to hold all the constraints defined on a table.
+ * Constraint names are scoped at the table level.
+ */
+case class TableConstraints(
+    primaryKey: Option[PrimaryKey] = None,
+    foreignKeys: Seq[ForeignKey] = Seq.empty) {
+
+  /**
+   * Adds the given constraint to the existing table constraints, after verifying that
+   * the constraint name is not a duplicate.
+   */
+  def addConstraint(constraint: TableConstraint, resolver: Resolver): TableConstraints = {
+    if ((primaryKey.exists(pk => resolver(pk.constraintName, constraint.constraintName))
+      || foreignKeys.exists(fk => resolver(fk.constraintName, constraint.constraintName)))) {
+      throw new AnalysisException(
+        s"Failed to add constraint, duplicate constraint name '${constraint.constraintName}'")
+    }
+    constraint match {
+      case pk: PrimaryKey =>
+        if (primaryKey.nonEmpty) {
+          throw new AnalysisException(
+            s"Primary key '${primaryKey.get.constraintName}' already exists.")
+        }
+        this.copy(primaryKey = Option(pk))
+      case fk: ForeignKey => this.copy(foreignKeys = foreignKeys :+ fk)
+    }
+  }
+}
+
+object TableConstraints {
+  /**
+   * Returns a [[TableConstraints]] containing [[PrimaryKey]] or [[ForeignKey]].
+   */
+  def apply(tableConstraint: TableConstraint): TableConstraints = {
+    tableConstraint match {
+      case pk: PrimaryKey => TableConstraints(primaryKey = Option(pk))
+      case fk: ForeignKey => TableConstraints(foreignKeys = Seq(fk))
+    }
+  }
+
+  /**
+   * Converts constraints represented as JSON strings to [[TableConstraints]].
+   */
+  def fromJson(pkJson: Option[String], fksJson: Seq[String]): TableConstraints = {
+    val pk = pkJson.map(pk => PrimaryKey.fromJson(parse(pk)))
+    val fks = fksJson.map(fk => ForeignKey.fromJson(parse(fk)))
+    TableConstraints(pk, fks)
+  }
+}
+
+/**
+ * Common type representing a table constraint.
+ */
+sealed trait TableConstraint {
+  val constraintName: String
+  val keyColumnNames: Seq[String]
+}
+
+object TableConstraint {
+  private[TableConstraint] val curId = new java.util.concurrent.atomic.AtomicLong(0L)
+  private[TableConstraint] val jvmId = UUID.randomUUID()
+
+  /**
+   * Generates a unique constraint name to use when adding table constraints,
+   * if the user does not specify a name. The `curId` field is unique within a given JVM,
+   * while the `jvmId` is used to uniquely identify JVMs.
+   */
+  def generateConstraintName(constraintType: String = "constraint"): String = {
+    s"${constraintType}_${jvmId}_${curId.getAndIncrement()}"
+  }
+
+  def parseColumn(json: JValue): String = json match {
+    case JString(name) => name
+    case _ => json.toString
+  }
+
+  object JSortedObject {
+    def unapplySeq(value: JValue): Option[List[(String, JValue)]] = value match {
+      case JObject(seq) => Some(seq.toList.sortBy(_._1))
+      case _ => None
+    }
+  }
+
+  /**
+   * Returns 
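The `JSortedObject` extractor in the quoted diff matches any JSON object and yields its fields sorted by key, so pattern matches on the stored constraint JSON do not depend on field order. A minimal standalone sketch of the same pattern (an illustration, not part of the patch):

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods._

object JSortedObjectDemo extends App {
  // Same shape as the extractor in the diff: match any JSON object and
  // return its fields sorted by key.
  object JSortedObject {
    def unapplySeq(value: JValue): Option[List[(String, JValue)]] = value match {
      case JObject(seq) => Some(seq.toList.sortBy(_._1))
      case _ => None
    }
  }

  // Field order in the input does not matter; the extractor sorts by key.
  parse("""{"b": 1, "a": 2}""") match {
    case JSortedObject(("a", JInt(a)), ("b", JInt(b))) => println(s"a=$a, b=$b")
    case _ => println("no match")
  }
  // prints: a=2, b=1
}
```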

[GitHub] spark pull request #18994: [SPARK-21784][SQL] Adds support for defining info...

2017-08-20 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/18994#discussion_r134145171
  
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1214,6 +1246,11 @@ object HiveExternalCatalog {
 
   val CREATED_SPARK_VERSION = SPARK_SQL_PREFIX + "create.version"
 
+  val TABLE_CONSTRAINT_PREFIX = SPARK_SQL_PREFIX + "constraint."
+  val TABLE_CONSTRAINT_PRIMARY_KEY = SPARK_SQL_PREFIX + TABLE_CONSTRAINT_PREFIX + "pk"
+  val TABLE_NUM_FK_CONSTRAINTS = SPARK_SQL_PREFIX + "numFkConstraints"
+  val TABLE_CONSTRAINT_FOREIGNKEY_PREFIX = SPARK_SQL_PREFIX + TABLE_CONSTRAINT_PREFIX + "fk."
--- End diff --

`SPARK_SQL_PREFIX` is duplicated in `TABLE_CONSTRAINT_PRIMARY_KEY` and `TABLE_CONSTRAINT_FOREIGNKEY_PREFIX`, because `TABLE_CONSTRAINT_PREFIX` already starts with it.

E.g., `TABLE_CONSTRAINT_PRIMARY_KEY` expands to `SPARK_SQL_PREFIX` + `SPARK_SQL_PREFIX` + "constraint." + "pk".
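A minimal sketch of the presumable fix (an assumption, not the committed change) is to drop the extra `SPARK_SQL_PREFIX`, so the primary-key property key comes out as `spark.sql.constraint.pk`:

```scala
val SPARK_SQL_PREFIX = "spark.sql."  // as defined in HiveExternalCatalog

// TABLE_CONSTRAINT_PREFIX already starts with SPARK_SQL_PREFIX,
// so the derived keys should not prepend it again.
val TABLE_CONSTRAINT_PREFIX = SPARK_SQL_PREFIX + "constraint."
val TABLE_CONSTRAINT_PRIMARY_KEY = TABLE_CONSTRAINT_PREFIX + "pk"
val TABLE_NUM_FK_CONSTRAINTS = SPARK_SQL_PREFIX + "numFkConstraints"
val TABLE_CONSTRAINT_FOREIGNKEY_PREFIX = TABLE_CONSTRAINT_PREFIX + "fk."
```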


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #18994: [SPARK-21784][SQL] Adds support for defining info...

2017-08-20 Thread viirya
Github user viirya commented on a diff in the pull request:

https://github.com/apache/spark/pull/18994#discussion_r134144063
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/TableConstraints.scala ---
@@ -0,0 +1,323 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.catalog
+
+import java.util.UUID
+
+import org.json4s._
+import org.json4s.JsonAST.JValue
+import org.json4s.JsonDSL._
+import org.json4s.jackson.JsonMethods._
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.analysis.Resolver
+import org.apache.spark.sql.types._
+import org.apache.spark.sql.util.SchemaUtils
+
+/**
+ * A container class to hold all the constraints defined on a table.
+ * Constraint names are scoped at the table level.
+ */
+case class TableConstraints(
+    primaryKey: Option[PrimaryKey] = None,
+    foreignKeys: Seq[ForeignKey] = Seq.empty) {
+
+  /**
+   * Adds the given constraint to the existing table constraints, after verifying that
+   * the constraint name is not a duplicate.
+   */
+  def addConstraint(constraint: TableConstraint, resolver: Resolver): TableConstraints = {
+    if ((primaryKey.exists(pk => resolver(pk.constraintName, constraint.constraintName))
+      || foreignKeys.exists(fk => resolver(fk.constraintName, constraint.constraintName)))) {
+      throw new AnalysisException(
+        s"Failed to add constraint, duplicate constraint name '${constraint.constraintName}'")
+    }
+    constraint match {
+      case pk: PrimaryKey =>
+        if (primaryKey.nonEmpty) {
+          throw new AnalysisException(
+            s"Primary key '${primaryKey.get.constraintName}' already exists.")
+        }
+        this.copy(primaryKey = Option(pk))
+      case fk: ForeignKey => this.copy(foreignKeys = foreignKeys :+ fk)
+    }
+  }
+}
+
+object TableConstraints {
+  /**
+   * Returns a [[TableConstraints]] containing [[PrimaryKey]] or [[ForeignKey]].
+   */
+  def apply(tableConstraint: TableConstraint): TableConstraints = {
+    tableConstraint match {
+      case pk: PrimaryKey => TableConstraints(primaryKey = Option(pk))
+      case fk: ForeignKey => TableConstraints(foreignKeys = Seq(fk))
+    }
+  }
+
+  /**
+   * Converts constraints represented as JSON strings to [[TableConstraints]].
+   */
+  def fromJson(pkJson: Option[String], fksJson: Seq[String]): TableConstraints = {
+    val pk = pkJson.map(pk => PrimaryKey.fromJson(parse(pk)))
+    val fks = fksJson.map(fk => ForeignKey.fromJson(parse(fk)))
+    TableConstraints(pk, fks)
+  }
+}
+
+/**
+ * Common type representing a table constraint.
+ */
+sealed trait TableConstraint {
+  val constraintName: String
+  val keyColumnNames: Seq[String]
+}
+
+object TableConstraint {
+  private[TableConstraint] val curId = new java.util.concurrent.atomic.AtomicLong(0L)
+  private[TableConstraint] val jvmId = UUID.randomUUID()
+
+  /**
+   * Generates a unique constraint name to use when adding table constraints,
+   * if the user does not specify a name. The `curId` field is unique within a given JVM,
+   * while the `jvmId` is used to uniquely identify JVMs.
+   */
+  def generateConstraintName(constraintType: String = "constraint"): String = {
+    s"${constraintType}_${jvmId}_${curId.getAndIncrement()}"
+  }
+
+  def parseColumn(json: JValue): String = json match {
+    case JString(name) => name
+    case _ => json.toString
+  }
+
+  object JSortedObject {
+    def unapplySeq(value: JValue): Option[List[(String, JValue)]] = value match {
+      case JObject(seq) => Some(seq.toList.sortBy(_._1))
+      case _ => None
+    }
+  }
+
+  /**
+   * Returns [[StructField]] 

[GitHub] spark pull request #18994: [SPARK-21784][SQL] Adds support for defining info...

2017-08-18 Thread sureshthalamati
GitHub user sureshthalamati opened a pull request:

https://github.com/apache/spark/pull/18994

[SPARK-21784][SQL] Adds support for defining informational primary key and foreign key constraints using ALTER TABLE DDL.

## What changes were proposed in this pull request?
This PR implements the ALTER TABLE ... ADD CONSTRAINT DDL to add informational primary key and foreign key (referential integrity) constraints in Spark. These constraints will be used in query optimization; you can find more details in the spec attached to [SPARK-19842](https://issues.apache.org/jira/browse/SPARK-19842).
 
The proposed syntax of the constraint DDL is similar to the referential integrity constraint support in Hive 2.1 (https://issues.apache.org/jira/browse/HIVE-13076), which is aligned with Oracle's semantics.
 
Syntax:
```sql
ALTER TABLE [db_name.]table_name ADD [CONSTRAINT constraintName]
  (PRIMARY KEY (col_names) |
  FOREIGN KEY (col_names) REFERENCES [db_name.]table_name [(col_names)])
  [VALIDATE | NOVALIDATE] [RELY | NORELY]
```
Examples:
```sql
ALTER TABLE employee ADD CONSTRAINT pk PRIMARY KEY (empno) VALIDATE RELY;
ALTER TABLE department ADD CONSTRAINT emp_fk FOREIGN KEY (mgrno) REFERENCES employee (empno) NOVALIDATE NORELY;
ALTER TABLE department ADD PRIMARY KEY (deptno) VALIDATE RELY;
ALTER TABLE employee ADD FOREIGN KEY (workdept) REFERENCES department (deptno) VALIDATE RELY;
```
The constraint information is stored in the table properties as a JSON string per constraint. One advantage of storing constraints in the table properties is that the functionality works with all supported Hive metastore versions.
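As an illustration, a minimal json4s sketch of how one constraint might be serialized for storage in the table properties (the JSON field names and property keys here are assumptions inferred from the diffs above, not the patch's exact layout):

```scala
import org.json4s.JsonDSL._
import org.json4s.jackson.JsonMethods._

// Hypothetical JSON shape for the `pk` example above; the patch's actual
// field layout may differ.
val pkJson: String = compact(render(
  ("constraintName" -> "pk") ~
  ("keyColumnNames" -> Seq("empno")) ~
  ("validate" -> true) ~
  ("rely" -> true)))

// Each constraint's JSON string would then be stored as a table property:
// presumably the primary key under a key like "spark.sql.constraint.pk",
// each foreign key under "spark.sql.constraint.fk.<n>", and the count under
// "spark.sql.numFkConstraints" (inferred from the HiveExternalCatalog diff).
println(pkJson)
// {"constraintName":"pk","keyColumnNames":["empno"],"validate":true,"rely":true}
```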
 
An alternative approach we considered was to store the constraint information using the Hive metastore API, which keeps constraints in a separate metastore table. The problem with that approach is that the feature would only work in Spark installations that use the **Hive 2.1 metastore**, which is not the current Spark default. More details are in the spec document.
 
**This PR implements the ALTER TABLE constraint DDL using table properties because it is important that the feature work with Spark's default Hive metastore version.**
 
The syntax to define constraints as part of the _create table_ definition will be implemented in a follow-up JIRA.


## How was this patch tested?
Added  new unit test cases to HiveDDLSuite, and SparkSqlParserSuite


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sureshthalamati/spark alter_add_pk_fk_SPARK-21784

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/18994.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #18994


commit 4839e8419ca7360f0feafeceec8f3832102e3dba
Author: sureshthalamati 
Date:   2017-08-18T08:39:12Z

[SPARK-21784][SQL] Adds ALTER TABLE ADD CONSTRAINT DDL support to allow users to define informational primary key and foreign key constraints on a table.



