[GitHub] spark pull request: [SPARK-13420][SQL] Rename Subquery logical pla...

2016-02-21 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/11288





[GitHub] spark pull request: [SPARK-13420][SQL] Rename Subquery logical pla...

2016-02-21 Thread rxin
Github user rxin commented on the pull request:

https://github.com/apache/spark/pull/11288#issuecomment-186895449
  
Thanks - merging in master.






[GitHub] spark pull request: [MINOR][DOCS] Fix typos in `NOTICE`, `configur...

2016-02-21 Thread dongjoon-hyun
Github user dongjoon-hyun commented on the pull request:

https://github.com/apache/spark/pull/11289#issuecomment-186894117
  
Oh, thank you, @srowen .





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread xguo27
Github user xguo27 commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186893918
  
@hvanhovell In the hashSemiJoin() function, when the condition is empty, the boundCondition always evaluates to true here:


https://github.com/apache/spark/blob/8f744fe3d931c2380613b8e5bafa1bb1fd292839/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashSemiJoin.scala#L42-L43

so the exists{...} part of these lines is a no-op:


https://github.com/apache/spark/blob/8f744fe3d931c2380613b8e5bafa1bb1fd292839/sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashSemiJoin.scala#L87-L89
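
A minimal standalone sketch of the pattern being described (toy names, not the actual HashSemiJoin code): when the optional condition is absent, the bound predicate degenerates to constant true, so the exists check succeeds for any row with a key match.

object BoundConditionSketch {
  type Row = Seq[Any]

  // Mirrors the idea of condition.map(bind).getOrElse((_: Row) => true):
  // an absent condition degenerates to a constant-true predicate.
  def boundCondition(condition: Option[Row => Boolean]): Row => Boolean =
    condition.getOrElse((_: Row) => true)

  def main(args: Array[String]): Unit = {
    val matchedRows: Seq[Row] = Seq(Seq(1, "a"), Seq(2, "b"))
    val pred = boundCondition(None)
    // With condition = None, exists(pred) is true for any non-empty
    // match set, so the extra filtering step is a no-op.
    println(matchedRows.exists(pred)) // true
  }
}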





[GitHub] spark pull request: [SPARK-13420][SQL] Rename Subquery logical pla...

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/11288#issuecomment-186893722
  
LGTM





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186892930
  
**[Test build #2558 has started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/2558/consoleFull)** for PR 11291 at commit [`a84975a`](https://github.com/apache/spark/commit/a84975a5fcee4b59cd144e23cca806970dc58164).





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread rxin
Github user rxin commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53571306
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/SparkQlSuite.scala ---
@@ -0,0 +1,652 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution
+
+import org.apache.spark.sql.catalyst.TableIdentifier
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.plans.PlanTest
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources.BucketSpec
+import org.apache.spark.sql.types._
+
+class SparkQlSuite extends PlanTest {
--- End diff --

why is this named SparkQlSuite?






[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread rxin
Github user rxin commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53571294
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/commands.scala ---
@@ -0,0 +1,533 @@
+/*
--- End diff --

+1





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53571196
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, Option[String]]] = {
+    node match {
+      case Token("TOK_PARTSPEC", partitions) =>
+        val spec = partitions.map {
+          case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+            (unquoteString(cleanIdentifier(ident.text)),
+              Some(unquoteString(cleanIdentifier(constant.text))))
+          case Token("TOK_PARTVAL", ident :: Nil) =>
+            (unquoteString(cleanIdentifier(ident.text)), None)
+        }.toMap
+        Some(spec)
+      case _ => None
+    }
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node match {
+    case Token("TOK_TABLEPROPERTIES", propsList) =>
+      propsList.flatMap {
+        case Token("TOK_TABLEPROPLIST", props) =>
+          props.map {
+            case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) :: Nil) =>
+              val k = unquoteString(cleanIdentifier(key.text))
+              (k, None)
+            case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+              val k = unquoteString(cleanIdentifier(key.text))
+              val v = unquoteString(cleanIdentifier(value.text))
+              (k, Some(v))
+          }
+      }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == "TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+    case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+      val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+      val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+      val partition = partitionSpec.flatMap(parsePartitionSpec)
+      matchAlterTableCommands(v1, rest, tableIdent, partition)
+    case _ =>
+      throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+      node: ASTNode,
+      nodes: Seq[ASTNode],
+      tableIdent: TableIdentifier,
+      partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes match {
+    case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
+      val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
+      val renamedTableIdent: TableIdentifier = base.extractTableIdent(renamedTable)
+      AlterTableRename(tableIdent, renamedTableIdent)(node.source)
+
+    case Token("TOK_ALTERTABLE_PROPERTIES", args) :: rest =>
+      val setTableProperties = extractTableProps(args.head)
+      AlterTableSetProperties(
+        tableIdent,
+        setTableProperties)(node.source)
+
+    case Token("TOK_ALTERTABLE_DROPPROPERTIES", args) :: rest =>
+      val dropTableProperties = extractTableProps(args.head)
+      val allowExisting = 
[The rest of the quoted diff was truncated in the digest.]
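
As a self-contained illustration of the token-matching technique in the diff above, here is a sketch with a toy Token type standing in for Spark's ASTNode (hypothetical and simplified, not Spark's API):

object AlterTableParseSketch {
  case class Token(text: String, children: List[Token])

  // TOK_PARTSPEC(TOK_PARTVAL(ident [, constant]) ...) -> partition spec map,
  // mirroring parsePartitionSpec above.
  def parsePartitionSpec(node: Token): Option[Map[String, Option[String]]] =
    node match {
      case Token("TOK_PARTSPEC", partitions) =>
        Some(partitions.map {
          case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
            ident.text -> Some(constant.text)
          case Token("TOK_PARTVAL", ident :: Nil) =>
            ident.text -> None
        }.toMap)
      case _ => None
    }

  def main(args: Array[String]): Unit = {
    val spec = Token("TOK_PARTSPEC", List(
      Token("TOK_PARTVAL", List(Token("ds", Nil), Token("2016-02-21", Nil))),
      Token("TOK_PARTVAL", List(Token("hr", Nil)))))
    println(parsePartitionSpec(spec))
    // Some(Map(ds -> Some(2016-02-21), hr -> None))
  }
}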

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53571180
  
[Quotes the same diff of sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala as the comment above; the quoted diff and the review remark were truncated in the digest.]

[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186888525
  
I don't think I have enough karma to trigger a build... @rxin could you 
trigger a build?





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186885302
  
@xguo27 Could you also take a look at the `hashSemiJoin(...)` function and see how it treats an empty condition?





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11291#discussion_r53571028
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/joins/BroadcastLeftSemiJoinHash.scala ---
@@ -45,18 +45,15 @@ case class BroadcastLeftSemiJoinHash(
         row.copy()
       }.collect()
 
-    if (condition.isEmpty) {
-      val hashSet = buildKeyHashSet(input.toIterator)
-      val broadcastedRelation = sparkContext.broadcast(hashSet)
+    val hashRelation =
+      HashedRelation(input.toIterator, rightKeyGenerator, input.size)
+    val broadcastedRelation = sparkContext.broadcast(hashRelation)
 
+    if (condition.isEmpty) {
--- End diff --

I think it is safe to remove the entire block.
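
For intuition, a standalone sketch with toy types (not Spark's HashedRelation API) of why the hashed-relation path subsumes the HashSet one: once a relation keyed on the join keys exists, the no-condition semi join is just key containment, so a separate key set adds nothing.

object HashedSemiJoinSketch {
  type Row = (Int, String) // (joinKey, payload) -- toy stand-in

  def main(args: Array[String]): Unit = {
    val buildSide: Seq[Row] = Seq((1, "a"), (2, "b"))
    val streamSide: Seq[Row] = Seq((1, "x"), (3, "y"))

    // "Hashed relation": join key -> matching build-side rows.
    val hashed: Map[Int, Seq[Row]] = buildSide.groupBy(_._1)

    // No condition: the semi join reduces to key containment.
    val noCondition = streamSide.filter(row => hashed.contains(row._1))

    // With a condition: probe the matches, keep rows with any qualifying pair.
    val condition = (s: Row, b: Row) => s._2 != b._2
    val withCondition = streamSide.filter { s =>
      hashed.getOrElse(s._1, Seq.empty).exists(b => condition(s, b))
    }

    println(noCondition)   // List((1,x))
    println(withCondition) // List((1,x))
  }
}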





[GitHub] spark pull request: [SPARK-13423] [WIP] [CORE] [SQL] [STREAMING] S...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11292#issuecomment-186882750
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51642/
Test FAILed.





[GitHub] spark pull request: [SPARK-13423] [WIP] [CORE] [SQL] [STREAMING] S...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11292#issuecomment-186882748
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-13423] [WIP] [CORE] [SQL] [STREAMING] S...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11292#issuecomment-186882701
  
**[Test build #51642 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51642/consoleFull)** for PR 11292 at commit [`adba76c`](https://github.com/apache/spark/commit/adba76cb0e516c7d6fd4127d15458d7a5cea2acf).
 * This patch **fails to build**.
 * This patch merges cleanly.
 * This patch adds the following public classes _(experimental)_:
   * `static class ChainedIterator extends UnsafeSorterIterator`





[GitHub] spark pull request: [SPARK-13423] [WIP] [CORE] [SQL] [STREAMING] S...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11292#issuecomment-186880803
  
**[Test build #51642 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51642/consoleFull)** for PR 11292 at commit [`adba76c`](https://github.com/apache/spark/commit/adba76cb0e516c7d6fd4127d15458d7a5cea2acf).





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread xguo27
Github user xguo27 commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186878372
  
@hvanhovell  I see, sorry for my lack of patience. : )





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186876984
  
It usually takes about 15 minutes before a test is triggered.





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread xguo27
Github user xguo27 commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186876120
  
Looks like the command did not trigger a test?





[GitHub] spark pull request: [SPARK-13117][Web UI] WebUI should use the loc...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11133#issuecomment-186875930
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51639/
Test FAILed.





[GitHub] spark pull request: [SPARK-13117][Web UI] WebUI should use the loc...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11133#issuecomment-186875929
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-13117][Web UI] WebUI should use the loc...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11133#issuecomment-186875823
  
**[Test build #51639 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51639/consoleFull)** for PR 11133 at commit [`26489fc`](https://github.com/apache/spark/commit/26489fc8a3eba53c92f042472b384d2cacfbc4b8).
 * This patch **fails Spark unit tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-13423] [WIP] [CORE] [SQL] [STREAMING] S...

2016-02-21 Thread srowen
GitHub user srowen opened a pull request:

https://github.com/apache/spark/pull/11292

[SPARK-13423] [WIP] [CORE] [SQL] [STREAMING] Static analysis fixes for 2.x

## What changes were proposed in this pull request?

Make some cross-cutting code improvements according to static analysis. 
These are individually up for discussion since they exist in separate commits 
that can be reverted. The changes are broadly:

- Inner class should be static
- Mismatched hashCode/equals
- Overflow in compareTo
- Unchecked warnings
- Misuse of assert, vs junit.assert
- get(a) + getOrElse(b) -> getOrElse(a,b)
- Array/String .size -> .length (occasionally, -> .isEmpty / .nonEmpty) to avoid implicit conversions
- Dead code
- tailrec
- exists(_ == ) -> contains
- find + nonEmpty -> exists
- filter + size -> count
- reduce(_+_) -> sum
- map + flatten -> map

The most controversial may be .size -> .length, simply because of the number of occurrences it touches. It is intended to avoid implicit conversions that might be expensive in some places. A few of these rewrites are sketched below.
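
For illustration, a few of the rewrites as before/after snippets (examples written for this summary, not taken from the patch):

object StaticAnalysisRewrites {
  def main(args: Array[String]): Unit = {
    val xs = Seq(1, 2, 3)
    val m = Map("a" -> 1)
    val arr = Array(1, 2, 3)

    // exists(_ == x) -> contains(x)
    assert(xs.exists(_ == 2) == xs.contains(2))

    // find(p).nonEmpty -> exists(p)
    assert(xs.find(_ > 1).nonEmpty == xs.exists(_ > 1))

    // filter(p).size -> count(p): no intermediate collection
    assert(xs.filter(_ > 1).size == xs.count(_ > 1))

    // reduce(_ + _) -> sum
    assert(xs.reduce(_ + _) == xs.sum)

    // get(a).getOrElse(b) -> getOrElse(a, b): one idiomatic lookup
    assert(m.get("b").getOrElse(0) == m.getOrElse("b", 0))

    // Array .size -> .length: .size goes through an implicit wrapper
    assert(arr.size == arr.length)
  }
}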

## How was this patch tested?

Existing Jenkins unit tests.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/srowen/spark SPARK-13423

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/11292.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #11292


commit e5b7051976469a02660a4172cff8dc2d83baa139
Author: Sean Owen 
Date:   2016-02-20T22:02:26Z

reduce(_+_) -> sum
map + flatten -> map

commit bb39c40bffbbed991d37bb3d6dc568521d715f92
Author: Sean Owen 
Date:   2016-02-20T22:05:22Z

exists(_ == ) -> contains
find + nonEmpty -> exists
filter + size -> count

commit 60377a77e4d0003070ec25a5a35228ae44737665
Author: Sean Owen 
Date:   2016-02-20T22:21:09Z

tailrec

commit b7d5bd0cdc35b304daf8842f15d01a344d854849
Author: Sean Owen 
Date:   2016-02-21T11:35:57Z

Dead code

commit 16bff2e5aab2776496556275a1d249ac9ca571a1
Author: Sean Owen 
Date:   2016-02-21T11:49:25Z

Array/String .size -> .length (occasionally, -> .isEmpty / .nonEmpty) to avoid implicit conversions

commit 7b54fe53ee7e3f64cee2d5cd138e2740403bd836
Author: Sean Owen 
Date:   2016-02-21T14:34:17Z

Revert some incorrect uses of tailrec

commit 93fc9268f52bb8260168ff2c82d8f016478e5e48
Author: Sean Owen 
Date:   2016-02-21T14:34:35Z

get(a) + getOrElse(b) -> getOrElse(a,b)

commit d90aa66cd7b7ae667a74dd424f8c1ac2abd9b25e
Author: Sean Owen 
Date:   2016-02-21T15:18:08Z

Misuse of assert, vs junit.assert

commit b07e01f7098e65eaf55ef74ad420f2af700df38f
Author: Sean Owen 
Date:   2016-02-21T15:23:49Z

Unchecked warnings

commit a003eae6bf1ebe6cb29e46c4fa8ff0e40189c7c2
Author: Sean Owen 
Date:   2016-02-21T15:29:23Z

More misuse of assert

commit a6b1669b903ca0395b6a55a7dff50d1e0311eace
Author: Sean Owen 
Date:   2016-02-21T15:32:00Z

Overflow in compareTo

commit 1bcbaa51d599a3fd192fe4a208c5db9d478be690
Author: Sean Owen 
Date:   2016-02-21T15:33:34Z

Mismatched hashCode/equals

commit adba76cb0e516c7d6fd4127d15458d7a5cea2acf
Author: Sean Owen 
Date:   2016-02-21T15:35:35Z

Inner class should be static







[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186875493
  
ok to test





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186872985
  
Can one of the admins verify this patch?





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread xguo27
Github user xguo27 commented on the pull request:

https://github.com/apache/spark/pull/11291#issuecomment-186872660
  
@hvanhovell Could you please advise whether this is the right fix? All Left Semi related tests passed, but I'm not sure what other impact removing the HashSet-related methods might have.





[GitHub] spark pull request: [SPARK-13422][SQL] Use HashedRelation instead ...

2016-02-21 Thread xguo27
GitHub user xguo27 opened a pull request:

https://github.com/apache/spark/pull/11291

[SPARK-13422][SQL] Use HashedRelation instead of HashSet in Left Semi Joins



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/xguo27/spark SPARK-13422

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/11291.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #11291


commit a84975a5fcee4b59cd144e23cca806970dc58164
Author: Xiu Guo 
Date:   2016-02-21T17:10:01Z

[SPARK-13422][SQL] Use HashedRelation instead of HashSet in Left Semi Joins







[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53570110
  
[Quotes the same diff of sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala as the comments above; the quoted diff and the review remark were truncated in the digest.]

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53570023
  
[Quotes the same parsers.scala diff as the comments above; the quoted diff and the review remark were truncated in the digest.]

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53570024
  
[Quotes the same parsers.scala diff as the comments above; the quoted diff and the review remark were truncated in the digest.]

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53570001
  
[Quotes the same parsers.scala diff as the comments above; the quoted diff and the review remark were truncated in the digest.]

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569784
  
[Quotes the same parsers.scala diff as the comments above; the quoted diff and the review remark were truncated in the digest.]

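The diff above leans on a single idiom: pattern matching on Hive AST nodes by token name through a `Token` extractor, then destructuring the child list. A minimal, self-contained sketch of that idiom follows; `Node` and `Token` here are hypothetical stand-ins written for illustration, not catalyst's actual `ASTNode` API.

```scala
// Simplified stand-ins for catalyst's ASTNode/Token, for illustration only.
case class Node(text: String, children: List[Node])

object Token {
  // Lets `case Token(name, children)` destructure a Node in a match.
  def unapply(n: Node): Option[(String, List[Node])] = Some((n.text, n.children))
}

// Same shape as parsePartitionSpec in the diff: map TOK_PARTVAL children
// to (name -> optional value) pairs.
def parsePartitionSpec(node: Node): Option[Map[String, Option[String]]] = node match {
  case Token("TOK_PARTSPEC", partitions) =>
    Some(partitions.map {
      case Token("TOK_PARTVAL", ident :: constant :: Nil) => ident.text -> Some(constant.text)
      case Token("TOK_PARTVAL", ident :: Nil)             => ident.text -> None
    }.toMap)
  case _ => None
}

// ALTER TABLE t ... PARTITION (ds='2016-02-21', hr) as a toy AST:
val spec = parsePartitionSpec(
  Node("TOK_PARTSPEC", List(
    Node("TOK_PARTVAL", List(Node("ds", Nil), Node("'2016-02-21'", Nil))),
    Node("TOK_PARTVAL", List(Node("hr", Nil))))))
// spec == Some(Map("ds" -> Some("'2016-02-21'"), "hr" -> None))
```
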
[GitHub] spark pull request: [SPARK-13233][SQL][WIP] Python Dataset

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/7#issuecomment-186862480
  
Merged build finished. Test FAILed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569740
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569754
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13233][SQL][WIP] Python Dataset

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/7#issuecomment-186862481
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51638/
Test FAILed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569704
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13233][SQL][WIP] Python Dataset

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/7#issuecomment-186862424
  
**[Test build #51638 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51638/consoleFull)**
 for PR 7 at commit 
[`349b119`](https://github.com/apache/spark/commit/349b119d67f03d1fbd86374555d4c3486841985b).
 * This patch **fails PySpark unit tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13016] [Documentation] Replace example ...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11132#issuecomment-186862379
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51641/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13016] [Documentation] Replace example ...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11132#issuecomment-186862377
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13016] [Documentation] Replace example ...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11132#issuecomment-186862336
  
**[Test build #51641 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51641/consoleFull)**
 for PR 11132 at commit 
[`3b031da`](https://github.com/apache/spark/commit/3b031da55b81b9440539dfe7ee9579fe4c656b9d).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569677
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-7729][UI]Executor which has been killed...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/10058#issuecomment-186862236
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-7729][UI]Executor which has been killed...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/10058#issuecomment-186862237
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51636/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-7729][UI]Executor which has been killed...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/10058#issuecomment-186862167
  
**[Test build #51636 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51636/consoleFull)**
 for PR 10058 at commit 
[`96950c6`](https://github.com/apache/spark/commit/96950c610a4325bebb9c6c41bb6faaeb00c7fa3e).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569610
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-3650][GraphX] Triangle Count handles re...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11290#issuecomment-186860314
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-3650][GraphX] Triangle Count handles re...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11290#issuecomment-186860315
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51640/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-3650][GraphX] Triangle Count handles re...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11290#issuecomment-186860282
  
**[Test build #51640 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51640/consoleFull)**
 for PR 11290 at commit 
[`1e6f5d2`](https://github.com/apache/spark/commit/1e6f5d2e01ea7902a1c7ec5261e21e855ed8b073).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13016] [Documentation] Replace example ...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11132#issuecomment-186859997
  
**[Test build #51641 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51641/consoleFull)**
 for PR 11132 at commit 
[`3b031da`](https://github.com/apache/spark/commit/3b031da55b81b9440539dfe7ee9579fe4c656b9d).


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569457
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569392
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186858134
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51637/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186858132
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186857661
  
**[Test build #51637 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51637/consoleFull)**
 for PR 11083 at commit 
[`c8c175e`](https://github.com/apache/spark/commit/c8c175e91ad2896573a4d6efab9ee13d7f28103c).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569287
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13399][STREAMING] Fix checkpointsuite t...

2016-02-21 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/11286#discussion_r53569281
  
--- Diff: streaming/src/test/scala/org/apache/spark/streaming/CheckpointSuite.scala ---
@@ -133,6 +133,17 @@ trait DStreamCheckpointTester { self: SparkFunSuite =>
     new StreamingContext(SparkContext.getOrCreate(conf), batchDuration)
   }
 
+  /**
+   * Get the first TestOutputStreamWithPartitions, does not check the provided generic type.
+   */
+  protected def getTestOutputStream[V: ClassTag](streams: Array[DStream[_]]):
+      TestOutputStreamWithPartitions[V] = {
+    streams.collect{
--- End diff --

Seems OK. I was going to ask if you can use `find()` here, but I don't think
it quite applies in the same way. Let's see if Jenkins makes you put a space
before the brace.
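
A rough sketch of the `collect` vs `find()` trade-off being discussed, using simplified stand-in types (the real `DStream` and `TestOutputStreamWithPartitions` signatures differ): `collect` with a partial function filters by runtime type and narrows the static type in one step, while `find` returns the unrefined element type and still needs an explicit cast.

```scala
// Simplified stand-ins; not the real Spark streaming types.
trait DStream
class TestOutputStreamWithPartitions[V] extends DStream

// collect: the partial function both filters by runtime type and narrows
// the static type, so the caller gets the refined type back directly.
def getViaCollect[V](streams: Array[DStream]): TestOutputStreamWithPartitions[V] =
  streams.collect { case s: TestOutputStreamWithPartitions[V @unchecked] => s }.head

// find: returns Option[DStream], so an asInstanceOf is still needed afterwards.
def getViaFind[V](streams: Array[DStream]): TestOutputStreamWithPartitions[V] =
  streams.find(_.isInstanceOf[TestOutputStreamWithPartitions[_]])
    .get.asInstanceOf[TestOutputStreamWithPartitions[V]]
```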


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186856561
  
Merged build finished. Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186856563
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51635/
Test PASSed.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569257
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala ---
@@ -0,0 +1,420 @@

[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186856355
  
**[Test build #51635 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51635/consoleFull)**
 for PR 11083 at commit 
[`4b5978b`](https://github.com/apache/spark/commit/4b5978b57552083d701a431c4de9145aca86d677).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds the following public classes _(experimental)_:
  * `trait BroadcastMode `





[GitHub] spark pull request: [SPARK-3650][GraphX] Triangle Count handles re...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11290#issuecomment-186856257
  
**[Test build #51640 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51640/consoleFull)**
 for PR 11290 at commit 
[`1e6f5d2`](https://github.com/apache/spark/commit/1e6f5d2e01ea7902a1c7ec5261e21e855ed8b073).





[GitHub] spark pull request: [SPARK-13398][STREAMING] Change FileWriteAhead...

2016-02-21 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/11287#issuecomment-186856227
  
This looks conceptually fine, though the test failed. I am not sure why. I 
stared at this for a while and don't see what would cause it not to be limited 
to 8 threads in the test.





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569223
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+  val partition = partitionSpec.flatMap(parsePartitionSpec)
+  matchAlterTableCommands(v1, rest, tableIdent, partition)
+case _ =>
+  throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+  node: ASTNode,
+  nodes: Seq[ASTNode],
+  tableIdent: TableIdentifier,
+  partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes 
match {
+case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
+  val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
+  val renamedTableIdent: TableIdentifier = 
base.extractTableIdent(renamedTable)
+  AlterTableRename(tableIdent, renamedTableIdent)(node.source)
+
+case Token("TOK_ALTERTABLE_PROPERTIES", args) :: rest =>
+  val setTableProperties = extractTableProps(args.head)
+  AlterTableSetProperties(
+tableIdent,
+setTableProperties)(node.source)
+
+case Token("TOK_ALTERTABLE_DROPPROPERTIES", args) :: rest =>
+  val dropTableProperties = extractTableProps(args.head)
+  val allowExisting = 

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569202
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+  val partition = partitionSpec.flatMap(parsePartitionSpec)
+  matchAlterTableCommands(v1, rest, tableIdent, partition)
+case _ =>
+  throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+  node: ASTNode,
+  nodes: Seq[ASTNode],
+  tableIdent: TableIdentifier,
+  partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes 
match {
+case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
+  val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
+  val renamedTableIdent: TableIdentifier = 
base.extractTableIdent(renamedTable)
+  AlterTableRename(tableIdent, renamedTableIdent)(node.source)
+
+case Token("TOK_ALTERTABLE_PROPERTIES", args) :: rest =>
+  val setTableProperties = extractTableProps(args.head)
+  AlterTableSetProperties(
+tableIdent,
+setTableProperties)(node.source)
+
+case Token("TOK_ALTERTABLE_DROPPROPERTIES", args) :: rest =>
+  val dropTableProperties = extractTableProps(args.head)
+  val allowExisting = 

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569174
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+  val partition = partitionSpec.flatMap(parsePartitionSpec)
+  matchAlterTableCommands(v1, rest, tableIdent, partition)
+case _ =>
+  throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+  node: ASTNode,
+  nodes: Seq[ASTNode],
+  tableIdent: TableIdentifier,
+  partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes 
match {
+case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
+  val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
+  val renamedTableIdent: TableIdentifier = 
base.extractTableIdent(renamedTable)
+  AlterTableRename(tableIdent, renamedTableIdent)(node.source)
+
+case Token("TOK_ALTERTABLE_PROPERTIES", args) :: rest =>
+  val setTableProperties = extractTableProps(args.head)
+  AlterTableSetProperties(
+tableIdent,
+setTableProperties)(node.source)
+
+case Token("TOK_ALTERTABLE_DROPPROPERTIES", args) :: rest =>
+  val dropTableProperties = extractTableProps(args.head)
+  val allowExisting = 

[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569109
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+  val partition = partitionSpec.flatMap(parsePartitionSpec)
+  matchAlterTableCommands(v1, rest, tableIdent, partition)
+case _ =>
+  throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+  node: ASTNode,
+  nodes: Seq[ASTNode],
+  tableIdent: TableIdentifier,
+  partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes 
match {
+case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
+  val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
+  val renamedTableIdent: TableIdentifier = 
base.extractTableIdent(renamedTable)
+  AlterTableRename(tableIdent, renamedTableIdent)(node.source)
+
+case Token("TOK_ALTERTABLE_PROPERTIES", args) :: rest =>
--- End diff --

Rest is Nil? The same goes for every other `rest` instance.



[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569098
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+  val partition = partitionSpec.flatMap(parsePartitionSpec)
+  matchAlterTableCommands(v1, rest, tableIdent, partition)
+case _ =>
+  throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+  node: ASTNode,
+  nodes: Seq[ASTNode],
+  tableIdent: TableIdentifier,
+  partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes 
match {
+case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
--- End diff --

Don't need the `rename` variable. Rest is Nil?
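
For example, a minimal tweak (same surrounding match; assumes nothing can legally follow the rename clause):

    case Token("TOK_ALTERTABLE_RENAME", renameArgs) :: Nil =>
      // No `rename @` binder needed; everything used comes from renameArgs.
      val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
      AlterTableRename(tableIdent, base.extractTableIdent(renamedTable))(node.source)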





[GitHub] spark pull request: [SPARK-13117][Web UI] WebUI should use the loc...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11133#issuecomment-186855426
  
**[Test build #51639 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51639/consoleFull)**
 for PR 11133 at commit 
[`26489fc`](https://github.com/apache/spark/commit/26489fc8a3eba53c92f042472b384d2cacfbc4b8).





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53569083
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
--- End diff --

use `rest`? Is it possible to defer partition parsing until later on?
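
For the first question, a rough sketch of what I have in mind (hypothetical):

    case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
      val tableIdent = base.extractTableIdent(tabName)
      // Search only the remaining tokens instead of rescanning v1.children.
      val partition = base.getClauseOption("TOK_PARTSPEC", rest).flatMap(parsePartitionSpec)
      matchAlterTableCommands(v1, rest, tableIdent, partition)

Deferring the parse would then just mean passing the raw `Option[ASTNode]` down and letting each command call `parsePartitionSpec` only when it actually needs the partition.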





[GitHub] spark pull request: [SPARK-13012] [Documentation] Replace example ...

2016-02-21 Thread devaraj-kavali
Github user devaraj-kavali commented on the pull request:

https://github.com/apache/spark/pull/11053#issuecomment-186855380
  
Thanks @yinxusen for the good suggestion; I have addressed it.

> ModelSelectionViaTrainValidationSplitExample and 
JavaModelSelectionViaTrainValidationSplitExample still have a problem of Vector 
serialization. But I think we can add follow-up JIRA to locate the bug and fix 
it.

Yes, we can create another follow-up JIRA to fix the problem. Thank you.





[GitHub] spark pull request: [SPARK-3650][GraphX] Triangle Count handles re...

2016-02-21 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/11290#issuecomment-186855315
  
Jenkins, test this please





[GitHub] spark pull request: [SPARK-12746][ML] ArrayType(_, true) should al...

2016-02-21 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/11237#issuecomment-186855236
  
OK by me though ideally @mengxr would merge. I'll do it tomorrow if not





[GitHub] spark pull request: [STREAMING][DOCS] Fixes and code improvements ...

2016-02-21 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/11201#discussion_r53568953
  
--- Diff: docs/streaming-programming-guide.md ---
@@ -1787,11 +1788,11 @@ reliable file system (e.g., HDFS, S3, etc.) to 
which the checkpoint information
 This is done by using `streamingContext.checkpoint(checkpointDirectory)`. 
This will allow you to
 use the aforementioned stateful transformations. Additionally,
 if you want to make the application recover from driver failures, you 
should rewrite your
-streaming application to have the following behavior.
+streaming application to have the following behavior:
--- End diff --

A couple of the trivial formatting changes I agree with; most of the 
changes are not improvements (e.g. "being" is slightly more accurate below). I 
would close this PR.





[GitHub] spark pull request: [SPARK-13117][Web UI] WebUI should use the loc...

2016-02-21 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/11133#issuecomment-186854471
  
@devaraj-kavali Let's try one more time. I'm not seeing that failure in 
other builds. Although the exception's ultimate cause isn't so clear, it could 
still be due to this change.





[GitHub] spark pull request: [SPARK-13248][STREAMING] Remove deprecated Str...

2016-02-21 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/11139





[GitHub] spark pull request: [SPARK-13117][Web UI] WebUI should use the loc...

2016-02-21 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/11133#issuecomment-186854431
  
Jenkins, test this please





[GitHub] spark pull request: [SPARK-13248][STREAMING] Remove deprecated Str...

2016-02-21 Thread srowen
Github user srowen commented on the pull request:

https://github.com/apache/spark/pull/11139#issuecomment-186854347
  
Merged to master





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53568615
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
--- End diff --

The pattern `unquoteString(cleanIdentifier(...))` is used often. Create a 
method?
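
For instance, a small helper along these lines (the name is just a suggestion):

    /** Strip quoting from an identifier or string token. */
    private def cleanAndUnquote(text: String): String =
      unquoteString(cleanIdentifier(text))

The partition-spec case then reads `(cleanAndUnquote(ident.text), Some(cleanAndUnquote(constant.text)))`.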





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53568504
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/parsers.scala 
---
@@ -0,0 +1,420 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.sql.{AnalysisException, SaveMode}
+import org.apache.spark.sql.catalyst.{CatalystQl, PlanParser, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
+import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
+import org.apache.spark.sql.execution.datasources._
+import org.apache.spark.sql.types.StructType
+
+case class AlterTableCommandParser(base: CatalystQl) extends PlanParser {
+
+  def parsePartitionSpec(node: ASTNode): Option[Map[String, 
Option[String]]] = {
+node match {
+  case Token("TOK_PARTSPEC", partitions) =>
+val spec = partitions.map {
+  case Token("TOK_PARTVAL", ident :: constant :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)),
+  Some(unquoteString(cleanIdentifier(constant.text))))
+  case Token("TOK_PARTVAL", ident :: Nil) =>
+(unquoteString(cleanIdentifier(ident.text)), None)
+}.toMap
+Some(spec)
+  case _ => None
+}
+  }
+
+  def extractTableProps(node: ASTNode): Map[String, Option[String]] = node 
match {
+case Token("TOK_TABLEPROPERTIES", propsList) =>
+  propsList.flatMap {
+case Token("TOK_TABLEPROPLIST", props) =>
+  props.map {
+case Token("TOK_TABLEPROPERTY", key :: Token("TOK_NULL", Nil) 
:: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  (k, None)
+case Token("TOK_TABLEPROPERTY", key :: value :: Nil) =>
+  val k = unquoteString(cleanIdentifier(key.text))
+  val v = unquoteString(cleanIdentifier(value.text))
+  (k, Some(v))
+  }
+  }.toMap
+  }
+
+  override def isDefinedAt(node: ASTNode): Boolean = node.text == 
"TOK_ALTERTABLE"
+
+  override def apply(v1: ASTNode): LogicalPlan = v1.children match {
+case (tabName @ Token("TOK_TABNAME", _)) :: rest =>
+  val tableIdent: TableIdentifier = base.extractTableIdent(tabName)
+  val partitionSpec = base.getClauseOption("TOK_PARTSPEC", v1.children)
+  val partition = partitionSpec.flatMap(parsePartitionSpec)
+  matchAlterTableCommands(v1, rest, tableIdent, partition)
+case _ =>
+  throw new NotImplementedError(v1.text)
+  }
+
+  def matchAlterTableCommands(
+  node: ASTNode,
+  nodes: Seq[ASTNode],
+  tableIdent: TableIdentifier,
+  partition: Option[Map[String, Option[String]]]): LogicalPlan = nodes 
match {
+case rename @ Token("TOK_ALTERTABLE_RENAME", renameArgs) :: rest =>
+  val renamedTable = base.getClause("TOK_TABNAME", renameArgs)
+  val renamedTableIdent: TableIdentifier = 
base.extractTableIdent(renamedTable)
+  AlterTableRename(tableIdent, renamedTableIdent)(node.source)
+
+case Token("TOK_ALTERTABLE_PROPERTIES", args) :: rest =>
+  val setTableProperties = extractTableProps(args.head)
+  AlterTableSetProperties(
+tableIdent,
+setTableProperties)(node.source)
+
+case Token("TOK_ALTERTABLE_DROPPROPERTIES", args) :: rest =>
+  val dropTableProperties = extractTableProps(args.head)
+  val allowExisting = 

[GitHub] spark pull request: [SPARK-12567][SQL] Add aes_{encrypt,decrypt} U...

2016-02-21 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/10527#discussion_r53568446
  
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
@@ -1931,6 +1931,42 @@ object functions extends LegacyFunctions {
 new Murmur3Hash(cols.map(_.expr))
   }
 
+  /**
+   * Encrypts the input using AES and returns the result as a binary column.
+   * Key lengths of 128, 192 or 256 bits can be used; 192- and 256-bit keys can only be used if the
+   * Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files are installed.
+   * If either argument is NULL, the result will also be null. An exception is thrown if the input
+   * is invalid, the key length is not one of the permitted values, or a 192/256-bit key is used
+   * without JCE installed.
+   *
+   * @param input binary column to encrypt
+   * @param key binary column holding a 128-, 192- or 256-bit key
+   *
+   * @group misc_funcs
+   * @since 2.0.0
+   */
+  def aes_encrypt(input: Column, key: Column): Column = withExpr {
--- End diff --

Actually we do have a bit of a mess here. Generally we have 3 kinds of parameters: 
`Column`, column name string, and literal. For example, `pow` takes 2 parameters, 
and we have 8 overloaded versions of it, covering all combinations of these 3 
kinds of parameters except both literals. However, some functions like `pmod` 
only have the both-`Column` version, and some functions like `sha2` only have 
the `Column`/literal combination.

Personally, I think for something like `sha2` and this one, where one 
parameter will be a literal 99% of the time, we should just provide the 
`Column`/literal combination, i.e. `def aes_encrypt(input: Column, key: String)`. For 
something like `pow`, we only need to provide the both-`Column` version, as a 
column name string and a literal are easy to convert to a `Column` via `col()` and 
`lit()`.
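
Concretely, something like this sketch (`AesEncrypt` is a placeholder for whatever catalyst expression this PR introduces):

    // Column/literal combination: covers the common case of a literal key.
    def aes_encrypt(input: Column, key: String): Column = aes_encrypt(input, lit(key))

    // Both-Column version, for the rare case where the key is itself a column.
    def aes_encrypt(input: Column, key: Column): Column = withExpr {
      AesEncrypt(input.expr, key.expr)
    }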





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53568236
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/commands.scala 
---
@@ -0,0 +1,533 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.commands
+
+import java.util.NoSuchElementException
+
+import org.apache.spark.Logging
+import org.apache.spark.rdd.RDD
+import org.apache.spark.sql.{DataFrame, Row, SQLConf, SQLContext}
+import org.apache.spark.sql.catalyst.{CatalystTypeConverters, InternalRow, 
TableIdentifier}
+import org.apache.spark.sql.catalyst.errors.TreeNodeException
+import org.apache.spark.sql.catalyst.expressions.{Attribute, 
AttributeReference}
+import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
+import org.apache.spark.sql.execution._
+import org.apache.spark.sql.execution.datasources.BucketSpec
+import org.apache.spark.sql.types._
+
+abstract class NativeDDLCommands(val sql: String) extends RunnableCommand {
+  override def run(sqlContext: SQLContext): Seq[Row] = {
+sqlContext.catalog.runNativeCommand(sql)
+  }
+
+  override val output: Seq[Attribute] =
+Seq(AttributeReference("result", StringType, nullable = false)())
+}
+
+case class SetCommand(kv: Option[(String, Option[String])]) extends 
RunnableCommand with Logging {
+
+  private def keyValueOutput: Seq[Attribute] = {
+val schema = StructType(
+  StructField("key", StringType, false) ::
+StructField("value", StringType, false) :: Nil)
+schema.toAttributes
+  }
+
+  private val (_output, runFunc): (Seq[Attribute], SQLContext => Seq[Row]) 
= kv match {
+// Configures the deprecated "mapred.reduce.tasks" property.
+case Some((SQLConf.Deprecated.MAPRED_REDUCE_TASKS, Some(value))) =>
+  val runFunc = (sqlContext: SQLContext) => {
+logWarning(
+  s"Property ${SQLConf.Deprecated.MAPRED_REDUCE_TASKS} is 
deprecated, " +
+s"automatically converted to ${SQLConf.SHUFFLE_PARTITIONS.key} 
instead.")
+if (value.toInt < 1) {
+  val msg =
+s"Setting negative ${SQLConf.Deprecated.MAPRED_REDUCE_TASKS} 
for automatically " +
+  "determining the number of reducers is not supported."
+  throw new IllegalArgumentException(msg)
+} else {
+  sqlContext.setConf(SQLConf.SHUFFLE_PARTITIONS.key, value)
+  Seq(Row(SQLConf.SHUFFLE_PARTITIONS.key, value))
+}
+  }
+  (keyValueOutput, runFunc)
+
+case Some((SQLConf.Deprecated.EXTERNAL_SORT, Some(value))) =>
--- End diff --

A lot of duplicate code here. Why not put it in a collection and do a 
lookup?
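
A sketch of the lookup idea (hypothetical; it omits the extra value validation that the MAPRED_REDUCE_TASKS case does, which would need a separate hook):

    // Map each deprecated key to its replacement; a single guarded case then
    // replaces one hand-written case per deprecated key.
    private val deprecatedToNewKey: Map[String, String] = Map(
      SQLConf.Deprecated.MAPRED_REDUCE_TASKS -> SQLConf.SHUFFLE_PARTITIONS.key)

    case Some((key, Some(value))) if deprecatedToNewKey.contains(key) =>
      val newKey = deprecatedToNewKey(key)
      val runFunc = (sqlContext: SQLContext) => {
        logWarning(s"Property $key is deprecated, automatically converted to $newKey instead.")
        sqlContext.setConf(newKey, value)
        Seq(Row(newKey, value))
      }
      (keyValueOutput, runFunc)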





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53568197
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/commands/commands.scala 
---
@@ -0,0 +1,533 @@
+/*
--- End diff --

Shouldn't we split this file into a number of smaller units? 





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53568183
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkQl.scala ---
@@ -62,6 +87,57 @@ private[sql] class SparkQl(conf: ParserConf = 
SimpleParserConf()) extends Cataly
 val tableIdent = extractTableIdent(nameParts)
 RefreshTable(tableIdent)
 
+  case Token("TOK_CREATEDATABASE", Token(databaseName, Nil) :: 
createDatabaseArgs) =>
+val Seq(
+  allowExisting,
+  dbLocation,
+  databaseComment,
+  dbprops) = getClauses(Seq(
+  "TOK_IFNOTEXISTS",
+  "TOK_DATABASELOCATION",
+  "TOK_DATABASECOMMENT",
+  "TOK_DATABASEPROPERTIES"), createDatabaseArgs)
+
+val location = dbLocation.map {
+  case Token("TOK_DATABASELOCATION", Token(loc, Nil) :: Nil) => 
unquoteString(loc)
+}
+val comment = databaseComment.map {
+  case Token("TOK_DATABASECOMMENT", Token(comment, Nil) :: Nil) => 
unquoteString(comment)
+}
+val props: Map[String, String] = dbprops.toSeq.flatMap {
+  case Token("TOK_DATABASEPROPERTIES", propList) =>
+propList.flatMap(extractProps)
+}.toMap
+
+CreateDataBase(databaseName, allowExisting.isDefined, location, 
comment, props)(node.source)
+
+  case Token("TOK_CREATEFUNCTION", func :: as :: createFuncArgs) =>
+val funcName = func.map(x => unquoteString(x.text)).mkString(".")
+val asName = unquoteString(as.text)
+val Seq(
+  rList,
+  temp) = getClauses(Seq(
+  "TOK_RESOURCE_LIST",
+  "TOK_TEMPORARY"), createFuncArgs)
+
+val resourcesMap: Map[String, String] = rList.toSeq.flatMap {
+  case Token("TOK_RESOURCE_LIST", resources) =>
+resources.map {
+  case Token("TOK_RESOURCE_URI", rType :: Token(rPath, Nil) :: 
Nil) =>
+val resourceType = rType match {
+  case Token("TOK_JAR", Nil) => "jar"
+  case Token("TOK_FILE", Nil) => "file"
+  case Token("TOK_ARCHIVE", Nil) => "archive"
+}
+(resourceType, unquoteString(rPath))
+}
+}.toMap
+CreateFunction(funcName, asName, resourcesMap, 
temp.isDefined)(node.source)
+
+  case Token("TOK_ALTERTABLE", alterTableArgs) =>
+AlterTableCommandParser(this).applyOrElse(node,
--- End diff --

The idea of the `PartialFunction` design was that we factor out all parsing 
functionality into a bunch of PartialFunctions, each taking care of a specific 
parsing area. This is nice because it allows us to compose a parser.

The code here just calls into the AlterTableCommandParser and treats it 
like a regular function. This deviates from the design. There are a few options 
here:

- follow the proposed design;
- present a composable alternative and implement that (see the sketch below);
- create a function in SparkQl which handles AlterTable commands, drop the 
created infrastructure, and split the parsers in a follow-up PR.

I currently favor the third option.
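
To make the composable alternative concrete: since each sub-parser is a `PartialFunction[ASTNode, LogicalPlan]`, they chain with `orElse`, roughly like this (a sketch):

    // Adding a parsing area becomes appending one element to this list.
    val subParsers = Seq[PartialFunction[ASTNode, LogicalPlan]](
      AlterTableCommandParser(this) /* , other command parsers */)
    val composedParser = subParsers.reduce(_ orElse _)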






[GitHub] spark pull request: [SPARK-13233][SQL][WIP] Python Dataset

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/7#issuecomment-186843839
  
**[Test build #51638 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51638/consoleFull)**
 for PR 7 at commit 
[`349b119`](https://github.com/apache/spark/commit/349b119d67f03d1fbd86374555d4c3486841985b).





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53568012
  
--- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/SparkQl.scala ---
@@ -16,18 +16,43 @@
  */
 package org.apache.spark.sql.execution
 
+import scala.collection.mutable.ArrayBuffer
+
 import org.apache.spark.sql.{AnalysisException, SaveMode}
 import org.apache.spark.sql.catalyst.{CatalystQl, TableIdentifier}
 import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
+import org.apache.spark.sql.catalyst.expressions.{Ascending, Descending}
 import org.apache.spark.sql.catalyst.parser.{ASTNode, ParserConf, 
SimpleParserConf}
 import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, 
OneRowRelation}
 import org.apache.spark.sql.catalyst.plans.logical
+import org.apache.spark.sql.execution.commands._
 import org.apache.spark.sql.execution.datasources._
 import org.apache.spark.sql.types.StructType
 
 private[sql] class SparkQl(conf: ParserConf = SimpleParserConf()) extends 
CatalystQl(conf) {
   /** Check if a command should not be explained. */
-  protected def isNoExplainCommand(command: String): Boolean = 
"TOK_DESCTABLE" == command
+  protected def isNoExplainCommand(command: String): Boolean =
+"TOK_DESCTABLE" == command || "TOK_ALTERTABLE" == command
+
+  protected def extractProps(node: ASTNode): Seq[(String, String)] = node 
match {
--- End diff --

We could make this DRYer by passing in the token strings and a processing 
function (or by just unquoting the TOK_TABLEOPTION values as well).
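
For example (a hypothetical generalization of the current `extractProps`):

    // One generic walker, parameterized by the wrapping token name and the
    // cleanup applied to each extracted string.
    protected def extractProps(
        node: ASTNode,
        listToken: String,
        clean: String => String): Seq[(String, String)] = node match {
      case Token(`listToken`, props) =>
        props.map {
          // Assumes each child is a key/value token pair, as in the current cases.
          case Token(_, key :: value :: Nil) => (clean(key.text), clean(value.text))
        }
    }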





[GitHub] spark pull request: [MINOR][DOCS] Fix typos in `NOTICE`, `configur...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11289#issuecomment-186842357
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [MINOR][DOCS] Fix typos in `NOTICE`, `configur...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11289#issuecomment-186842358
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51629/
Test PASSed.





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53567861
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/PlanParser.scala ---
@@ -0,0 +1,53 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst
+
+import org.apache.spark.sql.catalyst.parser.ASTNode
+import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
+import org.apache.spark.sql.catalyst.trees.CurrentOrigin
+
+trait ParserBase {
--- End diff --

`ParserBase` and `BaseParser` are a bit confusing. Maybe call this 
`ParserSupport`?

Most of the functions in here do not rely on any state, so we could also make this an object and import its members into the parsers.
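
For illustration, a sketch of the object-plus-import variant (the helper bodies here are simplified examples, not the real trait's contents):

```scala
// Stateless parser helpers collected in an object instead of a mixin trait.
object ParserSupport {
  // Strip the backticks around a quoted identifier.
  def cleanIdentifier(ident: String): String =
    if (ident.length > 1 && ident.startsWith("`") && ident.endsWith("`")) {
      ident.substring(1, ident.length - 1)
    } else {
      ident
    }

  // Uniform failure for parse rules a given parser does not support.
  def noParseRule(rule: String, node: Any): Nothing =
    throw new UnsupportedOperationException(s"Unsupported parse rule '$rule' for node: $node")
}

// Parsers import the helpers rather than extending a base trait:
class SomeParser {
  import ParserSupport._
  def parseIdentifier(raw: String): String = cleanIdentifier(raw)
}
```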





[GitHub] spark pull request: [MINOR][DOCS] Fix typos in `NOTICE`, `configur...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11289#issuecomment-186842274
  
**[Test build #51629 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51629/consoleFull)** for PR 11289 at commit [`a73a9e2`](https://github.com/apache/spark/commit/a73a9e291078d81ebc52a56b34b013007e083176).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-7729][UI]Executor which has been killed...

2016-02-21 Thread lianhuiwang
Github user lianhuiwang commented on a diff in the pull request:

https://github.com/apache/spark/pull/10058#discussion_r53567785
  
--- Diff: core/src/main/scala/org/apache/spark/ui/exec/ExecutorsPage.scala ---
@@ -159,10 +181,14 @@ private[ui] class ExecutorsPage(
   }
   {
 if (threadDumpEnabled) {
--- End diff --

When thread dumps are enabled, I think `" "` is better than `Seq.empty` for dead executors, because active and dead executors appear in the same table, so a padded cell keeps the columns aligned.
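
As a sketch of the alignment point (assuming the page renders rows with Scala XML literals; `threadDumpCell` is an illustrative helper, not the exact `ExecutorsPage` code):

```scala
import scala.xml.{Node, Text}

// Emitting a cell containing a non-breaking space keeps the <td> present for
// dead executors, so every row has the same number of columns; returning
// Seq.empty would drop the cell entirely and skew the table layout.
def threadDumpCell(isActive: Boolean, executorId: String): Seq[Node] =
  if (isActive) {
    <td><a href={s"threadDump/?executorId=$executorId"}>Thread Dump</a></td>
  } else {
    <td>{Text("\u00A0")}</td> // the " " case
  }
```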





[GitHub] spark pull request: [SPARK-6761][SQL] Approximate quantile for Dat...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/6042#issuecomment-186841902
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-7729][UI]Executor which has been killed...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/10058#issuecomment-186841946
  
**[Test build #51636 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51636/consoleFull)** for PR 10058 at commit [`96950c6`](https://github.com/apache/spark/commit/96950c610a4325bebb9c6c41bb6faaeb00c7fa3e).





[GitHub] spark pull request: [SPARK-6761][SQL] Approximate quantile for Dat...

2016-02-21 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/6042#issuecomment-186841905
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/51632/
Test PASSed.





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53567755
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/CatalystQl.scala ---
@@ -90,7 +93,7 @@ private[sql] class CatalystQl(val conf: ParserConf = SimpleParserConf()) extends
 }
   }
 
-  protected def getClauses(
+  def getClauses(
--- End diff --

Shouldn't we move all the `getClauseX` methods into `ParserBase`?
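
For illustration, a standalone sketch of what the hoisted helpers might look like (toy `Node` type and simplified signatures; the real `getClause*` methods also validate that every clause is consumed):

```scala
// Toy AST node so the sketch compiles on its own.
case class Node(token: String, children: Seq[Node] = Nil)

trait ParserBase {
  // Look up each requested clause among a node's children, preserving order.
  def getClauses(names: Seq[String], nodes: Seq[Node]): Seq[Option[Node]] =
    names.map(name => getClauseOption(name, nodes))

  def getClauseOption(name: String, nodes: Seq[Node]): Option[Node] =
    nodes.find(_.token == name)

  // Like getClauseOption, but the clause is mandatory.
  def getClause(name: String, nodes: Seq[Node]): Node =
    getClauseOption(name, nodes).getOrElse(
      sys.error(s"Expected clause $name missing from ${nodes.map(_.token).mkString(", ")}"))
}
```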





[GitHub] spark pull request: [SPARK-6761][SQL] Approximate quantile for Dat...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/6042#issuecomment-186841743
  
**[Test build #51632 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51632/consoleFull)** for PR 6042 at commit [`a36891b`](https://github.com/apache/spark/commit/a36891babc21b1b1ba26854aad10aa9af7c4ab89).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-3650][GraphX] Triangle Count handles re...

2016-02-21 Thread insidedctm
Github user insidedctm commented on the pull request:

https://github.com/apache/spark/pull/11290#issuecomment-186838624
  
@srowen good points, I've updated and pushed changes in line with your 
comments





[GitHub] spark pull request: [SPARK-13136][SQL] Create a dedicated Broadcas...

2016-02-21 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11083#issuecomment-186838836
  
**[Test build #51637 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/51637/consoleFull)** for PR 11083 at commit [`c8c175e`](https://github.com/apache/spark/commit/c8c175e91ad2896573a4d6efab9ee13d7f28103c).





[GitHub] spark pull request: [SPARK-13139][SQL] Create native DDL commands

2016-02-21 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/11048#discussion_r53567674
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/CatalystQl.scala ---
@@ -30,17 +30,20 @@ import org.apache.spark.sql.types._
 import org.apache.spark.unsafe.types.CalendarInterval
 import org.apache.spark.util.random.RandomSampler
 
+abstract class BaseParser(val conf: ParserConf) extends ParserInterface with ParserBase {
--- End diff --

Let's move CatalystQl into the parser package.




