[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-18 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/12246


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-18 Thread zsxwing
Github user zsxwing commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-211506802
  
Thanks, merging into master





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210697628
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55976/
Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210697627
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210697523
  
**[Test build #55976 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55976/consoleFull)** for PR 12246 at commit [`8a71835`](https://github.com/apache/spark/commit/8a71835f4011a3570990669346a65dcea51adb4f).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210678035
  
**[Test build #55976 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55976/consoleFull)** for PR 12246 at commit [`8a71835`](https://github.com/apache/spark/commit/8a71835f4011a3570990669346a65dcea51adb4f).





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210677864
  
retest this please





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210677909
  
Looks like `MemorySinkSuite` is still flaky...





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210660407
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210660409
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55956/
Test FAILed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210660210
  
**[Test build #55956 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55956/consoleFull)** for PR 12246 at commit [`8a71835`](https://github.com/apache/spark/commit/8a71835f4011a3570990669346a65dcea51adb4f).
 * This patch **fails Spark unit tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59942192
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+    plan.foreachUp {
+      case p if p.isStreaming =>
+        throwError("Queries with streaming sources must be executed with write.startStream()")(p)
+
+      case _ =>
+    }
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = {
+
+    if (!plan.isStreaming) {
+      throwError("Queries without streaming sources cannot be executed with write.startStream()")(plan)
+    }
+
+    plan.foreachUp { implicit plan =>
+
+      // Operations that cannot exist anywhere in a streaming plan
+      plan match {
+
+        case _: Command =>
+          throwError("Commands like CreateTable*, AlterTable*, Show* are not supported with " +
+            "streaming DataFrames/Datasets")
+
+        case _: InsertIntoTable =>
+          throwError("InsertIntoTable is not supported with streaming DataFrames/Datasets")
+
+        case Aggregate(_, _, child) if child.isStreaming && outputMode == Append =>
+          throwError(
+            "Aggregations are not supported on streaming DataFrames/Datasets in " +
+              "Append output mode. Consider changing output mode to Update.")
+
+        case Join(left, right, joinType, _) =>
+
+          joinType match {
+
+            case Inner =>
+              if (left.isStreaming && right.isStreaming) {
+                throwError("Inner join between two streaming DataFrames/Datasets is not supported")
+              }
+
+            case FullOuter =>
+              if (left.isStreaming || right.isStreaming) {
+                throwError("Full outer joins with streaming DataFrames/Datasets are not supported")
+              }
+
+            case LeftOuter | LeftSemi | LeftAnti =>
+              if (right.isStreaming) {
+                throwError("Left outer/semi/anti joins with a streaming DataFrame/Dataset " +
+                  "on the right is not supported")
+              }
+
+            case RightOuter =>
+              if (left.isStreaming) {
+                throwError("Right outer join with a streaming DataFrame/Dataset on the left is " +
+                  "not supported")
+              }
+
+            case NaturalJoin(_) | UsingJoin(_, _) =>
+              // They should not appear in an analyzed plan.
+
+            case _ =>
+              throwError(s"Join type $joinType is not supported with streaming DataFrame/Dataset")
+          }
+
+        case c: CoGroup if plan.children.exists(_.isStreaming) =>
+          throwError("CoGrouping between two streaming DataFrames/Datasets is not supported")
--- End diff --

Hmm, is cogrouping supported at all? Typically the two iterators should have all the data for a group.
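
The checker quoted above walks the plan bottom-up with `foreachUp` and raises an error at the first unsupported node. A minimal sketch of that traversal pattern, using hypothetical stand-in classes (not the real Catalyst `LogicalPlan`/`TreeNode`):

```scala
// Stand-in plan hierarchy; the real types live in Catalyst.
sealed trait Plan {
  def children: Seq[Plan]
  def isStreaming: Boolean
  // Visit children first, then this node (bottom-up), like TreeNode.foreachUp.
  def foreachUp(f: Plan => Unit): Unit = {
    children.foreach(_.foreachUp(f))
    f(this)
  }
}
case class Relation(isStreaming: Boolean) extends Plan {
  val children: Seq[Plan] = Nil
}
case class Project(child: Plan) extends Plan {
  val children: Seq[Plan] = Seq(child)
  def isStreaming: Boolean = child.isStreaming
}

// Same shape as checkForBatch: reject any plan containing a streaming source.
def checkForBatch(plan: Plan): Unit = plan.foreachUp {
  case p if p.isStreaming =>
    sys.error("Queries with streaming sources must be executed with write.startStream()")
  case _ =>
}
```

With these stand-ins, `checkForBatch(Project(Relation(isStreaming = false)))` passes silently, while a streaming relation anywhere in the tree triggers the error.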



[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210635885
  
**[Test build #55956 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55956/consoleFull)** for PR 12246 at commit [`8a71835`](https://github.com/apache/spark/commit/8a71835f4011a3570990669346a65dcea51adb4f).





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210634397
  
LGTM





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59936839
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+    plan.foreachUp {
+      case p if p.isStreaming =>
+        throwError("Queries with streaming sources must be executed with write.startStream()")(p)
+
+      case _ =>
+    }
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = {
+
+    if (!plan.isStreaming) {
+      throwError("Queries without streaming sources cannot be executed with write.startStream()")(plan)
+    }
+
+    plan.foreachUp { implicit plan =>
+
+      // Operations that cannot exist anywhere in a streaming plan
+      plan match {
+
+        case _: Command =>
+          throwError("Commands like CreateTable*, AlterTable*, Show* are not supported with " +
+            "streaming DataFrames/Datasets")
+
+        case _: InsertIntoTable =>
+          throwError("InsertIntoTable is not supported with streaming DataFrames/Datasets")
+
+        case Aggregate(_, _, child) if child.isStreaming && outputMode == Append =>
+          throwError(
+            "Aggregations are not supported on streaming DataFrames/Datasets in " +
+              "Append output mode. Consider changing output mode to Update.")
+
+        case Join(left, right, joinType, _) =>
+
+          joinType match {
+
+            case Inner =>
+              if (left.isStreaming && right.isStreaming) {
+                throwError("Inner join between two streaming DataFrames/Datasets is not supported")
+              }
+
+            case FullOuter =>
+              if (left.isStreaming || right.isStreaming) {
+                throwError("Full outer joins with streaming DataFrames/Datasets are not supported")
+              }
+
+            case LeftOuter | LeftSemi | LeftAnti =>
+              if (right.isStreaming) {
+                throwError("Left outer/semi/anti joins with a streaming DataFrame/Dataset " +
+                  "on the right is not supported")
+              }
+
+            case RightOuter =>
+              if (left.isStreaming) {
+                throwError("Right outer join with a streaming DataFrame/Dataset on the left is " +
+                  "not supported")
+              }
+
+            case NaturalJoin(_) | UsingJoin(_, _) =>
+              // They should not appear in an analyzed plan.
+
+            case _ =>
+              throwError(s"Join type $joinType is not supported with streaming DataFrame/Dataset")
+          }
+
+        case c: CoGroup if plan.children.exists(_.isStreaming) =>
+          throwError("CoGrouping between two streaming DataFrames/Datasets is not supported")
+
+        case u: Union if u.children.count(_.isStreaming) == 1 =>
--- End diff --

Correction: I didn't realize that a Union can have multiple children, not just two. Fixed the bug.
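
The fix matters because `Union` in Catalyst is n-ary. A check like `u.children.count(_.isStreaming) == 1` only fires when exactly one child is streaming, so a union of two streaming children and one batch child would slip through. A tiny sketch of the difference, with plain booleans standing in for each child plan's `isStreaming` flag (hypothetical helper names, not Spark APIs):

```scala
// Buggy shape: fires only when exactly one child is streaming.
def buggyMixCheck(childrenStreaming: Seq[Boolean]): Boolean =
  childrenStreaming.count(identity) == 1

// Robust shape: flags any mix of streaming and batch children, for any arity.
def mixesStreamingAndBatch(childrenStreaming: Seq[Boolean]): Boolean =
  childrenStreaming.contains(true) && childrenStreaming.contains(false)
```

For children `Seq(true, true, false)` the buggy check returns `false` (the mix is missed), while the arity-independent check returns `true`.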



[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59936215
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,378 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val attribute = AttributeReference("a", IntegerType, nullable = true)()
+  val batchRelation = LocalRelation(attribute)
+  val streamRelation = new TestStreamingRelation(attribute)
+
+  /*
+  =======================================================================
+                              BATCH QUERIES
+  =======================================================================
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+    "streaming source",
+    streamRelation,
+    Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+    "select on streaming source",
+    streamRelation.select($"count(*)"),
+    Seq("with streaming source", "startStream"))
+
+
+  /*
+  =======================================================================
+                            STREAMING QUERIES
+  =======================================================================
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", "startStream")) {
+    UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"), Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+    "commands",
+    DescribeFunction("func", true),
+    outputMode = Append,
+    expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
--- End diff --

I am renaming `assertSupportedForStreaming` to `assertSupportedInStreamingPlan` to make it unambiguous.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59935847
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,378 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val attribute = AttributeReference("a", IntegerType, nullable = true)()
+  val batchRelation = LocalRelation(attribute)
+  val streamRelation = new TestStreamingRelation(attribute)
+
+  /*
+  =======================================================================
+                              BATCH QUERIES
+  =======================================================================
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+    "streaming source",
+    streamRelation,
+    Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+    "select on streaming source",
+    streamRelation.select($"count(*)"),
+    Seq("with streaming source", "startStream"))
+
+
+  /*
+  =======================================================================
+                            STREAMING QUERIES
+  =======================================================================
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", "startStream")) {
+    UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"), Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+    "commands",
+    DescribeFunction("func", true),
+    outputMode = Append,
+    expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
--- End diff --

No, no... it's a test of an aggregate on a batch relation inside a streaming plan.
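
The distinction here is that the Append-mode aggregate rule keys off the aggregate's *child*, not the overall plan, so aggregating a batch relation inside an otherwise streaming plan is allowed. A hedged sketch with stand-in types (not the real Catalyst `Aggregate`/`LogicalPlan`):

```scala
// Hypothetical stand-ins: Rel models a relation, Agg an aggregation over it.
case class Rel(isStreaming: Boolean)
case class Agg(child: Rel)

// Mirrors the rule's guard `child.isStreaming && outputMode == Append`:
// in Append mode, only aggregates whose child streams are rejected.
def aggregateDisallowedInAppend(a: Agg): Boolean = a.child.isStreaming
```

So `Agg(Rel(isStreaming = false))` is fine even in a streaming query, while `Agg(Rel(isStreaming = true))` is the case the checker rejects in Append mode.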





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59935095
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,378 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val attribute = AttributeReference("a", IntegerType, nullable = true)()
+  val batchRelation = LocalRelation(attribute)
+  val streamRelation = new TestStreamingRelation(attribute)
+
+  /*
+  =======================================================================
+                              BATCH QUERIES
+  =======================================================================
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+    "streaming source",
+    streamRelation,
+    Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+    "select on streaming source",
+    streamRelation.select($"count(*)"),
+    Seq("with streaming source", "startStream"))
+
+
+  /*
+  =======================================================================
+                            STREAMING QUERIES
+  =======================================================================
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", "startStream")) {
+    UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"), Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+    "commands",
+    DescribeFunction("func", true),
+    outputMode = Append,
+    expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
+    "aggregate - stream with update output mode",
+    batchRelation.groupBy("a")("count(*)"),
+    outputMode = Update)
+
+  assertSupportedForStreaming(
+    "aggregate - batch with update output mode",
+    streamRelation.groupBy("a")("count(*)"),
+    outputMode = Update)
+
+  assertSupportedForStreaming(
--- End diff --

assertSupportedForBatch?





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59935088
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,378 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val attribute = AttributeReference("a", IntegerType, nullable = true)()
+  val batchRelation = LocalRelation(attribute)
+  val streamRelation = new TestStreamingRelation(attribute)
+
+  /*
+  =======================================================================
+                              BATCH QUERIES
+  =======================================================================
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+    "streaming source",
+    streamRelation,
+    Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+    "select on streaming source",
+    streamRelation.select($"count(*)"),
+    Seq("with streaming source", "startStream"))
+
+
+  /*
+  =======================================================================
+                            STREAMING QUERIES
+  =======================================================================
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", "startStream")) {
+    UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"), Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+    "commands",
+    DescribeFunction("func", true),
+    outputMode = Append,
+    expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
--- End diff --

assertSupportedForBatch?
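For context on the reviewer's question, here is a minimal sketch of what an `assertSupportedForBatch`-style helper reduces to. The `Plan` stand-in, `checkForBatch`, and `supportedForBatch` names below are illustrative assumptions, not the PR's actual types:

```scala
import scala.util.Try

// Toy stand-in for a logical plan: only tracks whether it reads a stream
// (an assumption for illustration; Catalyst's LogicalPlan carries much more).
case class Plan(isStreaming: Boolean)

// Mirrors the checkForBatch rule from the diff: a batch query must not
// contain a streaming source anywhere in its plan.
def checkForBatch(plan: Plan): Unit =
  if (plan.isStreaming)
    throw new IllegalArgumentException(
      "Queries with streaming sources must be executed with write.startStream()")

// An assertSupportedForBatch-style helper simply asserts that no error is thrown.
def supportedForBatch(plan: Plan): Boolean = Try(checkForBatch(plan)).isSuccess
```

The test-suite helpers in the diff presumably register a named test around this kind of pass/fail check.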


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59933665
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
+  case p if p.isStreaming =>
+throwError(
+  "Queries with streaming sources must be executed with 
write.startStream()")(p)
+
+  case _ =>
+}
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = 
{
+
+if (!plan.isStreaming) {
+  throwError(
+"Queries without streaming sources cannot be executed with 
write.startStream()")(plan)
+}
+
+plan.foreachUp { implicit plan =>
+
+  // Operations that cannot exist anywhere in a streaming plan
+  plan match {
+
+case _: Command =>
+  throwError("Commands like CreateTable*, AlterTable*, Show* are 
not supported with " +
+"streaming DataFrames/Datasets")
+
+case _: InsertIntoTable =>
+  throwError("InsertIntoTable is not supported with streaming 
DataFrames/Datasets")
+
+case Aggregate(_, _, child) if child.isStreaming && outputMode == 
Append =>
+  throwError(
+"Aggregations are not supported on streaming 
DataFrames/Datasets in " +
+  "Append output mode. Consider changing output mode to 
Update.")
+
+case Join(left, right, joinType, _) =>
+
+  joinType match {
+
+case Inner =>
+  if (left.isStreaming && right.isStreaming) {
+throwError("Inner join between two streaming 
DataFrames/Datasets is not supported")
+  }
+
+case FullOuter =>
+  if (left.isStreaming || right.isStreaming) {
+throwError("Full outer joins with streaming 
DataFrames/Datasets are not supported")
+  }
+
+
+case LeftOuter | LeftSemi | LeftAnti =>
+  if (right.isStreaming) {
+throwError("Left outer/semi/anti joins with a streaming 
DataFrame/Dataset " +
+"on the right is not supported")
+  }
+
+case RightOuter =>
+  if (left.isStreaming) {
+throwError("Right outer join with a streaming 
DataFrame/Dataset on the left is " +
+"not supported")
+  }
+
+case NaturalJoin(_) | UsingJoin(_, _) =>
+  // They should not appear in an analyzed plan.
+
+case _ =>
+  throwError(s"Join type $joinType is not supported with 
streaming DataFrame/Dataset")
+  }
+
+case c: CoGroup if plan.children.exists(_.isStreaming) =>
+  throwError("CoGrouping between two streaming DataFrames/Datasets 
is not supported")
+
+case u: Union if u.children.count(_.isStreaming) == 1 =>
+  throwError("Union between streaming and batch 
DataFrames/Datasets is not supported")
+
+case Except(left, right) if right.isStreaming =>
+  throwError("Except with a streaming DataFrame/Dataset on the 
right is not supported")
+
+case Intersect(left, right) if left.isStreaming && 
right.isStreaming =>
+  throwError("Intersect between two streaming DataFrames/Datasets 
is not supported")
+
+case GroupingSets(_,
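The join rules spelled out in the quoted checker can be summarized as a single predicate. This is a hedged sketch with my own names and boolean encoding (not the PR's API); `true` means the combination is allowed:

```scala
sealed trait JoinKind
case object InnerK extends JoinKind
case object FullOuterK extends JoinKind
case object LeftishK extends JoinKind   // stands in for LeftOuter / LeftSemi / LeftAnti
case object RightOuterK extends JoinKind

// leftStreaming / rightStreaming: whether each side of the join is a stream.
def joinAllowed(kind: JoinKind, leftStreaming: Boolean, rightStreaming: Boolean): Boolean =
  kind match {
    case InnerK      => !(leftStreaming && rightStreaming) // only stream-stream inner joins rejected
    case FullOuterK  => !(leftStreaming || rightStreaming) // only batch-batch allowed
    case LeftishK    => !rightStreaming                    // stream on the right rejected
    case RightOuterK => !leftStreaming                     // stream on the left rejected
  }
```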

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59933260
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
+  case p if p.isStreaming =>
+throwError(
+  "Queries with streaming sources must be executed with 
write.startStream()")(p)
+
+  case _ =>
+}
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = 
{
+
+if (!plan.isStreaming) {
+  throwError(
+"Queries without streaming sources cannot be executed with 
write.startStream()")(plan)
+}
+
+plan.foreachUp { implicit plan =>
+
+  // Operations that cannot exists anywhere in a streaming plan
+  plan match {
+
+case _: Command =>
+  throwError("Commands like CreateTable*, AlterTable*, Show* are 
not supported with " +
+"streaming DataFrames/Datasets")
+
+case _: InsertIntoTable =>
+  throwError("InsertIntoTable is not supported with streaming 
DataFrames/Datasets")
+
+case Aggregate(_, _, child) if child.isStreaming && outputMode == 
Append =>
+  throwError(
+"Aggregations are not supported on streaming 
DataFrames/Datasets in " +
+  "Append output mode. Consider changing output mode to 
Update.")
+
+case Join(left, right, joinType, _) =>
+
+  joinType match {
+
+case Inner =>
+  if (left.isStreaming && right.isStreaming) {
+throwError("Inner join between two streaming 
DataFrames/Datasets is not supported")
+  }
+
+case FullOuter =>
+  if (left.isStreaming || right.isStreaming) {
+throwError("Full outer joins with streaming 
DataFrames/Datasets are not supported")
+  }
+
+
+case LeftOuter | LeftSemi | LeftAnti =>
+  if (right.isStreaming) {
+throwError("Left outer/semi/anti joins with a streaming 
DataFrame/Dataset " +
+"on the right is not supported")
+  }
+
+case RightOuter =>
+  if (left.isStreaming) {
+throwError("Right outer join with a streaming 
DataFrame/Dataset on the left is " +
+"not supported")
+  }
+
+case NaturalJoin(_) | UsingJoin(_, _) =>
+  // They should not appear in an analyzed plan.
+
+case _ =>
+  throwError(s"Join type $joinType is not supported with 
streaming DataFrame/Dataset")
+  }
+
+case c: CoGroup if plan.children.exists(_.isStreaming) =>
+  throwError("CoGrouping between two streaming DataFrames/Datasets 
is not supported")
+
+case u: Union if u.children.count(_.isStreaming) == 1 =>
+  throwError("Union between streaming and batch 
DataFrames/Datasets is not supported")
+
+case Except(left, right) if right.isStreaming =>
+  throwError("Except with a streaming DataFrame/Dataset on the 
right is not supported")
+
+case Intersect(left, right) if left.isStreaming && 
right.isStreaming =>
+  throwError("Intersect between two streaming DataFrames/Datasets 
is not supported")
+
+case GroupingSets(_, _,

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59933234
  
--- Diff: 
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala
 ---
@@ -0,0 +1,378 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.{Attribute, 
AttributeReference}
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val attribute = AttributeReference("a", IntegerType, nullable = true)()
+  val batchRelation = LocalRelation(attribute)
+  val streamRelation = new TestStreamingRelation(attribute)
+
+  /*
+
===
+ BATCH QUERIES
+
===
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+"streaming source",
+streamRelation,
+Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+"select on streaming source",
+streamRelation.select($"count(*)"),
+Seq("with streaming source", "startStream"))
+
+
+  /*
+
===
+ STREAMING QUERIES
+
===
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", 
"startStream")) {
+
UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"),
 Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+"commands",
+DescribeFunction("func", true),
+outputMode = Append,
+expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
+"aggregate - stream with update output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with update output mode",
--- End diff --

nit: batch -> stream





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59933111
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
+  case p if p.isStreaming =>
+throwError(
+  "Queries with streaming sources must be executed with 
write.startStream()")(p)
+
+  case _ =>
+}
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = 
{
+
+if (!plan.isStreaming) {
+  throwError(
+"Queries without streaming sources cannot be executed with 
write.startStream()")(plan)
+}
+
+plan.foreachUp { implicit plan =>
+
+  // Operations that cannot exist anywhere in a streaming plan
+  plan match {
+
+case _: Command =>
+  throwError("Commands like CreateTable*, AlterTable*, Show* are 
not supported with " +
+"streaming DataFrames/Datasets")
+
+case _: InsertIntoTable =>
+  throwError("InsertIntoTable is not supported with streaming 
DataFrames/Datasets")
+
+case Aggregate(_, _, child) if child.isStreaming && outputMode == 
Append =>
+  throwError(
+"Aggregations are not supported on streaming 
DataFrames/Datasets in " +
+  "Append output mode. Consider changing output mode to 
Update.")
+
+case Join(left, right, joinType, _) =>
+
+  joinType match {
+
+case Inner =>
+  if (left.isStreaming && right.isStreaming) {
+throwError("Inner join between two streaming 
DataFrames/Datasets is not supported")
+  }
+
+case FullOuter =>
+  if (left.isStreaming || right.isStreaming) {
+throwError("Full outer joins with streaming 
DataFrames/Datasets are not supported")
+  }
+
+
+case LeftOuter | LeftSemi | LeftAnti =>
+  if (right.isStreaming) {
+throwError("Left outer/semi/anti joins with a streaming 
DataFrame/Dataset " +
+"on the right is not supported")
+  }
+
+case RightOuter =>
+  if (left.isStreaming) {
+throwError("Right outer join with a streaming 
DataFrame/Dataset on the left is " +
+"not supported")
+  }
+
+case NaturalJoin(_) | UsingJoin(_, _) =>
+  // They should not appear in an analyzed plan.
+
+case _ =>
+  throwError(s"Join type $joinType is not supported with 
streaming DataFrame/Dataset")
+  }
+
+case c: CoGroup if plan.children.exists(_.isStreaming) =>
--- End diff --

I replaced `plan` with `c` to make it less confusing. 





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932993
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/streaming/StreamSuite.scala ---
@@ -108,6 +107,36 @@ class StreamSuite extends StreamTest with 
SharedSQLContext {
 assertDF(df)
 assertDF(df)
   }
+
+  test("unsupported queries") {
+val c = sqlContext
--- End diff --

nit: redundant line





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932821
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
--- End diff --

I want to highlight the specific sub-tree of the plan that is the culprit. 
That's why I am doing foreachUp, so that I can give more specific error messages 
to the user.
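The point about `foreachUp` can be modeled in a few lines: children are visited before their parent, so the deepest offending subtree is reported first. The tree types below are illustrative stand-ins, not Catalyst's `TreeNode`:

```scala
import scala.collection.mutable.ArrayBuffer

sealed trait Node {
  def name: String
  def children: Seq[Node]
  // Visit all children bottom-up, then this node -- the same shape as
  // TreeNode.foreachUp in Catalyst.
  def foreachUp(f: Node => Unit): Unit = {
    children.foreach(_.foreachUp(f))
    f(this)
  }
}
case class Leaf(name: String) extends Node { def children: Seq[Node] = Nil }
case class Op(name: String, children: Node*) extends Node

// Record the order in which nodes are visited.
def visitOrder(root: Node): Seq[String] = {
  val order = ArrayBuffer.empty[String]
  root.foreachUp(n => order += n.name)
  order.toSeq
}
```

Because leaves come first in the visit order, a check that throws on the first bad node naturally points at the most specific offending source rather than the whole query.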





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932855
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
+  case p if p.isStreaming =>
+throwError(
+  "Queries with streaming sources must be executed with 
write.startStream()")(p)
+
+  case _ =>
+}
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = 
{
+
+if (!plan.isStreaming) {
+  throwError(
+"Queries without streaming sources cannot be executed with 
write.startStream()")(plan)
+}
+
+plan.foreachUp { implicit plan =>
+
+  // Operations that cannot exist anywhere in a streaming plan
+  plan match {
+
+case _: Command =>
+  throwError("Commands like CreateTable*, AlterTable*, Show* are 
not supported with " +
+"streaming DataFrames/Datasets")
+
+case _: InsertIntoTable =>
+  throwError("InsertIntoTable is not supported with streaming 
DataFrames/Datasets")
+
+case Aggregate(_, _, child) if child.isStreaming && outputMode == 
Append =>
+  throwError(
+"Aggregations are not supported on streaming 
DataFrames/Datasets in " +
+  "Append output mode. Consider changing output mode to 
Update.")
+
+case Join(left, right, joinType, _) =>
+
+  joinType match {
+
+case Inner =>
+  if (left.isStreaming && right.isStreaming) {
+throwError("Inner join between two streaming 
DataFrames/Datasets is not supported")
+  }
+
+case FullOuter =>
+  if (left.isStreaming || right.isStreaming) {
+throwError("Full outer joins with streaming 
DataFrames/Datasets are not supported")
+  }
+
+
+case LeftOuter | LeftSemi | LeftAnti =>
+  if (right.isStreaming) {
+throwError("Left outer/semi/anti joins with a streaming 
DataFrame/Dataset " +
+"on the right is not supported")
+  }
+
+case RightOuter =>
+  if (left.isStreaming) {
+throwError("Right outer join with a streaming 
DataFrame/Dataset on the left is " +
+"not supported")
+  }
+
+case NaturalJoin(_) | UsingJoin(_, _) =>
+  // They should not appear in an analyzed plan.
+
+case _ =>
+  throwError(s"Join type $joinType is not supported with 
streaming DataFrame/Dataset")
+  }
+
+case c: CoGroup if plan.children.exists(_.isStreaming) =>
+  throwError("CoGrouping between two streaming DataFrames/Datasets 
is not supported")
--- End diff --

updated.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932690
  
--- Diff: 
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala
 ---
@@ -0,0 +1,379 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.AttributeReference
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val batchRelation = LocalRelation(AttributeReference("a", IntegerType, 
nullable = true)())
+
+  val streamRelation = new LocalRelation(
+Seq(AttributeReference("a", IntegerType, nullable = true)())) {
+override def isStreaming: Boolean = true
+  }
+
+  /*
+
===
+ BATCH QUERIES
+
===
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+"streaming source",
+streamRelation,
+Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+"select on streaming source",
+streamRelation.select($"count(*)"),
+Seq("with streaming source", "startStream"))
+
+
+  /*
+
===
+ STREAMING QUERIES
+
===
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", 
"startStream")) {
+
UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"),
 Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+"commands",
+DescribeFunction("func", true),
+outputMode = Append,
+expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
+"aggregate - stream with update output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with update output mode",
+streamRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with append output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Append)
+
+  assertNotSupportedForStreaming(
+"aggregate - stream with append output mode",
+streamRelation.groupBy("a")("count(*)"),
+outputMode = Append,
+Seq("aggregation", "append output mode"))
+
+  // Inner joins: Stream-stream not supported
+  testBinaryOperationForStreaming(
+"inner join",
+_.join(_, joinType = Inner),
+streamStreamSupported = false)
+
+  // Full outer joins: only batch-batch is allowed
+  testBinaryOperationForStreaming(
+"full outer join",
+_.join(_, joinType = FullOuter),
+streamStreamSupported = false,
+batchStreamSupported = false,
+streamBatchSupported = false)
+
+  // Left outer joins: *-stream not allowed
+  testBinaryOperationForStreaming(
+"left outer join",
+_.join(_, joinType = LeftOuter),
+streamStreamSupported = false
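The aggregation rule exercised by the tests above, reject aggregation on a streaming child only in Append output mode, reduces to a one-line predicate. This sketch uses assumed names (`OutputMode`, `aggregateAllowed`), not the PR's classes:

```scala
sealed trait OutputMode
case object Append extends OutputMode
case object Update extends OutputMode

// Streaming aggregation is rejected only under Append output mode;
// batch aggregation and Update-mode streaming aggregation both pass.
def aggregateAllowed(childIsStreaming: Boolean, mode: OutputMode): Boolean =
  !(childIsStreaming && mode == Append)
```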

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932564
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
+  case p if p.isStreaming =>
+throwError(
+  "Queries with streaming sources must be executed with 
write.startStream()")(p)
+
+  case _ =>
+}
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = 
{
+
+if (!plan.isStreaming) {
+  throwError(
+"Queries without streaming sources cannot be executed with 
write.startStream()")(plan)
+}
+
+plan.foreachUp { implicit plan =>
+
+  // Operations that cannot exist anywhere in a streaming plan
+  plan match {
+
+case _: Command =>
+  throwError("Commands like CreateTable*, AlterTable*, Show* are 
not supported with " +
+"streaming DataFrames/Datasets")
+
+case _: InsertIntoTable =>
+  throwError("InsertIntoTable is not supported with streaming 
DataFrames/Datasets")
+
+case Aggregate(_, _, child) if child.isStreaming && outputMode == 
Append =>
+  throwError(
+"Aggregations are not supported on streaming 
DataFrames/Datasets in " +
+  "Append output mode. Consider changing output mode to 
Update.")
+
+case Join(left, right, joinType, _) =>
+
+  joinType match {
+
+case Inner =>
+  if (left.isStreaming && right.isStreaming) {
+throwError("Inner join between two streaming 
DataFrames/Datasets is not supported")
+  }
+
+case FullOuter =>
+  if (left.isStreaming || right.isStreaming) {
+throwError("Full outer joins with streaming 
DataFrames/Datasets are not supported")
+  }
+
+
+case LeftOuter | LeftSemi | LeftAnti =>
+  if (right.isStreaming) {
+throwError("Left outer/semi/anti joins with a streaming 
DataFrame/Dataset " +
+"on the right is not supported")
+  }
+
+case RightOuter =>
+  if (left.isStreaming) {
+throwError("Right outer join with a streaming 
DataFrame/Dataset on the left is " +
+"not supported")
+  }
+
+case NaturalJoin(_) | UsingJoin(_, _) =>
+  // They should not appear in an analyzed plan.
+
+case _ =>
+  throwError(s"Join type $joinType is not supported with 
streaming DataFrame/Dataset")
+  }
+
+case c: CoGroup if plan.children.exists(_.isStreaming) =>
+  throwError("CoGrouping between two streaming DataFrames/Datasets 
is not supported")
+
+case u: Union if u.children.count(_.isStreaming) == 1 =>
--- End diff --

No ... union between two batch relations is fine, and union between two 
streaming relations is also fine. 
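The `Union` guard quoted in the diff fires only on a mixed batch/stream union; a sketch of that predicate under assumed names (`Rel` and `unionAllowed` are mine):

```scala
// Toy relation: only tracks whether it is streaming.
case class Rel(isStreaming: Boolean)

// Union is fine when the children are all batch or all streaming; the check
// in the diff rejects exactly the mixed two-child case (streaming count == 1).
def unionAllowed(children: Seq[Rel]): Boolean =
  children.count(_.isStreaming) != 1
```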




[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932379
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+        case u: Union if u.children.count(_.isStreaming) == 1 =>
+          throwError("Union between streaming and batch DataFrames/Datasets is not supported")
+
+        case Except(left, right) if right.isStreaming =>
+          throwError("Except with a streaming DataFrame/Dataset on the right is not supported")
+
+        case Intersect(left, right) if left.isStreaming && right.isStreaming =>
+          throwError("Intersect between two streaming DataFrames/Datasets is not supported")
+
+        case GroupingSets(_,

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932341
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932237
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+        case c: CoGroup if plan.children.exists(_.isStreaming) =>
--- End diff --

nit: `if plan.children.exists(_.isStreaming)` can be removed since `plan.isStreaming` is true.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59932041
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+        case u: Union if u.children.count(_.isStreaming) == 1 =>
--- End diff --

should be `!u.children.forall(_.isStreaming)`
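The difference between the two guards is easy to demonstrate. Below is a hypothetical, simplified sketch (a "plan" is reduced to its `isStreaming` flag; the real check pattern-matches on `Union` nodes inside `UnsupportedOperationChecker`, and `countGuard`/`mixedGuard` are illustrative names, not Spark APIs):

```scala
// Hypothetical, simplified model: a plan is reduced to its isStreaming flag.
object UnionGuardDemo {
  case class Plan(isStreaming: Boolean)

  val batch = Plan(isStreaming = false)
  val streaming = Plan(isStreaming = true)

  // Guard as written in the PR: fires only when exactly one child is streaming.
  def countGuard(children: Seq[Plan]): Boolean =
    children.count(_.isStreaming) == 1

  // Intent described in the thread: only a mix of streaming and batch children
  // is unsupported; all-batch and all-streaming unions are both fine.
  def mixedGuard(children: Seq[Plan]): Boolean =
    children.exists(_.isStreaming) && !children.forall(_.isStreaming)

  def main(args: Array[String]): Unit = {
    // Two streaming children plus one batch child form a mixed union, but the
    // count-based guard misses it: the streaming count is 2, not 1.
    assert(!countGuard(Seq(streaming, streaming, batch)))
    assert(mixedGuard(Seq(streaming, streaming, batch)))

    // All-streaming and all-batch unions are accepted by the mixed-union rule.
    assert(!mixedGuard(Seq(streaming, streaming)))
    assert(!mixedGuard(Seq(batch, batch)))
  }
}
```

The count-based guard happens to work for a two-child union, which is why the tests passed; the three-child mixed case above is where it silently accepts an unsupported plan.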



[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59931914
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+        case c: CoGroup if plan.children.exists(_.isStreaming) =>
+          throwError("CoGrouping between two streaming DataFrames/Datasets is not supported")
--- End diff --

nit: CoGroup is not supported on streaming DataFrames/Datasets



[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread zsxwing
Github user zsxwing commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59931552
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala
 ---
@@ -0,0 +1,145 @@
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+    plan.foreachUp {
--- End diff --

nit: Just calling `plan.isStreaming` is enough.
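The suggestion works because `isStreaming` is a derived property: in Catalyst, a `LogicalPlan` is streaming when any of its children is, so a streaming leaf anywhere in the tree is visible at the root and a single root check can replace the `foreachUp` walk. A hypothetical, stripped-down sketch of that propagation (`Leaf` and `Node` are stand-ins for illustration, not Catalyst classes):

```scala
// Hypothetical model of the isStreaming propagation used by the real check.
object StreamingFlagDemo {
  sealed trait Plan {
    def children: Seq[Plan]
    // Derived flag: streaming if any child is streaming, as in LogicalPlan.
    def isStreaming: Boolean = children.exists(_.isStreaming)
  }
  case class Leaf(streaming: Boolean) extends Plan {
    def children: Seq[Plan] = Nil
    override def isStreaming: Boolean = streaming
  }
  case class Node(children: Plan*) extends Plan

  def checkForBatch(plan: Plan): Unit =
    if (plan.isStreaming) {
      throw new IllegalArgumentException(
        "Queries with streaming sources must be executed with write.startStream()")
    }

  def main(args: Array[String]): Unit = {
    // A streaming leaf buried two levels deep is caught by the root check alone.
    val mixed = Node(Node(Leaf(streaming = true)), Leaf(streaming = false))
    assert(mixed.isStreaming)

    // A pure batch plan passes the check.
    checkForBatch(Node(Leaf(streaming = false)))
  }
}
```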





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210601612
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210601615
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55939/
Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210600906
  
**[Test build #55939 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55939/consoleFull)** for PR 12246 at commit [`a5d180e`](https://github.com/apache/spark/commit/a5d180e8772d203ba9354ab33ec438b545daf329).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread tdas
Github user tdas commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59912290
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/StreamTest.scala ---
@@ -75,6 +76,8 @@ trait StreamTest extends QueryTest with Timeouts {
   /** How long to wait for an active stream to catch up when checking a result. */
   val streamingTimeout = 10.seconds
 
+  val outputMode: OutputMode = Append
--- End diff --

We can add this later, when we have support for more output modes and mixed suites.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-15 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210560723
  
**[Test build #55939 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55939/consoleFull)** for PR 12246 at commit [`a5d180e`](https://github.com/apache/spark/commit/a5d180e8772d203ba9354ab33ec438b545daf329).





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210287120
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55890/
Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210287115
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210287017
  
**[Test build #55890 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55890/consoleFull)** for PR 12246 at commit [`3f1353e`](https://github.com/apache/spark/commit/3f1353e96e94f11c431c1d1f75f9782731fafc53).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210269363
  
**[Test build #55890 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55890/consoleFull)** for PR 12246 at commit [`3f1353e`](https://github.com/apache/spark/commit/3f1353e96e94f11c431c1d1f75f9782731fafc53).





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread tdas
Github user tdas commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210269322
  
There was a critical bug that prevented the unsupported cases from being tested. 
I fixed it and updated the PR.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-210081598
  
Some minor comments, otherwise LGTM.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59764187
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,379 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.AttributeReference
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val batchRelation = LocalRelation(AttributeReference("a", IntegerType, 
nullable = true)())
+
+  val streamRelation = new LocalRelation(
+Seq(AttributeReference("a", IntegerType, nullable = true)())) {
+override def isStreaming: Boolean = true
+  }
+
+  /*
+
===
+ BATCH QUERIES
+
===
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+"streaming source",
+streamRelation,
+Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+"select on streaming source",
+streamRelation.select($"count(*)"),
+Seq("with streaming source", "startStream"))
+
+
+  /*
+
===
+ STREAMING QUERIES
+
===
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", 
"startStream")) {
+
UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"),
 Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+"commands",
+DescribeFunction("func", true),
+outputMode = Append,
+expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
+"aggregate - stream with update output mode",
+streamRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with update output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with append output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Append)
+
+  assertNotSupportedForStreaming(
+"aggregate - stream with append output mode",
+streamRelation.groupBy("a")("count(*)"),
+outputMode = Append,
+Seq("aggregation", "append output mode"))
+
+  // Inner joins: Stream-stream not supported
+  testBinaryOperationForStreaming(
+"inner join",
+_.join(_, joinType = Inner),
+streamStreamSupported = false)
+
+  // Full outer joins: only batch-batch is allowed
+  testBinaryOperationForStreaming(
+"full outer join",
+_.join(_, joinType = FullOuter),
+streamStreamSupported = false,
+batchStreamSupported = false,
+streamBatchSupported = false)
+
+  // Left outer joins: *-stream not allowed
+  testBinaryOperationForStreaming(
+"left outer join",
+_.join(_, joinType = LeftOuter),
+streamStreamSupported = 

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59763939
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,379 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.AttributeReference
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val batchRelation = LocalRelation(AttributeReference("a", IntegerType, 
nullable = true)())
+
+  val streamRelation = new LocalRelation(
+Seq(AttributeReference("a", IntegerType, nullable = true)())) {
+override def isStreaming: Boolean = true
+  }
+
+  /*
+
===
+ BATCH QUERIES
+
===
+   */
+
+  assertSupportedForBatch("local relation", batchRelation)
+
+  assertNotSupportedForBatch(
+"streaming source",
+streamRelation,
+Seq("with streaming source", "startStream"))
+
+  assertNotSupportedForBatch(
+"select on streaming source",
+streamRelation.select($"count(*)"),
+Seq("with streaming source", "startStream"))
+
+
+  /*
+
===
+ STREAMING QUERIES
+
===
+   */
+
+  // Batch plan in streaming query
+  testError("batch source", Seq("without streaming source", 
"startStream")) {
+
UnsupportedOperationChecker.checkForStreaming(batchRelation.select($"count(*)"),
 Append)
+  }
+
+  // Commands
+  assertNotSupportedForStreaming(
+"commmands",
+DescribeFunction("func", true),
+outputMode = Append,
+expectedMsgs = "commands" :: Nil)
+
+  // Aggregates: Not supported on streams in Append mode
+  assertSupportedForStreaming(
+"aggregate - stream with update output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with update output mode",
+streamRelation.groupBy("a")("count(*)"),
+outputMode = Update)
+
+  assertSupportedForStreaming(
+"aggregate - batch with append output mode",
+batchRelation.groupBy("a")("count(*)"),
+outputMode = Append)
+
+  assertNotSupportedForStreaming(
+"aggregate - stream with append output mode",
+streamRelation.groupBy("a")("count(*)"),
+outputMode = Append,
+Seq("aggregation", "append output mode"))
+
+  // Inner joins: Stream-stream not supported
+  testBinaryOperationForStreaming(
+"inner join",
+_.join(_, joinType = Inner),
+streamStreamSupported = false)
+
+  // Full outer joins: only batch-batch is allowed
+  testBinaryOperationForStreaming(
+"full outer join",
+_.join(_, joinType = FullOuter),
+streamStreamSupported = false,
+batchStreamSupported = false,
+streamBatchSupported = false)
+
+  // Left outer joins: *-stream not allowed
+  testBinaryOperationForStreaming(
+"left outer join",
+_.join(_, joinType = LeftOuter),
+streamStreamSupported = 

[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59763849
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationsSuite.scala ---
@@ -0,0 +1,379 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.dsl.expressions._
+import org.apache.spark.sql.catalyst.dsl.plans._
+import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
+import org.apache.spark.sql.catalyst.expressions.AttributeReference
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+import org.apache.spark.sql.types.IntegerType
+
+class UnsupportedOperationsSuite extends SparkFunSuite {
+
+  val batchRelation = LocalRelation(AttributeReference("a", IntegerType, 
nullable = true)())
+
+  val streamRelation = new LocalRelation(
--- End diff --

Extending case classes is not really supported / might be deprecated.
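A stand-alone Scala sketch of the alternative this review point hints at (the types here are invented for illustration, not catalyst's): rather than anonymously subclassing a case class to flip a flag, model the flag with dedicated node types or an explicit field, which keeps case-class equality and `copy` semantics intact:

```scala
// Hypothetical stand-ins for catalyst classes.
case class Relation(attributes: Seq[String])

// Anti-pattern (what the test did): `new Relation(...) { override ... }`.
// Subclassing a case class undermines its generated equals/hashCode/copy.
// Alternative: make the distinction a first-class part of the model.
sealed trait PlanNode { def isStreaming: Boolean }
case class BatchRelation(rel: Relation) extends PlanNode {
  val isStreaming = false
}
case class StreamRelation(rel: Relation) extends PlanNode {
  val isStreaming = true
}
```

With dedicated case classes, two structurally identical streaming relations still compare equal, which anonymous subclasses do not guarantee.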





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59763010
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/StreamTest.scala ---
@@ -75,6 +76,8 @@ trait StreamTest extends QueryTest with Timeouts {
   /** How long to wait for an active stream to catch up when checking a 
result. */
   val streamingTimeout = 10.seconds
 
+  val outputMode: OutputMode = Append
--- End diff --

We might want to make this an argument to `streamTest` with a default.  I 
think in a single suite we might want to be able to have more than one kind of 
test.
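A minimal stand-alone sketch of this suggestion (hypothetical signatures, not the real `StreamTest` trait): make the output mode a parameter of the test helper with a default, so a single suite can run tests under more than one mode:

```scala
// Hypothetical stand-ins for the PR's output modes.
sealed trait OutputMode
case object Append extends OutputMode
case object Update extends OutputMode

// Sketch of the suggested helper shape: callers get Append unless they
// override it per test.
def streamTest(name: String, outputMode: OutputMode = Append)(
    body: OutputMode => Unit): Unit =
  body(outputMode)
```

A suite could then write `streamTest("agg in update mode", Update) { ... }` next to tests that rely on the default.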





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59762679
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/LogicalPlan.scala ---
@@ -42,6 +42,9 @@ abstract class LogicalPlan extends QueryPlan[LogicalPlan] 
with Logging {
*/
   def analyzed: Boolean = _analyzed
 
+  /** Whether this logical plan requires incremental execution */
--- End diff --

Returns true if this subtree contains any streaming data sources.
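The suggested wording corresponds to a simple recursive property over the plan tree. A toy, self-contained sketch (a hypothetical tree type, not catalyst's `LogicalPlan`):

```scala
// Hypothetical plan node: a node either is itself a streaming source or
// inherits the property from any child in its subtree.
case class Plan(
    isStreamingSource: Boolean = false,
    children: Seq[Plan] = Nil) {
  // True if this subtree contains any streaming data sources.
  def isStreaming: Boolean =
    isStreamingSource || children.exists(_.isStreaming)
}
```

So a single streaming leaf anywhere in the tree makes the whole query streaming, which is exactly what the checker rules key off.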





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59762314
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/UnsupportedOperationChecker.scala ---
@@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.plans._
+import org.apache.spark.sql.catalyst.plans.logical._
+
+/**
+ * Analyzes the presence of unsupported operations in a logical plan.
+ */
+object UnsupportedOperationChecker {
+
+  def checkForBatch(plan: LogicalPlan): Unit = {
+plan.foreachUp {
+  case p if p.isStreaming =>
+throwError(
+  "Queries with streaming sources must be executed with 
write.startStream()")(p)
+
+  case _ =>
+}
+  }
+
+  def checkForStreaming(plan: LogicalPlan, outputMode: OutputMode): Unit = 
{
+
+if (!plan.isStreaming) {
+  throwError(
+"Queries without streaming sources cannot be executed with 
write.startStream()")(plan)
+}
+
+plan.foreachUp { implicit plan =>
+
+  // Operations that cannot exist anywhere in a streaming plan
+  plan match {
+
+case _: Command =>
+  throwError("Commands like CreateTable*, AlterTable*, Show* are 
not supported with " +
+"streaming DataFrames/Datasets")
+
+case _: InsertIntoTable =>
+  throwError("InsertIntoTable is not supported with streaming 
DataFrames/Datasets")
+
+case Aggregate(_, _, child) if child.isStreaming && outputMode == 
Append =>
+  throwError(
+"Aggregations are not supported on streaming 
DataFrames/Datasets in " +
+  "Append output mode. Consider changing output mode to 
Update.")
+
+case Join(left, right, joinType, _) =>
+
+  joinType match {
+
+case Inner =>
+  throwErrorIf(
--- End diff --

Half of this file is putting the `if` as `





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-14 Thread marmbrus
Github user marmbrus commented on a diff in the pull request:

https://github.com/apache/spark/pull/12246#discussion_r59762181
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/OutputMode.scala ---
@@ -0,0 +1,23 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+trait OutputMode
--- End diff --

`sealed`
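Marking the trait `sealed` confines its implementations to the defining file, which lets the compiler check pattern matches over output modes for exhaustiveness. A stand-alone sketch (mirroring the PR's `Append`/`Update` names, otherwise hypothetical):

```scala
// With `sealed`, all subtypes must live in this file, so the compiler can
// verify the match below covers every OutputMode; adding a new mode later
// triggers a non-exhaustive-match warning until it is handled.
sealed trait OutputMode
case object Append extends OutputMode
case object Update extends OutputMode

def name(mode: OutputMode): String = mode match {
  case Append => "append"
  case Update => "update"
}
```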





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-12 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208752572
  
Test PASSed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55590/
Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-12 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208752569
  
Merged build finished. Test PASSed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-12 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208752097
  
**[Test build #55590 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55590/consoleFull)** for PR 12246 at commit [`6aa5554`](https://github.com/apache/spark/commit/6aa5554d4969c4c3b061194c3fe6d685d8912b92).
 * This patch passes all tests.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-11 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208720591
  
**[Test build #55590 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55590/consoleFull)** for PR 12246 at commit [`6aa5554`](https://github.com/apache/spark/commit/6aa5554d4969c4c3b061194c3fe6d685d8912b92).





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-11 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208662560
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-11 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208662558
  
**[Test build #55572 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55572/consoleFull)** for PR 12246 at commit [`64ed243`](https://github.com/apache/spark/commit/64ed24347de349e8e734ad50c11392eb17237d4e).
 * This patch **fails Scala style tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-14473][SQL] Define analysis rules to ca...

2016-04-11 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/12246#issuecomment-208662563
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55572/
Test FAILed.

