This is not supported yet. It would be great if you could open a JIRA (though I think the Apache JIRA is down at the moment).
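In the meantime, a common workaround for an IN-subquery in HiveQL is to rewrite it as a LEFT SEMI JOIN, which the 1.x parser does handle. The sketch below uses SQLite rather than Spark purely to show that the two query forms return the same rows; the `sparkbug` table and `customerid` column mirror the example from the quoted message, and the sample data is made up.

```python
import sqlite3

# Tiny in-memory table shaped like the example in the thread.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sparkbug (customerid INTEGER)")
cur.executemany("INSERT INTO sparkbug VALUES (?)", [(1,), (2,), (3,), (4,)])

# Original form: a subquery inside an IN clause. This is the shape that
# Spark SQL's HiveQL parser rejects with TOK_SUBQUERY_EXPR.
in_rows = cur.execute(
    "SELECT customerid FROM sparkbug "
    "WHERE customerid IN (SELECT customerid FROM sparkbug "
    "                     WHERE customerid IN (2, 3))"
).fetchall()

# Workaround: express the predicate as a join. In HiveQL this would be
# a LEFT SEMI JOIN; SQLite lacks that syntax, so an inner join against
# the DISTINCT keys of the subquery is the equivalent rewrite here.
join_rows = cur.execute(
    "SELECT s.customerid FROM sparkbug s "
    "JOIN (SELECT DISTINCT customerid FROM sparkbug "
    "      WHERE customerid IN (2, 3)) t "
    "ON s.customerid = t.customerid"
).fetchall()

print(sorted(in_rows) == sorted(join_rows))  # the two forms agree
```

The DISTINCT in the derived table matters: a plain inner join would duplicate rows when the subquery produces the same key more than once, whereas IN and LEFT SEMI JOIN both return each matching outer row exactly once.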
On Tue, Nov 4, 2014 at 9:40 AM, Terry Siu <terry....@smartfocus.com> wrote:
> I’m trying to execute a subquery inside an IN clause and am encountering
> an unsupported language feature in the parser.
>
> java.lang.RuntimeException: Unsupported language features in query:
> select customerid from sparkbug where customerid in (select customerid
> from sparkbug where customerid in (2,3))
>
> TOK_QUERY
>   TOK_FROM
>     TOK_TABREF
>       TOK_TABNAME
>         sparkbug
>   TOK_INSERT
>     TOK_DESTINATION
>       TOK_DIR
>         TOK_TMP_FILE
>     TOK_SELECT
>       TOK_SELEXPR
>         TOK_TABLE_OR_COL
>           customerid
>     TOK_WHERE
>       TOK_SUBQUERY_EXPR
>         TOK_SUBQUERY_OP
>           in
>         TOK_QUERY
>           TOK_FROM
>             TOK_TABREF
>               TOK_TABNAME
>                 sparkbug
>           TOK_INSERT
>             TOK_DESTINATION
>               TOK_DIR
>                 TOK_TMP_FILE
>             TOK_SELECT
>               TOK_SELEXPR
>                 TOK_TABLE_OR_COL
>                   customerid
>             TOK_WHERE
>               TOK_FUNCTION
>                 in
>                 TOK_TABLE_OR_COL
>                   customerid
>                 2
>                 3
>         TOK_TABLE_OR_COL
>           customerid
>
> scala.NotImplementedError: No parse rules for ASTNode type: 817, text:
> TOK_SUBQUERY_EXPR :
> TOK_SUBQUERY_EXPR
>   TOK_SUBQUERY_OP
>     in
>   TOK_QUERY
>     TOK_FROM
>       TOK_TABREF
>         TOK_TABNAME
>           sparkbug
>     TOK_INSERT
>       TOK_DESTINATION
>         TOK_DIR
>           TOK_TMP_FILE
>       TOK_SELECT
>         TOK_SELEXPR
>           TOK_TABLE_OR_COL
>             customerid
>       TOK_WHERE
>         TOK_FUNCTION
>           in
>           TOK_TABLE_OR_COL
>             customerid
>           2
>           3
>   TOK_TABLE_OR_COL
>     customerid
> " +
> org.apache.spark.sql.hive.HiveQl$.nodeToExpr(HiveQl.scala:1098)
>
>         at scala.sys.package$.error(package.scala:27)
>         at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:252)
>         at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:50)
>         at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
>         at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
>
> Are subqueries in predicates just not supported in 1.2? I think I’m
> seeing the same issue as:
>
> http://apache-spark-user-list.1001560.n3.nabble.com/Subquery-in-having-clause-Spark-1-1-0-td17401.html
>
> Thanks,
> -Terry