Hi,

It seems that Spark SQL, even with HiveContext, does not support SQL statements like:

SELECT category, count(1) AS cnt FROM products GROUP BY category HAVING cnt > 10;
I get this exception:

Error: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Unresolved attributes: CAST(('cnt < 2), BooleanType), tree:

I couldn't find anywhere in the documentation whether the HAVING keyword is supported. If it is not supported, what would be the workaround? Using two nested SELECT statements?

best,
/Shahab
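For what it's worth, the "Unresolved attributes" error suggests the HAVING clause cannot resolve the SELECT-list alias cnt, rather than HAVING itself being unsupported. A sketch of two standard SQL workarounds, reusing the products table and category column from the query above (the subquery alias t is made up for illustration):

```sql
-- Workaround 1: repeat the aggregate expression in HAVING
-- instead of referencing the alias cnt
SELECT category, count(1) AS cnt
FROM products
GROUP BY category
HAVING count(1) > 10;

-- Workaround 2: do the aggregation in a subquery, then
-- filter on the alias in the outer query
SELECT category, cnt
FROM (
  SELECT category, count(1) AS cnt
  FROM products
  GROUP BY category
) t
WHERE cnt > 10;
```

Whether the first form works may depend on the Spark version; the nested-SELECT form should work wherever subqueries in FROM are supported.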