Zhenhua Wang created SPARK-11398:
------------------------------------

             Summary: misleading dialect conf at the start of spark-sql
                 Key: SPARK-11398
                 URL: https://issues.apache.org/jira/browse/SPARK-11398
             Project: Spark
          Issue Type: Bug
          Components: SQL
            Reporter: Zhenhua Wang
            Priority: Minor
When we start bin/spark-sql, the default context is HiveContext, and the corresponding dialect is hiveql. However, if we type "set spark.sql.dialect;", the result is "sql", which is inconsistent with the actual dialect and therefore misleading. For example, we can create tables, which is only allowed in hiveql, yet this dialect conf reports "sql". Although this problem does not cause any execution error, it misleads Spark SQL users, so I think we should fix it.
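A minimal reproduction sketch (the exact echo format of the set command and the table name are illustrative and may differ by version):

  $ bin/spark-sql
  spark-sql> set spark.sql.dialect;
  spark.sql.dialect	sql
  spark-sql> CREATE TABLE t1 (id INT);
  -- a hiveql-only statement, yet it succeeds while the conf above claims "sql"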