Hello, I am using Spark 1.2.1 with Hive 0.13.1. I run Hive queries through beeline and the Thrift server. The queries I have tested so far work well, except the following: I want to export query output to a file on either HDFS or the local filesystem (ideally the local filesystem). Is this not yet supported? The Spark GitHub repository already has unit tests using "insert overwrite directory": https://github.com/apache/spark/blob/master/sql/hive/src/test/resources/ql/src/test/queries/clientpositive/insert_overwrite_local_directory_1.q.
$ insert overwrite directory '<hdfs directory name>' select * from temptable;

TOK_QUERY
  TOK_FROM
    TOK_TABREF
      TOK_TABNAME temptable
  TOK_INSERT
    TOK_DESTINATION
      TOK_DIR '/user/ogoh/table'
    TOK_SELECT
      TOK_SELEXPR
        TOK_ALLCOLREF

scala.NotImplementedError: No parse rules for: TOK_DESTINATION TOK_DIR '/user/bob/table'

$ insert overwrite local directory '<local directory name>' select * from temptable;

TOK_QUERY
  TOK_FROM
    TOK_TABREF
      TOK_TABNAME temptable
  TOK_INSERT
    TOK_DESTINATION
      TOK_LOCAL_DIR "/user/bob/table"
    TOK_SELECT
      TOK_SELEXPR
        TOK_ALLCOLREF

scala.NotImplementedError: No parse rules for: TOK_DESTINATION TOK_LOCAL_DIR "/user/ogoh/table"

Thanks,
Okehee

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-supports-hive-insert-overwrite-directory-tp21951.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
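Until the parser accepts INSERT OVERWRITE [LOCAL] DIRECTORY, one possible workaround is to run the query through the programmatic API and write the result out yourself with saveAsTextFile. Below is a minimal, untested sketch assuming a Spark 1.2.x HiveContext (where sql() returns a SchemaRDD whose Rows behave like sequences), an existing table named temptable, and a placeholder output path; the tab delimiter is just an illustrative choice:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object ExportQueryOutput {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("export-query-output"))
    val hiveContext = new HiveContext(sc)

    // Run the query through the API instead of INSERT OVERWRITE DIRECTORY.
    val result = hiveContext.sql("SELECT * FROM temptable")

    // Flatten each Row to a tab-delimited line and write the lines out.
    // Use an HDFS path here, or a file:// URI for the local filesystem.
    result.map(_.mkString("\t")).saveAsTextFile("/user/bob/table")

    sc.stop()
  }
}
```

The output lands as part files under the given directory, much like what INSERT OVERWRITE DIRECTORY would produce; for a single local file you would still need to merge the parts (e.g. with hadoop fs -getmerge).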