Teng Qiu created SPARK-12014:
--------------------------------

             Summary: Spark SQL query containing semicolon is broken in Beeline (related to HIVE-11100)
                 Key: SPARK-12014
                 URL: https://issues.apache.org/jira/browse/SPARK-12014
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.5.2
            Reporter: Teng Qiu
            Priority: Minor
This is actually a known Hive issue: https://issues.apache.org/jira/browse/HIVE-11100 (patch available: https://reviews.apache.org/r/35907/diff/1). However, Spark uses its own Hive Maven dependencies (org.spark-project.hive), so we cannot apply that patch directly. It would be better if this were fixed in Spark's Hive package.

In Spark's Beeline, the error messages look like this:

{code}
0: jdbc:hive2://host:10000/> CREATE TABLE beeline_tb (c1 int, c2 string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n';
Error: org.apache.spark.sql.AnalysisException: mismatched input '<EOF>' expecting StringLiteral near 'BY' in table row format's field separator; line 1 pos 87 (state=,code=0)

0: jdbc:hive2://host:10000/> CREATE TABLE beeline_tb (c1 int, c2 string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\;' LINES TERMINATED BY '\n';
Error: org.apache.spark.sql.AnalysisException: mismatched input '<EOF>' expecting StringLiteral near 'BY' in table row format's field separator; line 1 pos 88 (state=,code=0)

0: jdbc:hive2://host:10000/> SELECT str_to_map(other_data,';','=')['u2'] FROM some_logs WHERE log_date = '20151125' limit 5;
Error: org.apache.spark.sql.AnalysisException: cannot recognize input near '<EOF>' '<EOF>' '<EOF>' in select expression; line 1 pos 30 (state=,code=0)

0: jdbc:hive2://host:10000/> SELECT str_to_map(other_data,'\;','=')['u2'] FROM some_logs WHERE log_date = '20151125' limit 5;
Error: org.apache.spark.sql.AnalysisException: cannot recognize input near '<EOF>' '<EOF>' '<EOF>' in select expression; line 1 pos 31 (state=,code=0)
{code}
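For context on what the fix needs to do: Beeline splits the input line on every ';' before sending the statement to the server, without checking whether the semicolon sits inside a quoted string literal, so the server only receives the truncated fragment (hence the "mismatched input '<EOF>'" errors above). The HIVE-11100 patch makes this splitting quote-aware. Below is a minimal, hypothetical Java sketch of that idea, not the actual patch code from https://reviews.apache.org/r/35907/diff/1; the class and method names are made up for illustration:

{code}
import java.util.ArrayList;
import java.util.List;

// Sketch only: quote-aware statement splitting, the kind of logic
// HIVE-11100 adds to Beeline. Not the actual Hive/Beeline code.
public class QuoteAwareSplitter {

  // Splits a command line on ';' but ignores semicolons inside quoted
  // literals and semicolons escaped with a backslash.
  public static List<String> splitStatements(String line) {
    List<String> statements = new ArrayList<>();
    StringBuilder current = new StringBuilder();
    char quote = 0;           // 0 = not inside a quoted literal
    boolean escaped = false;  // previous char was a backslash

    for (char c : line.toCharArray()) {
      if (escaped) {
        current.append(c);    // keep the escaped char verbatim, e.g. \;
        escaped = false;
      } else if (c == '\\') {
        current.append(c);
        escaped = true;
      } else if (quote != 0) {
        current.append(c);
        if (c == quote) quote = 0;           // closing quote
      } else if (c == '\'' || c == '"') {
        current.append(c);
        quote = c;                           // opening quote
      } else if (c == ';') {
        statements.add(current.toString());  // statement boundary
        current.setLength(0);
      } else {
        current.append(c);
      }
    }
    if (current.length() > 0) statements.add(current.toString());
    return statements;
  }

  public static void main(String[] args) {
    // The ';' inside the string literal no longer splits the statement:
    String line = "CREATE TABLE beeline_tb (c1 int, c2 string) "
        + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ';' LINES TERMINATED BY '\\n';";
    System.out.println(splitStatements(line));
    // -> one complete CREATE TABLE statement instead of two fragments
  }
}
{code}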