[jira] [Comment Edited] (SPARK-3940) SQL console prints error messages three times

2014-10-21 Thread XiaoJing wang (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-3940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14178108#comment-14178108 ]

XiaoJing wang edited comment on SPARK-3940 at 10/21/14 8:05 AM:


Hi [~pwendell], thanks for your reminder; I have corrected it.


was (Author: wangxj8):
Hi @Patrick Wendell, thanks for your reminder; I have corrected it.

> SQL console prints error messages three times
> -
>
> Key: SPARK-3940
> URL: https://issues.apache.org/jira/browse/SPARK-3940
> Project: Spark
>  Issue Type: Bug
>  Components: SQL
>Affects Versions: 1.1.0
>Reporter: XiaoJing wang
>Assignee: XiaoJing wang
>  Labels: patch
> Fix For: 1.2.0
>
>   Original Estimate: 0.05h
>  Remaining Estimate: 0.05h
>
> If a SQL statement contains an error, the console prints the error message three times. For example:
> {noformat}
> spark-sql> show tablesss;
> show tablesss;
> 14/10/13 20:56:29 INFO ParseDriver: Parsing command: show tablesss
> NoViableAltException(26@[598:1: ddlStatement : ( createDatabaseStatement | 
> switchDatabaseStatement | dropDatabaseStatement | createTableStatement | 
> dropTableStatement | truncateTableStatement | alterStatement | descStatement 
> | showStatement | metastoreCheck | createViewStatement | dropViewStatement | 
> createFunctionStatement | createMacroStatement | createIndexStatement | 
> dropIndexStatement | dropFunctionStatement | dropMacroStatement | 
> analyzeStatement | lockStatement | unlockStatement | createRoleStatement | 
> dropRoleStatement | grantPrivileges | revokePrivileges | showGrants | 
> showRoleGrants | grantRole | revokeRole );])
>   at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
>   at org.antlr.runtime.DFA.predict(DFA.java:144)
>   at 
> org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:1962)
>   at 
> org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1298)
>   at 
> org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:938)
>   at 
> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:190)
>   at 
> org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:161)
>   at org.apache.spark.sql.hive.HiveQl$.getAst(HiveQl.scala:218)
>   at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:226)
>   at 
> org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:50)
>   at 
> org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
>   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
>   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
>   at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
>   at 
> scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
>   at 
> scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
>   at 
> org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(SparkSQLParser.scala:31)
>   at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:130)
>   at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:130)
>   at 
> org.apache.spark.sql.catalyst.SparkSQLParser$$anonfun$org$apache$spark$sql$catalyst$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:184)
>   at 
> org.apache.spark.sql.catalyst.SparkSQLParser$$anonfun$org$apache$spark$sql$catalyst$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:183)
>   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
>   at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
>   a
> {noformat}
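For readers following the trace above: the triple printing presumably happens because the same parse exception is reported by more than one layer on its way out of the CLI loop. Below is a minimal, hypothetical Scala sketch, not the actual Spark patch for this ticket; the names SingleReportCli, runStatement, and reportOnce are invented for illustration. It shows the general shape of catching a statement failure once at the top of the read-eval loop and logging it a single time instead of rethrowing it to outer layers.

{code:scala}
import scala.io.StdIn
import scala.util.control.NonFatal

// Hypothetical stand-alone REPL loop illustrating single error reporting.
object SingleReportCli {

  // Stand-in for handing the statement to the SQL engine; a misspelled
  // statement such as "show tablesss" throws here, much like the Hive
  // parser does in the trace above.
  def runStatement(sql: String): Unit =
    if (sql.trim.stripSuffix(";").toLowerCase != "show tables")
      throw new IllegalArgumentException(s"cannot parse: $sql")

  // Report a failure exactly once; do not rethrow, so outer layers
  // (driver, logger, console) never print the same message again.
  def reportOnce(sql: String, e: Throwable): Unit =
    Console.err.println(s"Error in statement '$sql': ${e.getMessage}")

  def main(args: Array[String]): Unit = {
    var line = StdIn.readLine("spark-sql> ")
    while (line != null && line.trim != "exit;") {
      try runStatement(line)
      catch { case NonFatal(e) => reportOnce(line, e) }
      line = StdIn.readLine("spark-sql> ")
    }
  }
}
{code}

Running this and typing the misspelled statement from the report prints the error on stderr exactly once; the design point is simply that whichever layer catches the exception should either log it or rethrow it, not both.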
