[ https://issues.apache.org/jira/browse/SPARK-5327?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14283190#comment-14283190 ]

Cheng Lian commented on SPARK-5327:
-----------------------------------

We should add a dedicated Jenkins builder for Hive 0.12.0.
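
A dedicated builder would presumably run the Hive test suites against the hive-0.12.0 profile, along the lines of the reproduction command in the description, e.g.:

{code}
./build/sbt -Pyarn,hive,hive-thriftserver,hive-0.12.0,hadoop-2.4,scala-2.10 -Dhadoop.version=2.4.1 hive/test
{code}

(The exact profile set and test target here are assumptions based on the reproduction steps below, not a confirmed builder configuration.)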

> HiveCompatibilitySuite fails when executed against Hive 0.12.0
> --------------------------------------------------------------
>
>                 Key: SPARK-5327
>                 URL: https://issues.apache.org/jira/browse/SPARK-5327
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0, 1.2.1
>            Reporter: Cheng Lian
>
> Git commit: e7884bc950a374408959b6118efe2c62fbe50608
> Run the following SBT session to reproduce:
> {code}
> $ ./build/sbt -Pyarn,hive,hive-thriftserver,hive-0.12.0,hadoop-2.4,scala-2.10 -Dhadoop.version=2.4.1
> ...
> > hive/test-only *.HiveCompatibilitySuite -- -z create_view_translate
> ...
> [info] - create_view_translate *** FAILED *** (9 seconds, 216 milliseconds)
> [info]   Failed to execute query using catalyst:
> [info]   Error: Failed to parse: SELECT `items`.`id`, items`items`.`info`info['price'] FROM `default`.`items`
> [info]   org.apache.spark.sql.hive.HiveQl$ParseException: Failed to parse: SELECT `items`.`id`, items`items`.`info`info['price'] FROM `default`.`items`
> [info]          at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:249)
> [info]          at org.apache.spark.sql.hive.HiveQl$.createPlanForView(HiveQl.scala:275)
> [info]          at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:151)
> ...
> {code}
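> For reference, a hedged sketch of what this test exercises: the table and view below follow Hive's create_view_translate.q test (a table with a map column, and a view projecting a nested field). The Scala driver code is illustrative only, not the suite's actual code:
> {code}
> // Hypothetical standalone reproduction; assumes a SparkContext `sc` from a
> // spark-shell built with the hive-0.12.0 profile.
> import org.apache.spark.sql.hive.HiveContext
>
> val hiveContext = new HiveContext(sc)
>
> hiveContext.sql("CREATE TABLE items (id INT, info MAP<STRING, STRING>)")
> hiveContext.sql("CREATE VIEW priced AS SELECT items.id, items.info['price'] FROM items")
>
> // Selecting from the view makes HiveMetastoreCatalog.lookupRelation invoke
> // HiveQl.createPlanForView on the view text expanded by the Hive 0.12.0
> // metastore, which is where the ParseException above is thrown.
> hiveContext.sql("SELECT * FROM priced").collect()
> {code}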
> It seems that something goes wrong when dealing with nested field access; Hive 0.13.1 is unaffected.
> Some other test cases also fail when executed against Hive 0.12.0; I will list them later.


