[ 
https://issues.apache.org/jira/browse/SPARK-3810?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michael Armbrust resolved SPARK-3810.
-------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.2.0

Issue resolved by pull request 2672
[https://github.com/apache/spark/pull/2672]

> Rule PreInsertionCasts doesn't handle partitioned table properly
> ----------------------------------------------------------------
>
>                 Key: SPARK-3810
>                 URL: https://issues.apache.org/jira/browse/SPARK-3810
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Cheng Lian
>            Priority: Minor
>             Fix For: 1.2.0
>
>
> This issue can be reproduced by the following {{sbt/sbt hive/console}} session:
> {code}
> scala> loadTestTable("src")
> ...
> scala> loadTestTable("srcpart")
> ...
> scala> sql("INSERT INTO TABLE srcpart PARTITION (ds='hello', hr='world') SELECT key, value FROM src").queryExecution
> ...
> == Parsed Logical Plan ==
> InsertIntoTable (UnresolvedRelation None, srcpart, None), Map(ds -> Some(hello), hr -> Some(world)), false
>  Project ['key,'value]
>   UnresolvedRelation None, src, None
> == Analyzed Logical Plan ==
> InsertIntoTable (MetastoreRelation default, srcpart, None), Map(ds -> Some(hello), hr -> Some(world)), false
>  Project [key#50,value#51]
>   Project [key#50,value#51]
>    Project [key#50,value#51]
>     Project [key#50,value#51]
>      Project [key#50,value#51]
>       Project [key#50,value#51]
>        Project [key#50,value#51]
>         Project [key#50,value#51]
>          Project [key#50,value#51]
>           Project [key#50,value#51]
>            Project [key#50,value#51]
>             Project [key...
> {code}
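The pile of nested {{Project}} nodes above is the tell-tale of a rewrite rule that never reaches a fixed point: for a partitioned table the relation's output includes the partition columns ({{ds}}, {{hr}}), so the child's output never matches the table's and the rule wraps one more {{Project}} on every analyzer iteration. A minimal sketch of that mechanism, using a toy plan type rather than the real Catalyst API (all names here are hypothetical):

```scala
// Toy plan representation (NOT the real Catalyst API) illustrating how a
// non-idempotent rule stacks up Project nodes under a fixed-point analyzer.
sealed trait Plan { def output: Seq[String] }
case class Relation(output: Seq[String]) extends Plan
case class Project(output: Seq[String], child: Plan) extends Plan

object Repro {
  // For a partitioned table, the relation's output includes the partition
  // columns (ds, hr); the SELECT only produces key and value.
  val tableOutput = Seq("key", "value", "ds", "hr")

  // Buggy rule: the child's output can never equal tableOutput, so the rule
  // wraps yet another Project on every invocation instead of converging.
  def preInsertionCasts(plan: Plan): Plan =
    if (plan.output == tableOutput) plan
    else Project(plan.output, plan)

  // Emulate the analyzer's fixed-point loop, bounded by maxIterations.
  def analyze(plan: Plan, maxIterations: Int): Plan = {
    var current = plan
    var i = 0
    var done = false
    while (i < maxIterations && !done) {
      val next = preInsertionCasts(current)
      if (next == current) done = true else current = next
      i += 1
    }
    current
  }

  // Count the nesting depth of Project nodes.
  def depth(p: Plan): Int = p match {
    case Project(_, child) => 1 + depth(child)
    case _                 => 0
  }
}
```

Running the loop for 10 iterations on a single {{Project [key, value]}} over the source relation yields 11 nested {{Project}} nodes, matching the runaway nesting shown in the analyzed plan; a correct rule would ignore (or separately handle) the partition columns so the comparison succeeds and the loop stops.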



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
