gengliangwang opened a new pull request #25581: [SPARK-28495][SQL] Introduce ANSI store assignment policy for table insertion
URL: https://github.com/apache/spark/pull/25581
 
 
   
### What changes were proposed in this pull request?
Introduce an ANSI store assignment policy for table insertion. With the ANSI policy, Spark performs the type coercion of table insertion as per the ANSI SQL standard.
   
### Why are the changes needed?
In Spark version 2.4 and earlier, when inserting into a table, Spark casts the data types of the input query to the data types of the target table by type coercion. This can be very confusing, e.g. a user who mistakenly writes string values to an int column gets a silent cast rather than an error.
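As an illustration, here is a minimal spark-shell sketch of that legacy behavior (the table name and format are arbitrary): the invalid cast yields NULL instead of failing the insertion.

```scala
// Legacy (pre-3.0) behavior: the string is silently cast to INT,
// producing NULL instead of an error.
spark.sql("CREATE TABLE ints (i INT) USING parquet")
spark.sql("INSERT INTO ints VALUES ('forty-two')")  // accepted without complaint
spark.sql("SELECT * FROM ints").show()
// +----+
// |   i|
// +----+
// |null|
// +----+
```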
   
In data source V2, by default, only upcasting is allowed when inserting data into a table, e.g. int -> long and int -> string are allowed, while decimal -> double and long -> int are not. The rules of UpCast were originally created for Dataset type coercion; they are quite strict and differ from the behavior of all existing popular DBMSs. This is a breaking change: existing queries may break after the 3.0 release.
   
Following the ANSI SQL standard makes Spark consistent with the table insertion behavior of popular DBMSs such as PostgreSQL, Oracle, and MySQL.
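A sketch of what the new policy could look like in use, contrasted with the upcast-only rules above. The config key `spark.sql.storeAssignmentPolicy` is an assumption about how the mode is exposed, based on the final shape of this feature rather than on the text of this PR.

```scala
// Assumed config key; illustrative only, not stated in this PR description.
spark.conf.set("spark.sql.storeAssignmentPolicy", "ANSI")

spark.sql("CREATE TABLE doubles (d DOUBLE) USING parquet")

// decimal -> double: allowed under ANSI store assignment, even though the
// strict upcast-only rules of data source V2 would reject it.
spark.sql("INSERT INTO doubles SELECT CAST(1.23 AS DECIMAL(10, 2))")

// string -> int: rejected at analysis time under ANSI, instead of being
// silently cast to NULL as in the legacy behavior.
// spark.sql("INSERT INTO ints VALUES ('forty-two')")  // would now fail
```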
   
   
### Does this PR introduce any user-facing change?
Yes, a new optional store assignment mode for table insertion.
   
   
### How was this patch tested?
Unit tests.
   
