YuanGuanhu created SPARK-35359:
----------------------------------

             Summary: Insert into char/varchar column fails when data exceeds the length limitation
                 Key: SPARK-35359
                 URL: https://issues.apache.org/jira/browse/SPARK-35359
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.1.1
            Reporter: YuanGuanhu
Spark 3.1.1 adds support for the Char/Varchar types, but inserting data into a char/varchar column now fails if the data length exceeds the declared length limitation. We should have a configuration option to stay compatible with versions earlier than Spark 3.1.
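A minimal reproduction sketch of the reported behavior (the table name, column width, and inserted value are illustrative, not taken from the report):

    CREATE TABLE tbl_char (c CHAR(5)) USING parquet;
    -- In Spark 3.1.1 this insert fails at runtime because the value is longer
    -- than char(5); versions before 3.1 treated char/varchar as plain string
    -- and accepted it without a length check.
    INSERT INTO tbl_char VALUES ('Spark SQL');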