cloud-fan commented on a change in pull request #31769:
URL: https://github.com/apache/spark/pull/31769#discussion_r589949658



##########
File path: docs/sql-migration-guide.md
##########
@@ -66,6 +66,8 @@ license: |
   - In Spark 3.2, the output schema of `SHOW TBLPROPERTIES` becomes `key: 
string, value: string` whether you specify the table property key or not. In 
Spark 3.1 and earlier, the output schema of `SHOW TBLPROPERTIES` is `value: 
string` when you specify the table property key. To restore the old schema with 
the builtin catalog, you can set `spark.sql.legacy.keepCommandOutputSchema` to 
`true`.
 
  - In Spark 3.2, we support typed literals in the partition spec of INSERT and ADD/DROP/RENAME PARTITION. For example, `ADD PARTITION(dt = date'2020-01-01')` adds a partition with the date value `2020-01-01`. In Spark 3.1 and earlier, the partition value is parsed as the string `date '2020-01-01'`, which is an illegal date value, so a partition with a null value is added instead.
+      
+  - In Spark 3.2, `DataFrameNaFunctions.replace()` no longer uses exact string match for the input column names. An input column name that contains a dot (and is not a nested column) needs to be escaped with a backtick \`. It now throws `AnalysisException` if the column is not found in the DataFrame schema, and `IllegalArgumentException` if the input column name refers to a nested column. In Spark 3.1 and earlier, invalid input column names and nested column names were silently ignored.

Review comment:
       looks good, maybe also explain a little bit why we need to make this 
change:
   ```
   ... no longer uses exact string match for the input column names, to match 
the
   SQL syntax and support qualified column names. ...
   ```
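
For illustration, here is a minimal Scala sketch of the Spark 3.2 behavior described in the new migration note. The table data and column names are made up for the example, and the exact error messages may differ; the point is the backtick escaping and the exceptions for missing or nested columns.

```scala
import org.apache.spark.sql.SparkSession

object NaReplaceSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("na-replace-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // A flat column whose name happens to contain a dot (it is not a nested field).
    val df = Seq(("unknown", 1), ("ok", 2)).toDF("status.raw", "id")

    // Spark 3.2: the dotted name must be escaped with backticks, because column
    // names are now resolved like in SQL rather than matched as exact strings.
    df.na.replace("`status.raw`", Map("unknown" -> "n/a")).show()

    // Spark 3.2: referencing a column that does not exist throws AnalysisException;
    // Spark 3.1 and earlier silently ignored it.
    // df.na.replace("no_such_column", Map("unknown" -> "n/a"))

    spark.stop()
  }
}
```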




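Similarly, a short sketch of the typed partition literal change mentioned in the same hunk. It assumes the default session catalog and an illustrative table name; under Spark 3.2 the `date'2020-01-01'` literal is evaluated to a proper date partition value, while Spark 3.1 and earlier would have treated it as the string `date '2020-01-01'`.

```scala
import org.apache.spark.sql.SparkSession

object TypedPartitionLiteralSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("typed-partition-literal-sketch")
      .master("local[*]")
      .getOrCreate()

    // Illustrative partitioned table; the name and schema are made up for this sketch.
    spark.sql("CREATE TABLE events (value STRING, dt DATE) USING parquet PARTITIONED BY (dt)")

    // Spark 3.2: the typed literal is evaluated, so a partition with date 2020-01-01 is added.
    // Spark 3.1 and earlier parsed it as the string "date '2020-01-01'" and added a null partition.
    spark.sql("ALTER TABLE events ADD PARTITION (dt = date'2020-01-01')")

    spark.sql("SHOW PARTITIONS events").show()

    spark.stop()
  }
}
```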
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


