cloud-fan commented on code in PR #45612:
URL: https://github.com/apache/spark/pull/45612#discussion_r1533184546


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AnsiTypeCoercion.scala:
##########
@@ -180,56 +180,27 @@ object AnsiTypeCoercion extends TypeCoercionBase {
       // cast the input to decimal.
       case (n: NumericType, DecimalType) => Some(DecimalType.forType(n))
 
-      // Cast null type (usually from null literals) into target types
-      // By default, the result type is `target.defaultConcreteType`. When the target type is
-      // `TypeCollection`, there is another branch to find the "closest convertible data type" below.
-      case (NullType, target) if !target.isInstanceOf[TypeCollection] =>
-        Some(target.defaultConcreteType)
-
       // If a function expects a StringType, no StringType instance should be implicitly cast to
       // StringType with a collation that's not accepted (aka. lockdown unsupported collations).
       case (_: StringType, StringType) => None
       case (_: StringType, _: StringTypeCollated) => None
 
-      // This type coercion system will allow implicit converting String type as other
-      // primitive types, in case of breaking too many existing Spark SQL queries.
-      case (StringType, a: AtomicType) =>
-        Some(a)
-
-      // If the target type is any Numeric type, convert the String type as Double type.
-      case (StringType, NumericType) =>
-        Some(DoubleType)
-
-      // If the target type is any Decimal type, convert the String type as the default
-      // Decimal type.
-      case (StringType, DecimalType) =>
-        Some(DecimalType.SYSTEM_DEFAULT)
-
-      // If the target type is any timestamp type, convert the String type as the default
-      // Timestamp type.
-      case (StringType, AnyTimestampType) =>
-        Some(AnyTimestampType.defaultConcreteType)
-
-      case (DateType, AnyTimestampType) =>
-        Some(AnyTimestampType.defaultConcreteType)
-
-      case (_, target: DataType) =>
-        if (Cast.canANSIStoreAssign(inType, target)) {
-          Some(target)
+      // Ideally the implicit cast rule should be the same as `Cast.canANSIStoreAssign` so that it's
+      // consistent with table insertion. To avoid breaking too many existing Spark SQL queries,
+      // we make the system allow implicitly converting String type to other primitive types.
+      case (StringType, a @ (_: AtomicType | NumericType | DecimalType | AnyTimestampType)) =>
+        Some(a.defaultConcreteType)
+
+      // When the target type is `TypeCollection`, there is another branch to find the
+      // "closest convertible data type" below.
+      case (_, target) if !target.isInstanceOf[TypeCollection] =>
+        val concreteType = target.defaultConcreteType
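
The consolidated `StringType` arm above is behavior-preserving: each abstract target's `defaultConcreteType` is exactly what the four removed branches returned (`DoubleType` for `NumericType`, `DecimalType.SYSTEM_DEFAULT` for `DecimalType`, the default timestamp type for `AnyTimestampType`, and the type itself for a concrete `AtomicType`). A minimal, self-contained sketch of that dispatch, using hypothetical stand-ins rather than Spark's real classes:
```
// Hypothetical stand-ins for Spark's type hierarchy, for illustration only.
object StringCoercionSketch {
  sealed trait AbstractType { def defaultConcreteType: ConcreteType }
  // Mirrors DataType's default implementation: a concrete type is its own default.
  sealed trait ConcreteType extends AbstractType { def defaultConcreteType: ConcreteType = this }

  case object DoubleT    extends ConcreteType
  case object DecimalT   extends ConcreteType // stand-in for DecimalType.SYSTEM_DEFAULT
  case object TimestampT extends ConcreteType

  case object NumericTarget   extends AbstractType { def defaultConcreteType = DoubleT }
  case object DecimalTarget   extends AbstractType { def defaultConcreteType = DecimalT }
  case object TimestampTarget extends AbstractType { def defaultConcreteType = TimestampT }

  // The single rule replacing the four old branches: cast a string input to
  // the target's default concrete type.
  def implicitCastOfString(target: AbstractType): Option[ConcreteType] =
    Some(target.defaultConcreteType)

  def main(args: Array[String]): Unit = {
    assert(implicitCastOfString(NumericTarget)   == Some(DoubleT))    // old: Some(DoubleType)
    assert(implicitCastOfString(DecimalTarget)   == Some(DecimalT))   // old: Some(SYSTEM_DEFAULT)
    assert(implicitCastOfString(TimestampTarget) == Some(TimestampT)) // old: default timestamp type
    assert(implicitCastOfString(DoubleT)         == Some(DoubleT))    // old: Some(a) for AtomicType
  }
}
```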

Review Comment:
   It's the default implementation in `DataType`:
   ```
   override private[sql] def defaultConcreteType: DataType = this
   ```
   
   Maybe I should just make it `final`?
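   
   For reference, a minimal sketch of what making it `final` could look like (simplified and hypothetical, not the full `org.apache.spark.sql.types` hierarchy):
   ```
   package org.apache.spark.sql
   
   abstract class AbstractDataType {
     private[sql] def defaultConcreteType: DataType
   }
   
   abstract class DataType extends AbstractDataType {
     // `final` guarantees that no DataType subclass can override this, so every
     // concrete data type stays its own default concrete type; only non-DataType
     // AbstractDataTypes (e.g. TypeCollection) would supply their own.
     final override private[sql] def defaultConcreteType: DataType = this
   }
   ```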



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

