itholic commented on code in PR #37966: URL: https://github.com/apache/spark/pull/37966#discussion_r978243957
########## python/pyspark/sql/functions.py: ##########
@@ -164,13 +166,24 @@ def lit(col: Any) -> Column:
     +--------------+
     |     [1, 2, 3]|
     +--------------+
+
+    Create a literal from numpy.ndarray.
+
+    >>> spark.range(1).select(lit(np.ndarray([1, 2]))).show(truncate=False)
+    +------------------------------------------------------+
+    |array(array(2.058335917824E-312, 2.334195370625E-312))|
+    +------------------------------------------------------+
+    |[[2.058335917824E-312, 2.334195370625E-312]]          |

Review Comment:
   The type in this example seems correct, since `np.ndarray([1, 2])` returns `np.float64` as below:
   ```python
   >>> np.ndarray([1, 2])
   array([[2.05833592e-312, 2.33419537e-312]])
   >>> np.ndarray([1, 2]).dtype
   dtype('float64')
   >>> spark.range(1).select(lit(np.ndarray([1, 2]))).dtypes
   [('array(array(6.94998724E-310, 6.95001394E-310))', 'array<array<double>>')]
   ```
   However, NumPy types are not respected in other cases, as below:
   ```python
   >>> spark.range(1).select(lit(np.array([[np.int8(1), np.int8(2)]]))).dtypes
   [('array(array(1, 2))', 'array<array<int>>')]
   ```
   This should return `smallint` rather than `int` when the given type is `np.int8`:
   ```python
   >>> spark.range(1).select(lit(np.array([[np.int8(1), np.int8(2)]]))).dtypes
   [('ARRAY(ARRAY(1S, 2S))', 'array<array<smallint>>')]
   ```
   So, I think the PR title should also be updated to something like "Support np.ndarray for `functions.lit` for multiple dimensions", since we currently respect the NumPy type only for the single-dimension case. Let me address it.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
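The dtype expectation in the review above can be sketched without Spark at all: the element type of a `lit()` result should follow the array's NumPy dtype. The mapping and function names below (`NUMPY_TO_SPARK`, `spark_element_type`) are hypothetical illustrations of that expectation, not PySpark's actual implementation.

```python
import numpy as np

# Hypothetical NumPy dtype -> Spark SQL type-name mapping, following the
# expected output shown in the review (np.int8 arrays yielding `1S`
# literals, i.e. smallint). This is NOT PySpark's real lookup table.
NUMPY_TO_SPARK = {
    np.dtype("int8"): "smallint",   # per the review's expected `smallint`
    np.dtype("int16"): "smallint",
    np.dtype("int32"): "int",
    np.dtype("int64"): "bigint",
    np.dtype("float32"): "float",
    np.dtype("float64"): "double",
}

def spark_element_type(arr: np.ndarray) -> str:
    """Return the Spark SQL element type a lit() of this array would be
    expected to produce, based only on the array's dtype."""
    return NUMPY_TO_SPARK[arr.dtype]

# np.ndarray([1, 2]) allocates an uninitialized 1x2 float64 array, so its
# element type maps to "double" -- matching the doctest in the diff.
print(spark_element_type(np.ndarray([1, 2])))                     # double
# A 2-D int8 array should map to "smallint", not be widened to "int".
print(spark_element_type(np.array([[np.int8(1), np.int8(2)]])))   # smallint
```

The point of keying on `arr.dtype` rather than inspecting individual elements is exactly the reviewer's complaint: for multi-dimensional arrays the element dtype must survive the conversion, instead of being re-inferred (and widened) from plain Python values.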
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org