zhengruifeng commented on code in PR #42783:
URL: https://github.com/apache/spark/pull/42783#discussion_r1314365390


##########
python/pyspark/sql/functions.py:
##########
@@ -15748,6 +15749,33 @@ def java_method(*cols: "ColumnOrName") -> Column:
     return _invoke_function_over_seq_of_columns("java_method", cols)
 
 
+@try_remote_functions
+def try_reflect(*cols: "ColumnOrName") -> Column:
+    """
+    This is a special version of `reflect` that performs the same operation, but returns a NULL
+    value instead of raising an error if the invoked method throws an exception.
+
+
+    .. versionadded:: 4.0.0
+
+    Parameters
+    ----------
+    cols : :class:`~pyspark.sql.Column` or str
+        the first element should be a literal string for the class name,
+        the second element should be a literal string for the method name,
+        and the remaining elements are the input arguments to the Java method.
+
+    Examples
+    --------
+    >>> df = spark.createDataFrame([("a5cf6c42-0c85-418f-af6c-3e4e5b1328f2",)], ["a"])
+    >>> df.select(
+    ...     try_reflect(lit("java.util.UUID"), lit("fromString"), df.a).alias('r')
+    ... ).collect()
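
For context, a minimal sketch of the behavior the docstring describes, assuming the `try_reflect` API added in this PR on Spark 4.0.0+ (illustration only, not code from the diff): when the reflected Java method throws, the row yields NULL instead of failing the query.

```python
# Sketch (not part of this PR's diff), assuming Spark 4.0.0+ where try_reflect
# is available: UUID.fromString throws for the malformed string, so try_reflect
# returns NULL for that row instead of raising an error.
from pyspark.sql import SparkSession
from pyspark.sql import functions as sf

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [("a5cf6c42-0c85-418f-af6c-3e4e5b1328f2",), ("not-a-uuid",)], ["a"]
)
df.select(
    sf.try_reflect(sf.lit("java.util.UUID"), sf.lit("fromString"), df.a)
).show(truncate=False)
```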

Review Comment:
   ```suggestion
       >>> from pyspark.sql import functions as sf
       >>> df = spark.createDataFrame([("a5cf6c42-0c85-418f-af6c-3e4e5b1328f2",)], ["a"])
       >>> df.select(
       ...     sf.try_reflect(sf.lit("java.util.UUID"), sf.lit("fromString"), df.a)
       ... ).show()
   ```
   
   For the new docstring, please use `from pyspark.sql import functions as sf` for the import.
   
   Also, please do not use `.alias('r')`, so that we can also check whether the default output column name is consistent with SQL.
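   
   To make that consistency check concrete, a small sketch, assuming Spark 4.0.0+ where `try_reflect` is exposed both as a SQL function and in `pyspark.sql.functions` (names here are illustrative, not from the PR): without `.alias()`, the DataFrame column name can be compared directly against the SQL default.
   
   ```python
   # Sketch of the column-name consistency check (assumes Spark 4.0.0+, where
   # try_reflect exists both as a SQL function and in pyspark.sql.functions).
   from pyspark.sql import SparkSession
   from pyspark.sql import functions as sf
   
   spark = SparkSession.builder.getOrCreate()
   df = spark.createDataFrame([("a5cf6c42-0c85-418f-af6c-3e4e5b1328f2",)], ["a"])
   df.createOrReplaceTempView("t")
   
   python_side = df.select(
       sf.try_reflect(sf.lit("java.util.UUID"), sf.lit("fromString"), df.a)
   )
   sql_side = spark.sql("SELECT try_reflect('java.util.UUID', 'fromString', a) FROM t")
   
   # Without .alias(), both should report the same default column name.
   print(python_side.columns)
   print(sql_side.columns)
   ```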


