HyukjinKwon commented on a change in pull request #34931:
URL: https://github.com/apache/spark/pull/34931#discussion_r773008409



##########
File path: python/pyspark/pandas/frame.py
##########
@@ -8809,16 +8814,29 @@ def describe(self, percentiles: Optional[List[float]] = None) -> "DataFrame":
         max      3.0
         Name: numeric1, dtype: float64
         """
-        exprs = []
+        exprs_numeric = []
+        exprs_non_numeric = []
         column_labels = []
+        # For storing the name of non-numeric type columns
+        column_names = []
+        is_timestamp_types = []
         for label in self._internal.column_labels:
             psser = self._psser_for(label)
-            if isinstance(psser.spark.data_type, NumericType):
-                exprs.append(psser._dtype_op.nan_to_null(psser).spark.column)
+            spark_data_type = psser.spark.data_type
+            if isinstance(spark_data_type, (NumericType, TimestampType, TimestampNTZType)):
+                exprs_numeric.append(psser._dtype_op.nan_to_null(psser).spark.column)
                 column_labels.append(label)
-
-        if len(exprs) == 0:
-            raise ValueError("Cannot describe a DataFrame without columns")
+                # For checking if the column has timestamp type.
+                # We should handle the timestamp type differently from numeric type.
+                is_timestamp_types.append(
+                    (
+                        isinstance(spark_data_type, (TimestampType, TimestampNTZType)),

Review comment:
       Can we just add another list, called something like `exprs_timestamp`, and store the timestamp expressions there? The variable name `is_timestamp_types` holding pairs looks a bit weird.
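
       For example, something roughly like this (just a sketch of the suggested shape; the rest of the loop body and the later aggregation aren't in this hunk, so the details here are assumed):

```python
exprs_numeric = []
exprs_timestamp = []
column_labels = []

for label in self._internal.column_labels:
    psser = self._psser_for(label)
    spark_data_type = psser.spark.data_type
    if isinstance(spark_data_type, (TimestampType, TimestampNTZType)):
        # Keep timestamp expressions in their own list instead of
        # tracking (is_timestamp, name) pairs.
        exprs_timestamp.append(psser.spark.column)
        column_labels.append(label)
    elif isinstance(spark_data_type, NumericType):
        exprs_numeric.append(psser._dtype_op.nan_to_null(psser).spark.column)
        column_labels.append(label)
```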




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


