HyukjinKwon commented on a change in pull request #29365:
URL: https://github.com/apache/spark/pull/29365#discussion_r466091828



##########
File path: python/pyspark/sql/types.py
##########
@@ -1032,7 +1032,13 @@ def _infer_schema(row, names=None):
     else:
         raise TypeError("Can not infer schema for type: %s" % type(row))
 
-    fields = [StructField(k, _infer_type(v), True) for k, v in items]
+    fields = []
+    for k, v in items:
+        try:
+            fields.append(StructField(k, _infer_type(v), True))
+        except TypeError as e:
+            e.args = ("Column {} contains {}".format(k, e.args[0]), )
+            raise e

Review comment:
       I think you can just raise a new exception here instead of appending a message onto the caught one. Python 3 has exception chaining, so the original exception message is preserved automatically, e.g.:
   
   ```python
   >>> try:
   ...     raise Exception("Hi")
   ... except:
   ...     raise Exception("Exception during hi")
   ...
   Traceback (most recent call last):
     File "<stdin>", line 2, in <module>
   Exception: Hi
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File "<stdin>", line 4, in <module>
   Exception: Exception during hi
   ```
   
   I would do, for example,
   
   ```python
   try:
       fields.append(StructField(k, _infer_type(v), True))
   except TypeError as e:
       raise TypeError("Unable to infer the type of the field {}.".format(k))
   ```
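   
   As a side note (not part of the suggested change above): the `except TypeError as e:` variable is unused in the snippet, because implicit chaining (`__context__`) happens regardless. If you want the causal link to be explicit, `raise ... from e` sets `__cause__` on the new exception. A minimal self-contained sketch, where `infer_field`/`infer_fields` are hypothetical stand-ins for the real `_infer_type`/`_infer_schema` logic:
   
   ```python
   def infer_field(k, v):
       # Hypothetical stand-in for _infer_type: handles only int here.
       if isinstance(v, int):
           return (k, "bigint")
       raise TypeError("not supported type: %s" % type(v))
   
   def infer_fields(items):
       fields = []
       for k, v in items:
           try:
               fields.append(infer_field(k, v))
           except TypeError as e:
               # Explicit chaining: sets __cause__ on the new exception,
               # so the traceback shows "The above exception was the
               # direct cause of the following exception:".
               raise TypeError(
                   "Unable to infer the type of the field {}.".format(k)) from e
       return fields
   
   try:
       infer_fields([("a", 1), ("b", object())])
   except TypeError as e:
       print(e)                   # Unable to infer the type of the field b.
       print(type(e.__cause__))   # <class 'TypeError'>
   ```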



