GitHub user BryanCutler commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19884#discussion_r155626249
  
    --- Diff: python/pyspark/sql/types.py ---
    @@ -1658,13 +1657,13 @@ def from_arrow_type(at):
             spark_type = FloatType()
         elif at == pa.float64():
             spark_type = DoubleType()
    -    elif type(at) == pa.DecimalType:
    +    elif pa.types.is_decimal(at):
             spark_type = DecimalType(precision=at.precision, scale=at.scale)
    -    elif at == pa.string():
    +    elif pa.types.is_string(at):
             spark_type = StringType()
         elif at == pa.date32():
             spark_type = DateType()
    -    elif type(at) == pa.TimestampType:
    +    elif pa.types.is_timestamp(at):
    --- End diff ---
    
    @icexelloss @wesm is this the recommended way to check the type id with the 
latest pyarrow? For types with a single bit width, I am using the `is_*` 
functions, like `is_timestamp`, but for others I still need to check object 
equality, such as `t == pa.date32()`, because there is no `is_date32()`, only 
`is_date()`.
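
    To make the distinction concrete, here is a minimal, self-contained sketch 
(the helper `describe_arrow_type` is hypothetical, not part of pyspark) of 
mixing the `pa.types.is_*` predicates with object-equality checks. Per the 
comment above, only the coarser `is_date()` predicate was available at the 
time, so the exact 32-bit date type is pinned down by equality instead:

```python
import pyarrow as pa

def describe_arrow_type(at):
    # Hypothetical helper contrasting the two styles of type check.
    if pa.types.is_timestamp(at):
        # Predicate matches any timestamp, regardless of unit or timezone.
        return "timestamp[%s]" % at.unit
    elif pa.types.is_decimal(at):
        # Predicate matches any precision/scale.
        return "decimal(%d, %d)" % (at.precision, at.scale)
    elif pa.types.is_string(at):
        return "string"
    elif at == pa.date32():
        # Object equality pins down the exact 32-bit date type;
        # pa.types.is_date(at) alone would also accept date64.
        return "date32"
    else:
        raise TypeError("Unsupported Arrow type: %s" % str(at))

print(describe_arrow_type(pa.timestamp("us")))    # timestamp[us]
print(describe_arrow_type(pa.decimal128(10, 2)))  # decimal(10, 2)
print(describe_arrow_type(pa.date32()))           # date32
```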

