Github user BryanCutler commented on the issue:

    https://github.com/apache/spark/pull/22913
  
    I'm a little against adding this because the Arrow Java Vectors used so far 
were chosen to match Spark's internal data, to keep things simple and avoid 
lots of conversions on the Java side. Conversions to supported types are done 
before the data is read in Java.
    
    For instance, Arrow defines timestamp types for several other time units, but 
in Java we only accept microseconds with a timezone (to match Spark) and do any 
necessary conversions in Python before writing the Arrow data. Can the same be 
done in R?
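
    To illustrate the kind of conversion I mean, here is a minimal pyarrow sketch 
(illustrative only, not Spark's actual serializer code; the column name and 
values are made up) that coerces a timestamp column down to microseconds with a 
timezone before building the record batch:

    ```python
    import datetime

    import pyarrow as pa

    # Suppose the source column arrived as nanosecond timestamps with no timezone,
    # one of the many Arrow timestamp variants the Java side does not accept.
    naive_ns = pa.array(
        [datetime.datetime(2018, 11, 1, 12, 30, 45, 123456)],
        type=pa.timestamp("ns"),
    )

    # Cast to the single type the Java vectors expect: microsecond precision
    # with a timezone attached (UTC here, purely for illustration).
    micros_utc = naive_ns.cast(pa.timestamp("us", tz="UTC"))

    batch = pa.RecordBatch.from_arrays([micros_utc], names=["ts"])
    print(batch.schema)  # ts: timestamp[us, tz=UTC]
    ```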

