[ https://issues.apache.org/jira/browse/SPARK-21187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Bryan Cutler resolved SPARK-21187.
----------------------------------
    Fix Version/s: 3.1.0
       Resolution: Fixed

With MapType now added, all basic types are supported. I moved nested timestamps/dates out to a separate issue, so I think we can resolve this now.

> Complete support for remaining Spark data types in Arrow Converters
> -------------------------------------------------------------------
>
>                 Key: SPARK-21187
>                 URL: https://issues.apache.org/jira/browse/SPARK-21187
>             Project: Spark
>          Issue Type: Umbrella
>          Components: PySpark, SQL
>    Affects Versions: 2.3.0
>            Reporter: Bryan Cutler
>            Assignee: Bryan Cutler
>            Priority: Major
>             Fix For: 3.1.0
>
>
> This is to track adding the remaining type support in Arrow Converters. Currently, only primitive data types are supported.
>
> Remaining types:
> * -*Date*-
> * -*Timestamp*-
> * *Complex*: -Struct-, -Array-, -Map-
> * -*Decimal*-
> * -*Binary*-
> * -*Categorical*- when converting from Pandas
>
> Some things to do before closing this out:
> * -Look into upgrading to Arrow 0.7 for better Decimal support (values can now be written as BigDecimal)-
> * -Add some user docs-
> * -Make sure the Python tests are thorough-
> * Check into the complex type support mentioned in the comments by [~leif]; should we support multi-indexing?

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org