[ https://issues.apache.org/jira/browse/SPARK-27124?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16789376#comment-16789376 ]
Gabor Somogyi commented on SPARK-27124:
---------------------------------------

{quote}This is reachable via Py4J FWIW{quote}
How exactly? Maybe the documentation just has to be extended to highlight this possibility (see the sketch below the quoted issue description). From a use-case perspective I've seen many users struggling with Avro and reporting problems, but most of the time the cause was a wrong schema. I would like to mitigate this somehow.

> Expose org.apache.spark.sql.avro.SchemaConverters as developer API
> ------------------------------------------------------------------
>
>                 Key: SPARK-27124
>                 URL: https://issues.apache.org/jira/browse/SPARK-27124
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, SQL
>    Affects Versions: 3.0.0
>            Reporter: Gabor Somogyi
>            Priority: Minor
>
> org.apache.spark.sql.avro.SchemaConverters provides extremely useful APIs to
> convert schemas between Spark SQL and Avro. This is reachable from the Scala
> side but not from PySpark. I suggest adding this as a developer API to ease
> development for PySpark users.
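For reference, a minimal sketch of what "reachable via Py4J" could look like from a PySpark shell. This is an assumption, not a documented API: it relies on the internal spark._jvm handle, on the spark-avro module being on the classpath (e.g. started with --packages org.apache.spark:spark-avro_2.11:2.4.0), and on an illustrative Avro schema made up for the example.

{code:python}
import json
from pyspark.sql.types import StructType

# Hypothetical Avro schema, used only for illustration.
avro_schema_str = """
{"type": "record", "name": "example", "fields": [
  {"name": "id", "type": "int"},
  {"name": "name", "type": ["null", "string"]}
]}
"""

# Parse the Avro schema on the JVM side ('spark' is the SparkSession of the shell).
jschema = spark._jvm.org.apache.avro.Schema.Parser().parse(avro_schema_str)

# Call the Scala object through Py4J; toSqlType returns a SchemaType(dataType, nullable).
# Assumes the spark-avro artifact is on the driver classpath.
schema_type = spark._jvm.org.apache.spark.sql.avro.SchemaConverters.toSqlType(jschema)

# Bring the JVM DataType back to Python via its JSON representation.
spark_schema = StructType.fromJson(json.loads(schema_type.dataType().json()))
print(spark_schema)
{code}

Even if something like this works, it goes through an internal handle, so the question remains whether to document it or to expose SchemaConverters as a proper developer API as proposed above.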