[ https://issues.apache.org/jira/browse/SPARK-27124?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16789366#comment-16789366 ]
Hyukjin Kwon commented on SPARK-27124:
--------------------------------------

To me, I am not sure. How would you map an Avro schema in PySpark? This is reachable via Py4J, FWIW. Also, I'm personally skeptical about exposing those as APIs in general if there aren't strong use cases.

> Expose org.apache.spark.sql.avro.SchemaConverters as developer API
> ------------------------------------------------------------------
>
>                 Key: SPARK-27124
>                 URL: https://issues.apache.org/jira/browse/SPARK-27124
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, SQL
>    Affects Versions: 3.0.0
>            Reporter: Gabor Somogyi
>            Priority: Minor
>
> org.apache.spark.sql.avro.SchemaConverters provides extremely useful APIs to
> convert schemas between Spark SQL and Avro. This is reachable from the Scala
> side but not from PySpark. I suggest adding this as a developer API to ease
> development for PySpark users.
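For reference, a minimal sketch of the Py4J route mentioned in the comment above. This assumes a running SparkSession with the spark-avro module on the driver classpath (e.g. started with --packages org.apache.spark:spark-avro_2.12:<version>); the example schema and field names are illustrative only, spark._jvm is an internal attribute, and the visibility of SchemaConverters can vary across Spark versions:

    import json

    from pyspark.sql import SparkSession
    from pyspark.sql.types import StructType

    spark = SparkSession.builder.getOrCreate()
    jvm = spark._jvm  # Py4J gateway into the driver JVM (internal attribute)

    # Illustrative Avro record schema (not taken from the issue).
    avro_schema_json = json.dumps({
        "type": "record",
        "name": "User",
        "fields": [
            {"name": "id", "type": "long"},
            {"name": "name", "type": ["null", "string"], "default": None},
        ],
    })

    # Parse the Avro schema on the JVM side.
    avro_schema = jvm.org.apache.avro.Schema.Parser().parse(avro_schema_json)

    # Call the Scala converter through Py4J; toSqlType returns a SchemaType
    # whose dataType() is the corresponding Catalyst DataType.
    schema_type = jvm.org.apache.spark.sql.avro.SchemaConverters.toSqlType(avro_schema)

    # Round-trip the Catalyst type through its JSON form to get a Python StructType
    # (an Avro record maps to a struct at the top level).
    spark_schema = StructType.fromJson(json.loads(schema_type.dataType().json()))
    print(spark_schema)

This is the kind of workaround the comment refers to; exposing SchemaConverters as a developer API would only formalize the Scala side, while PySpark access would still go through Py4J or a separate Python wrapper.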