[ https://issues.apache.org/jira/browse/SPARK-16401?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15364970#comment-15364970 ]
Apache Spark commented on SPARK-16401:
--------------------------------------

User 'gatorsmile' has created a pull request for this issue:
https://github.com/apache/spark/pull/14075

> Data Source APIs: Extending RelationProvider and CreatableRelationProvider Without SchemaRelationProvider
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16401
>                 URL: https://issues.apache.org/jira/browse/SPARK-16401
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Xiao Li
>            Priority: Critical
>
> When users implement a data source by extending only RelationProvider and
> CreatableRelationProvider, they hit an error when the relation is resolved:
> {noformat}
> spark.read
>   .format("org.apache.spark.sql.test.DefaultSourceWithoutUserSpecifiedSchema")
>   .load()
>   .write
>   .format("org.apache.spark.sql.test.DefaultSourceWithoutUserSpecifiedSchema")
>   .save()
> {noformat}
> The error they hit looks like this:
> {noformat}
> xyzDataSource does not allow user-specified schemas.;
> org.apache.spark.sql.AnalysisException: xyzDataSource does not allow user-specified schemas.;
>   at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:319)
>   at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:494)
>   at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
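
For reference, a provider shaped like the one in the failing scenario above might look like the following. This is a minimal sketch, not the actual test source from the pull request; the class name and the trivial empty-schema relation are assumptions for illustration, and only the two traits named in the issue (RelationProvider and CreatableRelationProvider, without SchemaRelationProvider) are implemented.

```scala
package org.apache.spark.sql.test

import org.apache.spark.sql.{DataFrame, SQLContext, SaveMode}
import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider, RelationProvider}
import org.apache.spark.sql.types.StructType

// Hypothetical data source that extends only RelationProvider and
// CreatableRelationProvider -- no SchemaRelationProvider -- matching
// the shape of the source that triggers the AnalysisException.
class DefaultSourceWithoutUserSpecifiedSchema
  extends RelationProvider with CreatableRelationProvider {

  // A trivial relation with an empty schema, used for both read and write.
  private def emptyRelation(ctx: SQLContext): BaseRelation = new BaseRelation {
    override val sqlContext: SQLContext = ctx
    override val schema: StructType = StructType(Nil)
  }

  // Read path: the source infers its own schema; no user-specified
  // schema is accepted (there is no SchemaRelationProvider overload).
  override def createRelation(
      sqlContext: SQLContext,
      parameters: Map[String, String]): BaseRelation =
    emptyRelation(sqlContext)

  // Write path: invoked by DataFrameWriter.save().
  override def createRelation(
      sqlContext: SQLContext,
      mode: SaveMode,
      parameters: Map[String, String],
      data: DataFrame): BaseRelation =
    emptyRelation(sqlContext)
}
```

With a source of this shape, the read-then-write round trip quoted above should resolve without the "does not allow user-specified schemas" error once the relation-resolution fix lands.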