Creating DataFrames and unioning them looks reasonable.

Thanks,
Wei
On Mon, May 11, 2015 at 6:39 PM, Michael Armbrust <mich...@databricks.com> wrote:

> Ah, yeah sorry. I should have read closer and realized that what you are
> asking for is not supported. It might be possible to add simple coercions
> such as this one, but today, compatible schemas must only add/remove
> columns and cannot change types.
>
> You could try creating different dataframes and unionAll them. Coercions
> should be inserted automatically in that case.
>
> On Mon, May 11, 2015 at 3:37 PM, Wei Yan <ywsk...@gmail.com> wrote:
>
>> Thanks for the reply, Michael.
>>
>> The problem is, if I set "spark.sql.parquet.useDataSourceApi" to true,
>> spark cannot create a DataFrame. The exception shows it "failed to merge
>> incompatible schemas". I think here it means that the "int" schema cannot
>> be merged with the "long" one.
>> Does it mean that the schema merging doesn't support the same field with
>> different types?
>>
>> -Wei
>>
>> On Mon, May 11, 2015 at 3:10 PM, Michael Armbrust <mich...@databricks.com> wrote:
>>
>>>> BTW, I use spark 1.3.1, and already set
>>>> "spark.sql.parquet.useDataSourceApi" to false.
>>>
>>> Schema merging is only supported when this flag is set to true (setting
>>> it to false uses old code that will be removed once the new code is
>>> proven).
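
[For reference, a minimal sketch of the unionAll workaround Michael describes, assuming the Spark 1.3 Scala API in spark-shell (where sc is the SparkContext); the directory paths and the column name "value" are hypothetical, not taken from the thread:]

    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.functions.col

    val sqlContext = new SQLContext(sc)

    // Load each Parquet directory as its own DataFrame instead of relying on
    // schema merging. One directory stores "value" as int, the other as long.
    val dfInt  = sqlContext.parquetFile("hdfs:///data/part_int")
    val dfLong = sqlContext.parquetFile("hdfs:///data/part_long")

    // Align the mismatched column explicitly, then union the two DataFrames.
    // Per Michael's note, unionAll should insert this coercion automatically,
    // so the explicit cast is just a defensive step.
    val merged = dfInt.withColumn("value", col("value").cast("long")).unionAll(dfLong)

    merged.printSchema()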