I am planning to try upgrading Spark SQL to a newer version of Parquet, too.
I'll let you know if I make progress.
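
As an aside, in plain parquet-hadoop the writer version is normally selected through a Hadoop configuration property (`parquet.writer.version`, i.e. `ParquetOutputFormat.WRITER_VERSION`). A minimal sketch of what I'd try first, though given the hard-coded writer noted below I wouldn't assume Spark SQL's write path honors it:

```scala
// Configuration sketch (untested against Spark SQL's writer):
// parquet-hadoop reads "parquet.writer.version" to choose between
// PARQUET_1_0 and PARQUET_2_0 page formats.
sc.hadoopConfiguration.set("parquet.writer.version", "PARQUET_2_0")
```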

Thanks,

Michael


On Oct 8, 2014, at 12:17 PM, Michael Armbrust <mich...@databricks.com> wrote:

> That's a good question; I'm not sure if that will work. I will note that we 
> are hoping to do some upgrades of our Parquet support in the near future.
> 
> On Tue, Oct 7, 2014 at 10:33 PM, Michael Allman <mich...@videoamp.com> wrote:
> Hello,
> 
> I was interested in testing Parquet V2 with Spark SQL, but noticed after some 
> investigation that the Parquet writer that Spark SQL uses is hard-coded to V1 
> here: 
> https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableSupport.scala#L350.
>  Any particular reason Spark SQL is hard-coded to write Parquet V1? Should I 
> expect trouble if I write Parquet V2?
> 
> Cheers,
> 
> Michael
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
> 
