Hi.

I have a DataFrame and I want to insert its data into a partitioned Parquet
table in Hive.

In Spark 1.4 I can use
df.write.partitionBy("x","y").format("parquet").mode("append").saveAsTable("tbl_parquet")

but in Spark 1.3 this writer API doesn't exist. How can I do the same thing there?
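
Is the way to do it in 1.3 something like the sketch below, i.e. registering the
DataFrame as a temp table and doing a dynamic-partition INSERT through HiveContext?
This assumes tbl_parquet already exists as a Parquet table partitioned by (x, y);
the "value" column and the sample rows are just placeholders I made up, and I'm
not sure this is the recommended approach.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("partitioned-insert"))
val hiveContext = new HiveContext(sc)
import hiveContext.implicits._

// Placeholder DataFrame: "value" is the payload, "x" and "y" are the partition columns
val df = sc.parallelize(Seq(("a", 1, 2), ("b", 1, 3))).toDF("value", "x", "y")

// Let Hive derive the partition values for (x, y) from the data itself
hiveContext.sql("SET hive.exec.dynamic.partition = true")
hiveContext.sql("SET hive.exec.dynamic.partition.mode = nonstrict")

// Expose the DataFrame to SQL and append into the partitioned table;
// the partition columns have to come last in the SELECT list
df.registerTempTable("tmp_df")
hiveContext.sql("INSERT INTO TABLE tbl_parquet PARTITION (x, y) SELECT value, x, y FROM tmp_df")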

Thanks

-- 
Regards
Miguel
