Starting from Spark 1.4, you can do this via dynamic partitioning:
sqlContext.table("trade").write.partitionBy("date").parquet("/tmp/path")
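
For context, a slightly fuller sketch of the same idea (column name and output path taken from this thread; this assumes Spark 1.4+ and a "trade" table already registered in the current SQLContext):

```scala
// Load the registered "trade" table as a DataFrame.
val df = sqlContext.table("trade")

// Dynamic partitioning: partitionBy("date") writes one subdirectory per
// distinct value of the "date" column, e.g.
//   /tmp/path/date=2015-08-31/part-r-00000-....parquet
//   /tmp/path/date=2015-09-01/part-r-00000-....parquet
// so no manual groupBy/filter loop is needed to split the data.
df.write.partitionBy("date").parquet("/tmp/path")

// Reading "/tmp/path" back later will rediscover "date" as a partition
// column, and filters on it can prune whole directories.
val reloaded = sqlContext.read.parquet("/tmp/path")
```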
Cheng
On 9/1/15 8:27 AM, gtinside wrote:
Hi,
I have a set of data that I need to group by a specific key and then save
as Parquet. Refer to the code snippet below. I am querying trade and then
grouping by date.
val df = sqlContext.sql("SELECT * FROM trade")
val dfSchema = df.schema
val partitionKeyIndex =