Re: Parquet SaveMode.Append Trouble.

2015-08-04 Thread Cheng Lian

You need to import org.apache.spark.sql.SaveMode.
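
For example, a minimal spark-shell session along these lines should work. This is only a sketch: the paths are copied from your message, and it keeps the three-argument df.save(...) call you already use (deprecated since Spark 1.4 in favour of df.write, but still available):

import org.apache.spark.sql.SaveMode

val df = sqlContext.read.format("parquet").load("/data/LM/Parquet/Segment/pages/part-m-0.gz.parquet")

// SaveMode.Append resolves now that SaveMode is imported
df.save("/data/LM/Parquet/Segment/pages2/part-m-0.gz.parquet", "parquet", SaveMode.Append)

On Spark 1.4 or later, df.write.format("parquet").mode(SaveMode.Append).save(...) is the non-deprecated equivalent.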

Cheng

On 7/31/15 6:26 AM, satyajit vegesna wrote:

Hi,

I am new to Spark and Parquet files.

Below is what I am trying to do in spark-shell:

val df = sqlContext.parquetFile("/data/LM/Parquet/Segment/pages/part-m-0.gz.parquet")


I have also tried the command below:

val df = sqlContext.read.format("parquet").load("/data/LM/Parquet/Segment/pages/part-m-0.gz.parquet")


Now I have another existing Parquet file to which I want to append the data of df.


So I use:

df.save("/data/LM/Parquet/Segment/pages2/part-m-0.gz.parquet", "parquet", SaveMode.Append)


I also tried the command below:

df.save("/data/LM/Parquet/Segment/pages2/part-m-0.gz.parquet", SaveMode.Append)



and it throws the error below:

<console>:26: error: not found: value SaveMode
df.save("/data/LM/Parquet/Segment/pages2/part-m-0.gz.parquet", "parquet", SaveMode.Append)


Please help me if I am doing something wrong here.

Regards,
Satyajit.








