Hi,
I am writing a Spark RDD to an Ignite cache.

If I do ,

val df = ic.fromCache("EntitySettingsCache")
  .sql("select * from entity_settings")
  .withColumn("partitionkey", col("partitionkey") / 2)


ic.fromCache("EntitySettingsCache").saveValues(df.rdd)


the changes are not reflected in the cache,


but if I do:


val rddPair: RDD[(EntitySettingsKey, EntitySettings)] = converRddToPair(df.rdd)


ic.fromCache("EntitySettingsCache").savePairs(rddPair, true)


the above code does reflect the changes in the cache.
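For context, the conversion converRddToPair does is just a per-element value-to-(key, value) mapping. A minimal plain-Scala sketch of that mapping, using hypothetical stand-in case classes since the real EntitySettings fields are not shown in this thread:

```scala
// Hypothetical stand-ins for the real model classes; the actual
// EntitySettings fields are not shown here.
case class EntitySettingsKey(entityId: Long, partitionKey: Long)
case class EntitySettings(entityId: Long, partitionKey: Long, name: String)

// The per-element mapping; with Spark this would be rdd.map(toPair)
// to produce an RDD[(EntitySettingsKey, EntitySettings)] for savePairs.
def toPair(v: EntitySettings): (EntitySettingsKey, EntitySettings) =
  (EntitySettingsKey(v.entityId, v.partitionKey), v)

val values = Seq(EntitySettings(1L, 2L, "a"), EntitySettings(2L, 4L, "b"))
val pairs  = values.map(toPair)
```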


Looking at the implementations of saveValues and savePairs, there is one difference:


streamer.allowOverwrite(overwrite)


This line is missing from saveValues, so the underlying data streamer keeps its default of not overwriting existing cache entries, which would explain the behaviour. Can anyone please help me solve this problem? I don't want to convert the rdd to a rddPair manually
every time.
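Until saveValues supports overwriting, one way to avoid writing the conversion by hand each time is a small generic helper that builds the pairs from a key-extraction function. A sketch of the idea in plain Scala (keyedBy is my own name, not an Ignite or Spark API):

```scala
// Generic helper: pair each value with a key derived from it.
// With Spark the same idea is rdd.map(v => (keyOf(v), v)), after
// which savePairs(pairRdd, true) can overwrite existing entries.
def keyedBy[K, V](values: Seq[V])(keyOf: V => K): Seq[(K, V)] =
  values.map(v => (keyOf(v), v))

val pairs = keyedBy(Seq("alpha", "bravo", "charlie"))(_.length)
```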
