Re: [X-post] Saving SparkSQL result RDD to Cassandra

2015-07-09 Thread Su She
Thanks Todd, this was helpful! I also got some help from the other forum, and for those who might run into this problem in the future, the solution that worked for me was: foreachRDD {r => r.map(x => data(x.getString(0), x.getInt(1))).saveToCassandra("demo", "sqltest")}
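A minimal sketch of that pattern in fuller form, assuming the spark-cassandra-connector is on the classpath, a Cassandra table demo.sqltest with a text column and an int column, and a case class standing in for the `data(...)` class from the snippet. The names Data, word, and total are illustrative, not from the original thread:

import com.datastax.spark.connector._            // adds saveToCassandra to RDDs
import org.apache.spark.sql.Row
import org.apache.spark.streaming.dstream.DStream

// Case class whose fields map onto the Cassandra columns (assumed names).
case class Data(word: String, total: Int)

def saveResults(results: DStream[Row]): Unit = {
  // foreachRDD is an output operation: the save happens inside the closure,
  // on each RDD of SQL result Rows produced by the stream.
  results.foreachRDD { rdd =>
    rdd.map(row => Data(row.getString(0), row.getInt(1)))
       .saveToCassandra("demo", "sqltest")
  }
}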

Re: [X-post] Saving SparkSQL result RDD to Cassandra

2015-07-09 Thread Todd Nist
foreachRDD returns Unit: def foreachRDD(foreachFunc: (RDD[T]) ⇒ Unit): Unit. It applies a function to each RDD in this DStream. This is an output operator, so 'this' DStream will be registered as an output stream and therefore materialized.
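A small sketch illustrating the point about the signature: foreachRDD takes a function RDD[T] => Unit and itself returns Unit, so nothing can be returned from it and any write has to happen as a side effect inside the supplied function. The DStream and the output action below are illustrative placeholders:

import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.DStream

def writeEachBatch(lines: DStream[String]): Unit = {
  val result: Unit = lines.foreachRDD { rdd: RDD[String] =>
    // Perform the output action here (e.g. save to an external store);
    // whatever this block computes is discarded, since the function returns Unit.
    rdd.foreach(line => println(line))
  }
  // `result` is Unit, so there is no RDD or value to chain further work onto.
}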

[X-post] Saving SparkSQL result RDD to Cassandra

2015-07-09 Thread Su She
Hello All, I also posted this on the Spark/Datastax thread, but thought it was also 50% a Spark question (or mostly a Spark question). I was wondering what the best practice is for saving streaming Spark SQL ( https://github.com/Intel-bigdata/spark-streamingsql/blob/master/src/main/scala/org/apach