Thanks Todd, this was helpful! I also got some help from the other forum,
and for those who might run into this problem in the future, the solution
that worked for me was:
foreachRDD { r => r.map(x => data(x.getString(0),
x.getInt(1))).saveToCassandra("demo", "sqltest") }
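For anyone piecing this together later, here is a fuller sketch of the same pattern. The keyspace/table names and the `Data` case class are placeholders standing in for the `data`/`demo`/`sqltest` names above, and `stream` is assumed to be a DStream of Spark SQL `Row`s:

```scala
import com.datastax.spark.connector._
import org.apache.spark.sql.Row
import org.apache.spark.streaming.dstream.DStream

// Hypothetical row type; its fields should match the Cassandra
// table's columns (here: a text column and an int column).
case class Data(name: String, count: Int)

def save(stream: DStream[Row]): Unit =
  stream.foreachRDD { rdd =>
    // Map each Row to the case class, then write the micro-batch
    // out via the Spark Cassandra Connector.
    rdd.map(row => Data(row.getString(0), row.getInt(1)))
       .saveToCassandra("demo", "sqltest") // keyspace, table
  }
```

Note that `foreachRDD` itself returns `Unit`; the side effect is the write performed inside the closure on each micro-batch.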
On Thu, Jul 9, 2015 at 4:37
Hello All,
I also posted this on the Spark/Datastax thread, but thought it was at
least half (if not mostly) a Spark question.
I was wondering what the best practice is for saving streaming Spark SQL
results, given that foreachRDD returns Unit:
def foreachRDD(foreachFunc: (RDD[T]) => Unit): Unit
(RDD: https://spark.apache.org/docs/latest/api/scala/org/apache/spark/rdd/RDD.html)
Apply a function to each RDD in this DStream. This is an output operator,
so 'this' DStream will be registered as an output stream and therefore
materialized.