On Fri, Jun 19, 2015 at 7:33 AM, Koen Vantomme <koen.vanto...@gmail.com> wrote:
> Hello,
>
> I'm trying to read data from a table stored in cassandra with pyspark.
> I found the Scala code to loop through the table:
> "cassandra_rdd.toArray.foreach(println)"
>
> How can this be translated into PySpark?
>
> code snippet:
> sc_cass = CassandraSparkContext(conf=conf)
> cassandra_rdd = sc_cass.cassandraTable("tutorial", "user")
> #cassandra_rdd.toArray.foreach(println)

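The Scala toArray + foreach(println) pattern maps to collect() plus an ordinary Python loop on the driver: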
for row in cassandra_rdd.collect():
  print(row)
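
For a fuller picture, here is a minimal end-to-end sketch. It assumes the pyspark-cassandra package (which provides CassandraSparkContext) is on the path and that Cassandra is reachable at 127.0.0.1; swap in your own contact point, and adjust the keyspace/table if they differ from your snippet:

from pyspark import SparkConf
from pyspark_cassandra import CassandraSparkContext

# Point the connector at a Cassandra contact point (host assumed here).
conf = SparkConf() \
    .setAppName("cassandra-read") \
    .set("spark.cassandra.connection.host", "127.0.0.1")

sc_cass = CassandraSparkContext(conf=conf)
cassandra_rdd = sc_cass.cassandraTable("tutorial", "user")

# collect() pulls the entire table back to the driver; take(n) limits
# the transfer when the table is large.
for row in cassandra_rdd.take(20):
    print(row)

If you really do want every row on the driver, stick with collect() as above, but be aware it materializes the whole table in driver memory.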
>
> Regards,
> Koen
