>>> 16/06/30 10:44:34 ERROR util.Utils: Uncaught exception in thread stdout writer for python
>>> java.lang.AbstractMethodError: pyspark_cassandra.DeferringRowReader.read(Lcom/datastax/driver/core/Row;Lcom/datastax/spark/connector/CassandraRowMetadata;)Ljava/lang/Object;
>> You are trying to call an abstract method. Please check the method DeferringRowReader.read

I do not know how to fix this issue.
I have seen many tutorials around the net, and they make the same calls I am currently making:

from pyspark_cassandra import CassandraSparkContext, Row
from pyspark import SparkContext, SparkConf
from pyspark.sql import SQLContext
conf = SparkConf().setAppName("test") \
    .setMaster("spark://192.168.23.31:7077") \
    .set("spark.cassandra.connection.host", "192.168.23.31")
sc = CassandraSparkContext(conf=conf)
table = sc.cassandraTable("lebara_diameter_codes","nl_lebara_diameter_codes")
food_count = table.select("errorcode2001").groupBy("errorcode2001").count()
food_count.collect()
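
For what it is worth, the result I am expecting from that groupBy/count is just a per-key tally of the distinct values in the "errorcode2001" column. A minimal sketch of the same semantics in plain Python (using a made-up sample of error codes, not my actual table data) would be:

```python
from collections import Counter

# Hypothetical sample of "errorcode2001" values; the real values would
# come from the Cassandra table read above.
rows = ["3001", "3001", "5012", "3001", "5012"]

# groupBy("errorcode2001").count() tallies occurrences per distinct value.
counts = Counter(rows)
print(counts)  # Counter({'3001': 3, '5012': 2})
```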

I am really new to this Spark thing. I was able to configure it correctly and am now learning the API.
