Re: Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-05-16 Thread Nathan Kronenfeld
Serializing the main object isn't going to help here - it's the SparkContext it's complaining about. The problem is that the context is being referenced by the code that gets shipped to the workers: according to the code you sent, computeDwt has a signature of:

    class DWTSample ... { def computDWT (sc: SparkContext, data: ArrayBuffer[(Int, Double)]):
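In other words, the fix is to keep the SparkContext out of anything that runs on the workers. A rough sketch of what that could look like (the DwtJob driver object, the toy data, and the placeholder transform body are assumptions for illustration, not the original poster's code):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.SparkContext._   // pair-RDD implicits on pre-1.3 Spark
    import scala.collection.mutable.ArrayBuffer

    class DWTSample extends Serializable {
      // Worker-side computation takes only the data; the closure that calls it
      // captures this DWTSample instance (serializable), not the SparkContext.
      def computeDwt(data: ArrayBuffer[(Int, Double)]): ArrayBuffer[(Int, Double)] =
        data // placeholder body; the real wavelet transform would go here
    }

    object DwtJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("dwt"))
        val sample = new DWTSample

        // toy data grouped by key, standing in for the real input
        val grouped = sc.parallelize(Seq((1, (1, 2.0)), (1, (2, 3.0)), (2, (1, 4.0))))
          .groupByKey()
          .mapValues(vs => ArrayBuffer(vs.toSeq: _*))

        // only `sample` travels with the task; sc stays on the driver
        grouped.mapValues(data => sample.computeDwt(data)).collect().foreach(println)

        sc.stop()
      }
    }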

Re: Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-05-15 Thread Shivani Rao
This is something that I have bumped into time and again. The object that contains your main() should also be serializable; then you won't have this issue. For example:

    object Test extends Serializable {
      def main() {
        // set up spark context
        // read your data
        // create your RDDs (grouped by key)
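A minimal, self-contained version of that pattern might look like the following (the square helper, the sample data, and the local master setting are illustrative, not from the original thread):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.SparkContext._   // pair-RDD implicits on pre-1.3 Spark

    // The object that holds main() extends Serializable, so anything in it that
    // an RDD closure happens to reference can be shipped to the executors.
    object Test extends Serializable {

      // helper called from inside a closure; referencing it pulls in the
      // enclosing Test object, which is fine because Test is Serializable
      def square(x: Int): Int = x * x

      def main(args: Array[String]): Unit = {
        // set up the spark context (master and app name are placeholders)
        val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("test"))

        // read your data (a made-up range here) and create your RDDs grouped by key
        val grouped = sc.parallelize(1 to 10).map(i => (i % 2, i)).groupByKey()

        // this closure references square, and therefore the Test object itself
        grouped.mapValues(vs => vs.map(square).sum).collect().foreach(println)

        sc.stop()
      }
    }

Note that this only helps when the thing being captured in the closure is your own object; as the reply above points out, it will not help if the closure ends up referencing the SparkContext itself, which is never serializable.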