Error:

org.apache.spark.SparkException: Job failed:
java.io.NotSerializableException: org.apache.spark.SparkContext
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala)
Code (where sc is the SparkContext):

val kk: RDD[(Int, List[Double])] = series.map(t => (t._1, new DWTsample().computeDwt(sc, t._2)))
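The exception occurs because the closure passed to map() is serialized and shipped to the executors, and it captures sc — but SparkContext is not serializable and only exists on the driver. A sketch of one fix, assuming DWTsample's transform can be computed from the local List[Double] alone (its real internals are not shown in the question): drop the SparkContext parameter and mark the class Serializable so nothing non-serializable is captured.

```scala
import org.apache.spark.rdd.RDD

// Hypothetical restructuring: computeDwt operates purely on the local data,
// so the map() closure no longer references sc.
class DWTsample extends Serializable {
  def computeDwt(data: List[Double]): List[Double] = {
    // ... wavelet transform computed locally on each executor ...
    data
  }
}

val kk: RDD[(Int, List[Double])] =
  series.map { case (id, values) => (id, new DWTsample().computeDwt(values)) }
```

If computeDwt genuinely needs cluster-wide state, pass that state in as a broadcast variable rather than the SparkContext itself.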