Hi, I am a newbie in Spark and performed the following steps during a POC:
1. Map each csv file to an object file, after some transformations (done once).
2. Deserialize each object file back into an RDD for operations, as needed.

With 2 csv/object files, the first object file is deserialized into an RDD successfully, but deserializing the second object file raises an error. The error occurs only when spark-shell is restarted between step 1 and step 2. Please suggest how to deserialize both object files.

The code executed on spark-shell is below:

*******************************************
//#1// Start spark-shell; create object files from the csv files

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

case class person(id: Int, name: String, fathername: String, officeid: Int)
val baseperson = sc.textFile("person_csv").flatMap(line => line.split("\n")).map(_.split(","))
baseperson.map(p => person(p(0).trim.toInt, p(1), p(2), p(3).trim.toInt)).saveAsObjectFile("person_obj")

case class office(id: Int, name: String, landmark: String, areacode: String)
val baseoffice = sc.textFile("office_csv").flatMap(line => line.split("\n")).map(_.split(","))
baseoffice.map(p => office(p(0).trim.toInt, p(1), p(2), p(3))).saveAsObjectFile("office_obj")

//#2// Stop spark-shell

//#3// Start spark-shell and read the object files back

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
case class person(id: Int, name: String, fathername: String, officeid: Int)
case class office(id: Int, name: String, landmark: String, areacode: String)
sc.objectFile[person]("person_obj").count   // [OK]
sc.objectFile[office]("office_obj").count   // *[FAILS]*
*******************************************

The stack trace is attached: stacktrace.txt <http://apache-spark-user-list.1001560.n3.nabble.com/file/n20334/stacktrace.txt>

rahul@...

Regards,
Rahul

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/serialization-issue-in-case-of-case-class-is-more-than-1-tp20334.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
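For context on what step 2 depends on: saveAsObjectFile/objectFile use plain Java serialization, which requires a class with the exact same fully qualified name to be loadable when reading back. Case classes defined in spark-shell are compiled under session-specific synthetic wrapper names (e.g. $line3.$read$...$person), which may be why a restarted shell can read one object file but not the other. A minimal sketch of the round trip outside Spark, assuming nothing about the code above (Person here is a hypothetical stand-in, not the person class from the shell session):

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

// Hypothetical stand-in for a REPL-defined case class.
// Case classes are Serializable by default.
case class Person(id: Int, name: String)

// Round-trip a value through Java serialization, the mechanism
// that saveAsObjectFile / objectFile rely on. Deserialization
// looks the class up by its fully qualified name, so it only
// succeeds if the identical class is on the classpath.
def roundTrip[T](value: T): T = {
  val buf = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buf)
  out.writeObject(value)
  out.close()
  val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
  in.readObject().asInstanceOf[T]
}

// Within a single JVM/session the round trip succeeds;
// across spark-shell restarts the REPL wrapper names can differ.
val restored = roundTrip(Person(1, "a"))
```

Defining the case classes in a small compiled jar added to the shell's classpath, instead of inside the REPL, gives them stable names across sessions.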