Hi Spark experts, Is there a way to convert a JavaSchemaRDD (for instance, one loaded from a Parquet file) back to a JavaRDD of a given case class? I read on Stack Overflow [1] that I could do a select over the Parquet file and then pull the fields out by reflection, but that seems like overkill. I also saw [2], from 2014, which says this feature would be available in the future. Could you please let me know how I can accomplish this? Thanks in advance!
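For what it's worth, the only approach I've come up with so far is mapping over the generic Rows by hand and rebuilding the objects by field position. A minimal sketch of that workaround, assuming a hypothetical Person class (standing in for the case class) with name and age columns in that order:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.api.java.JavaSQLContext;
    import org.apache.spark.sql.api.java.JavaSchemaRDD;

    public class ParquetToRdd {
        // Hypothetical target class; must be Serializable so it can
        // be shipped to the executors.
        public static class Person implements java.io.Serializable {
            public final String name;
            public final int age;
            public Person(String name, int age) {
                this.name = name;
                this.age = age;
            }
        }

        public static void main(String[] args) {
            JavaSparkContext sc =
                new JavaSparkContext(new SparkConf().setAppName("ParquetToRdd"));
            JavaSQLContext sqlContext = new JavaSQLContext(sc);

            // Load the Parquet file as a JavaSchemaRDD of generic Rows.
            JavaSchemaRDD schemaRdd = sqlContext.parquetFile("people.parquet");

            // Rebuild each object from its Row by field position
            // (assumed here to be name, age).
            JavaRDD<Person> people = schemaRdd.map(row ->
                new Person(row.getString(0), row.getInt(1)));

            System.out.println(people.count());
            sc.stop();
        }
    }

This works, but it has to be kept in sync with the schema by hand, which is exactly what I was hoping to avoid.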
Renato M.

[1] http://stackoverflow.com/questions/26181353/how-to-convert-spark-schemardd-into-rdd-of-my-case-class
[2] http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Convert-SchemaRDD-back-to-RDD-td9071.html