Hi, I want to write a new RDD. For testing, I copied and pasted HadoopRDD.scala into a new file, newRDD.scala, making the appropriate replacements of "HadoopRDD". It compiles fine at this stage. But when I create a newRDD() in SparkContext, the compiler gives me a "type not found" error:
    .../incubator-spark/core/src/main/scala/org/apache/spark/SparkContext.scala:413: not found: type newRDD
        new newRDD(
            ^

I must be missing something trivial here. Any ideas? Thanks!
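For reference, here is a minimal sketch of the setup I'm describing. The class body is a stripped-down placeholder, not the copied HadoopRDD code, and the package is an assumption based on where HadoopRDD lives:

```scala
// NewRDD.scala -- hypothetical minimal RDD subclass, placed in the same
// package as HadoopRDD (org.apache.spark.rdd) so that SparkContext, which
// lives in org.apache.spark, would still need to import or fully qualify it.
package org.apache.spark.rdd

import org.apache.spark.{Partition, SparkContext, TaskContext}

// Note: Scala convention is UpperCamelCase for class names; a lowercase
// `newRDD` compiles, but is easy to mismatch elsewhere since identifiers
// are case-sensitive.
class NewRDD(sc: SparkContext) extends RDD[String](sc, Nil) {

  // Placeholder partition computation: yields no records.
  override def compute(split: Partition, context: TaskContext): Iterator[String] =
    Iterator.empty

  // Placeholder partition list: no partitions.
  override protected def getPartitions: Array[Partition] = Array.empty
}
```

If the new class ended up in a different package than SparkContext, the reference in SparkContext.scala would presumably need an import or a fully qualified name, e.g. `new org.apache.spark.rdd.NewRDD(this)` -- but I'm not sure whether that's the issue here.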