Hi,

I have a question about Spark. Running this program in spark-shell:

val filerdd = sc.textFile("NOTICE", 2)   // read the NOTICE file into an RDD with 2 partitions
val maprdd = filerdd.map( word => filerdd.map( word2 => word2 + word ) )   // references filerdd inside its own map
maprdd.collect()

throws a NullPointerException.
Can somebody explain why I cannot have a nested RDD operation (one RDD referenced inside another RDD's transformation)?
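
For what it's worth, I can get a result without nesting by using cartesian. This is only a sketch, assuming my goal is to concatenate every pair of lines, but I'd still like to understand why the nested version fails:

// same computation without a nested RDD, using cartesian
val filerdd = sc.textFile("NOTICE", 2)
val pairs  = filerdd.cartesian(filerdd)             // RDD[(String, String)] of all line pairs
val concat = pairs.map { case (w, w2) => w2 + w }   // same concatenation as the nested version
concat.collect()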

--pavlos
