In writing my own RDD I ran into a few issues with respect to things being private in Spark.

In compute I would like to return an iterator that respects task killing (as HadoopRDD does), but the mechanics for that are inside the private InterruptibleIterator. Also the exception I am supposed
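For context, the behavior being asked about can be replicated in user code even while InterruptibleIterator itself is private: wrap the delegate iterator and check a kill flag on every hasNext. This is only a minimal sketch of that pattern, not Spark's actual class; the KillFlag stand-in, TaskKilledException, and all names here are illustrative (real code would consult Spark's TaskContext from within compute):

```scala
// Stand-in for the per-task kill signal (assumption: in real Spark code
// this information would come from the TaskContext passed to compute).
class KillFlag {
  @volatile var killed: Boolean = false
}

class TaskKilledException extends RuntimeException("task killed")

// Sketch of the InterruptibleIterator idea: consult the kill flag on
// every hasNext so a killed task stops consuming the delegate early.
class InterruptibleLikeIterator[T](flag: KillFlag, delegate: Iterator[T])
    extends Iterator[T] {
  override def hasNext: Boolean = {
    if (flag.killed) throw new TaskKilledException
    delegate.hasNext
  }
  override def next(): T = delegate.next()
}
```

A custom RDD's compute could then return `new InterruptibleLikeIterator(flag, underlying)` instead of the raw iterator, so long-running scans notice the kill signal between elements.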
resending... my email somehow never made it to the user list.
On Fri, May 9, 2014 at 2:11 PM, Koert Kuipers ko...@tresata.com wrote: