Re: method newAPIHadoopFile

2015-02-25 Thread patcharee
I tried:

    val pairVarOriRDD = sc.newAPIHadoopFile(
      path,
      classOf[NetCDFFileInputFormat].asSubclass(
        classOf[org.apache.hadoop.mapreduce.lib.input.FileInputFormat[WRFIndex, WRFVariable]]),
      classOf[WRFIndex],
      classOf[WRFVariable],
      jobConf)

The compiler does not

Re: method newAPIHadoopFile

2015-02-25 Thread patcharee
This is the declaration of my custom InputFormat:

    public class NetCDFFileInputFormat extends ArrayBasedFileInputFormat

    public abstract class ArrayBasedFileInputFormat
        extends org.apache.hadoop.mapreduce.lib.input.FileInputFormat

Best, Patcharee

On 25. feb. 2015 10:15, patcharee wrote: Hi,

Re: method newAPIHadoopFile

2015-02-25 Thread Sean Owen
OK, from the declaration you sent me separately:

    public class NetCDFFileInputFormat extends ArrayBasedFileInputFormat

    public abstract class ArrayBasedFileInputFormat
        extends org.apache.hadoop.mapreduce.lib.input.FileInputFormat

It looks like you do not declare any generic types that
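The diagnosis above is about Scala type inference: Spark's Scala-side newAPIHadoopFile is declared roughly as def newAPIHadoopFile[K, V, F <: FileInputFormat[K, V]](...), so the compiler can only infer the key/value types if the InputFormat subclass still carries them as type parameters. A minimal self-contained sketch of that constraint (plain Scala, no Hadoop or Spark on the classpath; every class and method name here is an illustrative stand-in, not the real API):

```scala
// Stand-in for org.apache.hadoop.mapreduce.lib.input.FileInputFormat[K, V].
abstract class FileInputFormatLike[K, V]

// The fix suggested on the thread, mirrored here: keep the type parameters
// on the subclass so F <: FileInputFormatLike[K, V] can be satisfied.
// (A subclass that extends the base without parameters, as in the quoted
// declarations, gives the compiler nothing to infer K and V from.)
class TypedFormat[K, V] extends FileInputFormatLike[K, V]

// Stand-in for the shape of SparkContext.newAPIHadoopFile: it requires a
// Class[F] where F is a subtype of the *parameterized* format type.
def newAPIHadoopFileLike[K, V, F <: FileInputFormatLike[K, V]](
    path: String,
    fClass: Class[F],
    kClass: Class[K],
    vClass: Class[V]): String =
  s"$path via ${fClass.getSimpleName}"

// With the parameterized subclass, the call compiles and K, V are inferred.
val desc = newAPIHadoopFileLike(
  "/data/wrf.nc",
  classOf[TypedFormat[String, String]],
  classOf[String],
  classOf[String])
```

If the thread's diagnosis holds, the analogous fix in the real code is to declare ArrayBasedFileInputFormat (and NetCDFFileInputFormat) with the FileInputFormat key/value type parameters, after which the asSubclass cast in the first message should no longer be needed.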

method newAPIHadoopFile

2015-02-25 Thread patcharee
Hi, I am new to Spark and Scala. I have a custom InputFormat (used before with MapReduce) and I am trying to use it in Spark. In the Java API (the syntax is correct):

    JavaPairRDD<WRFIndex, WRFVariable> pairVarOriRDD = sc.newAPIHadoopFile(
        path, NetCDFFileInputFormat.class,