I tried
val pairVarOriRDD = sc.newAPIHadoopFile(path,
classOf[NetCDFFileInputFormat].asSubclass(
classOf[org.apache.hadoop.mapreduce.lib.input.FileInputFormat[WRFIndex,WRFVariable]]),
classOf[WRFIndex],
classOf[WRFVariable],
jobConf)
The compiler does not accept this call.
This is the declaration of my custom InputFormat:
public class NetCDFFileInputFormat extends ArrayBasedFileInputFormat
public abstract class ArrayBasedFileInputFormat extends
org.apache.hadoop.mapreduce.lib.input.FileInputFormat
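The declarations above extend the raw (non-generic) FileInputFormat, which is likely why the cast in the Scala snippet fails to typecheck. Below is a minimal, self-contained sketch — the WRF/NetCDF class names are taken from the thread, but the bodies are empty stand-ins, not the real Hadoop or NetCDF classes — showing the hierarchy with the key/value type parameters declared, so the asSubclass conversion that newAPIHadoopFile relies on goes through:

```java
// Stand-in classes (assumption: empty bodies, not the real Hadoop/NetCDF types)
// illustrating why the key/value type parameters should appear in the hierarchy.
class WRFIndex {}
class WRFVariable {}

// Plays the role of org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K, V>.
abstract class FileInputFormat<K, V> {}

// Declare the key/value types here instead of extending the raw supertype.
abstract class ArrayBasedFileInputFormat
        extends FileInputFormat<WRFIndex, WRFVariable> {}

class NetCDFFileInputFormat extends ArrayBasedFileInputFormat {}

public class GenericsDemo {
    public static void main(String[] args) {
        // With the parameters declared, the asSubclass conversion used when
        // passing the class to newAPIHadoopFile succeeds:
        Class<? extends FileInputFormat> fmt =
                NetCDFFileInputFormat.class.asSubclass(FileInputFormat.class);
        System.out.println(fmt.getSimpleName());  // prints NetCDFFileInputFormat
    }
}
```

The analogous change in the real code would be declaring `extends FileInputFormat<WRFIndex, WRFVariable>` in ArrayBasedFileInputFormat, so the Scala side can line up the key and value classes with the InputFormat's type parameters.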
Best,
Patcharee
On 25. feb. 2015 10:15, patcharee wrote:
Hi,
OK, from the declaration you sent me separately:
public class NetCDFFileInputFormat extends ArrayBasedFileInputFormat
public abstract class ArrayBasedFileInputFormat extends
org.apache.hadoop.mapreduce.lib.input.FileInputFormat
It looks like you do not declare any generic types that your InputFormat produces. Because ArrayBasedFileInputFormat extends the raw FileInputFormat, the cast to FileInputFormat[WRFIndex,WRFVariable] probably cannot typecheck; declaring the key/value type parameters in the class hierarchy should fix it.
Hi,
I am new to Spark and Scala. I have a custom InputFormat (used before
with MapReduce) and I am trying to use it in Spark.
In the Java API (where this syntax is correct):
JavaPairRDD<WRFIndex, WRFVariable> pairVarOriRDD = sc.newAPIHadoopFile(
path,
NetCDFFileInputFormat.class,