You could take a look at sc.newAPIHadoopRDD(), which accepts InputFormats from the new org.apache.hadoop.mapreduce package.
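
As a rough sketch (not a definitive recipe), reading a text file through the new-API TextInputFormat would look something like the snippet below; it assumes an existing SparkContext `sc`, and the HDFS path and input-dir config key are just illustrative placeholders for whatever your own InputFormat expects:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// Hadoop Configuration instead of the old mapred JobConf;
// the input path here is made up for the example
val conf = new Configuration()
conf.set("mapreduce.input.fileinputformat.inputdir", "hdfs:///data/input")

// key, value and InputFormat classes all come from the
// org.apache.hadoop.mapreduce (new API) package
val rdd = sc.newAPIHadoopRDD(
  conf,
  classOf[TextInputFormat],
  classOf[LongWritable],
  classOf[Text])

rdd.map { case (_, line) => line.toString }.take(5).foreach(println)

For file-based formats there is also sc.newAPIHadoopFile(path, fClass, kClass, vClass), which takes the path directly instead of going through a Configuration.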

On Nov 5, 2014, at 9:29 AM, Corey Nolet <cjno...@gmail.com> wrote:

> I'm fairly new to Spark and I'm trying to kick the tires with a few 
> InputFormats. I noticed the sc.hadoopRDD() method takes a mapred JobConf 
> instead of a MapReduce Job object. Is there future planned support for the 
> mapreduce package?
> 

