I would say change
class RawDataInputFormat[LW <: LongWritable, RD <: RDRawDataRecord]
extends FileInputFormat
to
class RawDataInputFormat[LongWritable, RDRawDataRecord] extends FileInputFormat
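One caveat with that spelling: in Scala, the names inside the square brackets of a class definition declare new type parameters, they do not refer to the existing classes. So `class RawDataInputFormat[LongWritable, RDRawDataRecord]` is the same as `class RawDataInputFormat[A, B]`, with the Hadoop names merely shadowed. A minimal sketch (with stand-in classes, so it compiles without the Hadoop jars) shows the pitfall:

```scala
// Stand-ins for the real Hadoop/user types -- not the actual classes.
class LongWritable { def get: Long = 0L }
class RDRawDataRecord

// CAUTION: the bracketed names below declare FRESH type parameters that
// shadow the classes above; this is equivalent to `class Demo[A, B]`.
class Demo[LongWritable, RDRawDataRecord] {
  def describe(k: LongWritable): String = k.toString
}

// The "LongWritable" slot therefore accepts any type at all:
val demo = new Demo[String, Int]
println(demo.describe("not a LongWritable"))
```

If the intent is to fix the key/value types, write them in the `extends` clause instead of declaring type parameters.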
On Thu, Mar 17, 2016 at 9:48 AM, Mich Talebzadeh wrote:
> Hi Tony,
>
Sorry for the late reply. Yes, RDRawDataRecord is my own class. It is defined in
another Java project (a jar) that I pull in with Maven. My MapReduce program
also uses it and works.
On Fri, Mar 18, 2016 at 12:48 AM, Mich Talebzadeh wrote:
> Hi Tony,
>
> Is
>
>
Doesn't FileInputFormat require type parameters? Like so:
class RawDataInputFormat[LW <: LongWritable, RD <: RDRawDataRecord]
extends FileInputFormat[LW, RD]
I haven't verified this but it could be related to the compile error
you're getting.
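To make the suggestion concrete, here is a minimal sketch of passing the type parameters through to the superclass, and of the simpler alternative that fixes the concrete types directly. The `FileInputFormat`, `LongWritable`, and `RDRawDataRecord` below are stand-ins modeled on the Hadoop/user types, so the sketch compiles without the Hadoop jars:

```scala
// Stand-ins for the real types -- only the type-parameter plumbing matters here.
class LongWritable
class RDRawDataRecord
abstract class FileInputFormat[K, V] {
  def describeTypes(): String
}

// Benyi's suggestion: forward the bounded type parameters to the superclass.
class RawDataInputFormat[LW <: LongWritable, RD <: RDRawDataRecord]
    extends FileInputFormat[LW, RD] {
  def describeTypes(): String = "FileInputFormat[LW, RD]"
}

// Often simpler: fix the concrete types and drop the bounds entirely.
class SimpleRawDataInputFormat
    extends FileInputFormat[LongWritable, RDRawDataRecord] {
  def describeTypes(): String = "FileInputFormat[LongWritable, RDRawDataRecord]"
}
```

Unless callers genuinely need to vary the key/value types, the second form avoids the variance headaches that the bounded parameters can introduce.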
On Thu, Mar 17, 2016 at 9:53 AM, Benyi Wang wrote:
Hi Tony,
Is
com.kiisoo.aegis.bd.common.hdfs.RDRawDataRecord
One of your own packages?
Sounds like it is the one throwing the error.
HTH
Dr Mich Talebzadeh
LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
I also tried that before, but in the RawReader.next(key, value) method,
invoking reader.next gives an error. It says: Type Mismatch.
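One possible cause of that mismatch (an assumption, since the RawReader code isn't shown here): with bounded parameters like `LW <: LongWritable`, the reader cannot produce values of type `LW`, because a plain `new LongWritable` is a `LongWritable`, not an `LW`. A sketch with stand-in types, modeled on the old-API `RecordReader` contract:

```scala
// Stand-ins so this compiles without Hadoop on the classpath.
class LongWritable
class RDRawDataRecord
// A minimal model of the old-API RecordReader contract.
trait RecordReader[K, V] {
  def createKey(): K
  def next(key: K, value: V): Boolean
}

// With a bound like K <: LongWritable, the reader cannot implement
// createKey(): `new LongWritable` is not a K, so the compiler rejects it:
//
//   class BadReader[K <: LongWritable, V] extends RecordReader[K, V] {
//     def createKey(): K = new LongWritable   // error: type mismatch
//     ...
//   }
//
// Using the concrete types avoids the problem:
class RawReader extends RecordReader[LongWritable, RDRawDataRecord] {
  def createKey(): LongWritable = new LongWritable
  def next(key: LongWritable, value: RDRawDataRecord): Boolean = false
}
```

If the actual RawReader uses the bounded type parameters, switching it (and the input format) to the concrete `LongWritable`/`RDRawDataRecord` types may resolve the mismatch.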
On Fri, Mar 18, 2016 at 12:53 AM, Benyi Wang wrote:
> I would say change
>
> class RawDataInputFormat[LW <: LongWritable, RD <: RDRawDataRecord]
Hi,
My HDFS file is stored with custom data structures. I want to read it
with a SparkContext object, so I defined an input format:
*1. code of RawDataInputFormat.scala*
import com.kiisoo.aegis.bd.common.hdfs.RDRawDataRecord
import org.apache.hadoop.io.LongWritable
import