Looking closer at the code you posted, the error was likely caused by the
third parameter: Void.class

It is supposed to be the class of the key that MyInputFormat actually emits.
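For example, if MyInputFormat declares org.apache.hadoop.io.NullWritable as
its key type (just a guess on my part -- substitute whatever key class your
InputFormat is actually parameterized with), the call would look roughly
like this:

    JavaPairRDD<NullWritable, MyRecordWritable> myFormatAsPairRdd =
        jsc.hadoopFile("hdfs://tmp/data/myformat.xyz",
            MyInputFormat.class,       // your InputFormat (old mapred API)
            NullWritable.class,        // key class the InputFormat really emits
            MyRecordWritable.class);   // value class

    JavaRDD<MyRecordWritable> myformatRdd = myFormatAsPairRdd.values();
    DataFrame myFormatAsDataframe =
        sqlContext.createDataFrame(myformatRdd, MyFormatSchema.class);
    myFormatAsDataframe.show();

Whatever you pass for the key and value classes has to match what the
InputFormat itself produces, otherwise the runtime types won't line up.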

FYI

On Fri, Jul 31, 2015 at 11:24 AM, unk1102 <umesh.ka...@gmail.com> wrote:

> Hi I am having my own Hadoop custom InputFormat which I need to use in
> creating DataFrame. I tried to do the following
>
> JavaPairRDD<Void, MyRecordWritable> myFormatAsPairRdd =
>     jsc.hadoopFile("hdfs://tmp/data/myformat.xyz",
>         MyInputFormat.class, Void.class, MyRecordWritable.class);
> JavaRDD<MyRecordWritable> myformatRdd = myFormatAsPairRdd.values();
> DataFrame myFormatAsDataframe =
>     sqlContext.createDataFrame(myformatRdd, MyFormatSchema.class);
> myFormatAsDataframe.show();
>
> The above code does not work and throws an exception:
> java.lang.IllegalArgumentException: object is not an instance of declaring
> class
>
> My custom Hadoop InputFormat works very well with Hive, MapReduce, etc. How
> do I make it work with Spark? Please guide me, I am new to Spark. Thanks in
> advance.
