Xiangrui is correct that it must be a JavaBean; also, nested classes are
not yet supported in Java.
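For reference, a minimal sketch of what a JavaBean-style Instrument could look like, with the nested Issue/Issuer/Issuing fields flattened into simple properties so schema inference can map each getter to a SQL type. The field names here are hypothetical, not taken from Naveen's actual classes:

```java
import java.io.Serializable;

// Hypothetical flattened Instrument bean. Spark SQL infers the schema from
// getter/setter pairs, so every property is a simple supported type rather
// than a nested Issue/Issuer/Issuing object.
public class Instrument implements Serializable {
    private String issueId;
    private String issuerName;
    private double issuingPrice;

    // JavaBeans require a public no-arg constructor.
    public Instrument() {}

    public String getIssueId() { return issueId; }
    public void setIssueId(String issueId) { this.issueId = issueId; }

    public String getIssuerName() { return issuerName; }
    public void setIssuerName(String issuerName) { this.issuerName = issuerName; }

    public double getIssuingPrice() { return issuingPrice; }
    public void setIssuingPrice(double issuingPrice) { this.issuingPrice = issuingPrice; }
}
```

With a bean like this, applySchema should be able to derive the schema, since each property maps directly to a supported SQL type instead of hitting the MatchError on a nested class.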

On Tue, Nov 11, 2014 at 10:11 AM, Xiangrui Meng <men...@gmail.com> wrote:

> I think you need a Java bean class instead of a normal class. See
> example here:
> http://spark.apache.org/docs/1.1.0/sql-programming-guide.html
> (switch to the java tab). -Xiangrui
>
> On Tue, Nov 11, 2014 at 7:18 AM, Naveen Kumar Pokala
> <npok...@spcapitaliq.com> wrote:
> > Hi,
> >
> > This is my Instrument java constructor:
> >
> > public Instrument(Issue issue, Issuer issuer, Issuing issuing) {
> >     super();
> >     this.issue = issue;
> >     this.issuer = issuer;
> >     this.issuing = issuing;
> > }
> >
> > I am trying to create a JavaSchemaRDD:
> >
> > JavaSchemaRDD schemaInstruments = sqlCtx.applySchema(distData,
> > Instrument.class);
> >
> > Remarks:
> > ============
> >
> > Instrument, Issue, Issuer, and Issuing are all Java classes.
> >
> > distData is holding List< Instrument >.
> >
> > I am getting the following error:
> >
> > Exception in thread "Driver" java.lang.reflect.InvocationTargetException
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:483)
> >         at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162)
> > Caused by: scala.MatchError: class sample.spark.test.Issue (of class java.lang.Class)
> >         at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:189)
> >         at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:188)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> >         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> >         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
> >         at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
> >         at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
> >         at org.apache.spark.sql.api.java.JavaSQLContext.getSchema(JavaSQLContext.scala:188)
> >         at org.apache.spark.sql.api.java.JavaSQLContext.applySchema(JavaSQLContext.scala:90)
> >         at sample.spark.test.SparkJob.main(SparkJob.java:33)
> >         ... 5 more
> >
> >
> >
> > Please help me.
> >
> >
> >
> > Regards,
> >
> > Naveen.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
