Hi Zongheng,

Thanks a lot for your reply.

I was editing my code in my group project and forgot to remove the package
declaration... How silly!
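
For anyone who hits the same ClassNotFoundException: if the source file has a
package declaration, spark-submit needs the fully qualified class name, not
just the simple name. A minimal sketch of the mistake (the package name here
is only an example):

// SimpleApp.scala -- leftover package declaration from the group project
package com.example.groupproject  // hypothetical package name

object SimpleApp {
  def main(args: Array[String]) {
    // application code as in the thread below
  }
}

With that declaration, the class has to be submitted as
--class "com.example.groupproject.SimpleApp" instead of --class "SimpleApp".
Running "jar tf product-interface-test_2.10-1.0.jar" also lists the entries
with their full package paths, which makes the mismatch easy to spot.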

Regards,
Haoming

> Date: Thu, 10 Jul 2014 12:00:40 -0700
> Subject: Re: SPARKSQL problem with implementing Scala's Product interface
> From: zonghen...@gmail.com
> To: user@spark.apache.org
> 
> Hi Haoming,
> 
> For your spark-submit question: can you try using an assembly jar
> ("sbt/sbt assembly" will build it for you)? Another thing to check is
> whether there is any package structure around your SimpleApp; if so,
> you should include the fully qualified (hierarchical) name.
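> 
> For example (names and paths are illustrative; adjust for your own build):
> 
> sbt/sbt assembly
> ./bin/spark-submit --class "SimpleApp" --master local \
>   target/scala-2.10/your-app-assembly-1.0.jar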
> 
> Zongheng
> 
> On Thu, Jul 10, 2014 at 11:33 AM, Haoming Zhang
> <haoming.zh...@outlook.com> wrote:
> > Hi Yadid,
> >
> > I have the same problem as you, so I implemented the Product interface as
> > well; my code is very similar to yours. But now I face another problem:
> > I don't know how to run the code. My whole program looks like this:
> >
> > import org.apache.spark.{SparkConf, SparkContext}
> > import org.apache.spark.sql.SQLContext
> >
> > object SimpleApp {
> >
> >   // A plain class implementing Product, since case classes in Scala 2.10
> >   // are limited to 22 fields.
> >   class Record(val x1: String, val x2: String, val x3: String, ... val x24:
> > String) extends Product with Serializable {
> >     def canEqual(that: Any) = that.isInstanceOf[Record]
> >
> >     def productArity = 24
> >
> >     def productElement(n: Int) = n match {
> >       case 0 => x1
> >       case 1 => x2
> >       case 2 => x3
> >       ...
> >       case 23 => x24
> >     }
> >   }
> >
> >   def main(args: Array[String]) {
> >
> >     val conf = new SparkConf().setAppName("Product Test")
> >     val sc = new SparkContext(conf)
> >     val sqlContext = new SQLContext(sc)
> >
> >     val record = new Record("a", "b", "c", "d", "e", "f", "g", "h", "i",
> > "j", "k", "l", "m", "n", "o", "p", "q", "r", "s", "t", "u", "v", "w", "x")
> >
> >     // brings the createSchemaRDD implicit into scope so an RDD of
> >     // Products can be registered as a table
> >     import sqlContext._
> >     sc.parallelize(record :: Nil).registerAsTable("records")
> >
> >     sql("SELECT x1 FROM records").collect()
> >   }
> > }
> >
> > I tried to run the above program with spark-submit:
> > ./spark-submit --class "SimpleApp" --master local
> > /playground/ProductInterface/target/scala-2.10/classes/product-interface-test_2.10-1.0.jar
> >
> > But I always get this exception: "Exception in thread "main"
> > java.lang.ClassNotFoundException: SimpleApp".
> >
> > So could you please share how you run your test program? I can see there
> > is a SimpleApp.class in the classes folder, but I don't understand why
> > spark-submit cannot find it.
> >
> > Best,
> > Haoming
> >
> >> Date: Thu, 10 Jul 2014 09:02:18 -0700
> >> From: ya...@media.mit.edu
> >> To: u...@spark.incubator.apache.org
> >> Subject: SPARKSQL problem with implementing Scala's Product interface
> >
> >>
> >> Hi All,
> >>
> >> I have a class with too many fields to be implemented as a case class
> >> (case classes in Scala 2.10 are limited to 22 fields), therefore I am
> >> using a regular class that implements Scala's Product interface,
> >> like so:
> >>
> >> class Info extends Product with Serializable {
> >>   var param1 : String = ""
> >>   var param2 : String = ""
> >>   ...
> >>   var param38: String = ""
> >>
> >>   def canEqual(that: Any) = that.isInstanceOf[Info]
> >>   def productArity = 38
> >>   def productElement(n: Int) = n match {
> >>     case 0 => param1
> >>     case 1 => param2
> >>     ...
> >>     case 37 => param38
> >>   }
> >> }
> >>
> >> After registering the table as "info", executing "SELECT * FROM info"
> >> gives the expected result. However, when I execute "SELECT param1,
> >> param2 FROM info" I get the following exception:
> >> Loss was due to
> >> org.apache.spark.sql.catalyst.errors.package$TreeNodeException: No
> >> function
> >> to evaluate expression. type: UnresolvedAttribute, tree: 'param1
> >>
> >> I guess I must be missing a method in the implementation. Any pointers
> >> appreciated.
> >>
> >> Yadid
> >>
> >> --
> >> View this message in context:
> >> http://apache-spark-user-list.1001560.n3.nabble.com/SPARKSQL-problem-with-implementing-Scala-s-Product-interface-tp9311.html
> >> Sent from the Apache Spark User List mailing list archive at Nabble.com.