I believe your class needs to be defined as a case class (as I answered on SO).
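For reference, the alternative the SO question attempts — a plain class implementing Product by hand — looks roughly like this. This is a minimal sketch in plain Scala (no Spark needed); the class and field names are made up for illustration:

```scala
// A plain class implementing Product manually, as an alternative to a
// case class. Product (via Equals) requires canEqual, productArity and
// productElement; each must agree on the number and order of fields.
class Record(val id: Int, val name: String, val score: Double)
    extends Product with Serializable {

  def canEqual(that: Any): Boolean = that.isInstanceOf[Record]

  def productArity: Int = 3

  def productElement(n: Int): Any = n match {
    case 0 => id
    case 1 => name
    case 2 => score
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
}

val r = new Record(1, "a", 0.5)
println(r.productArity)      // 3
println(r.productElement(1)) // a
```

The usual pitfall is a mismatch between `productArity` and the cases handled in `productElement`, or forgetting `canEqual` — a case class generates all three consistently, which is why it is the safer choice when it fits.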


On 25.2.2015. 5:15, anamika gupta wrote:
Hi Akhil

I guess it skipped my attention. I will definitely give it a try.

That said, I would still like to know what the issue is with the way I have created the schema.

On Tue, Feb 24, 2015 at 4:35 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:

    Did you happen to have a look at
    https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema ?
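    The programmatic-schema approach from that guide avoids case classes (and the
    22-field limit) entirely. A minimal sketch, assuming a spark-shell with `sc`
    and `sqlContext` in scope (Spark 1.3+ API); the file path and column names
    below are hypothetical:

    ```scala
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

    // Build the schema as data -- a StructType can hold any number of
    // fields, so the 22-field case-class limit does not apply here.
    val schema = StructType(Seq(
      StructField("id", IntegerType, nullable = false),
      StructField("name", StringType, nullable = true)
      // ...one StructField per column, as many as needed
    ))

    // Turn each input line into a Row whose values match the schema,
    // in the same order as the StructFields above.
    val rowRDD = sc.textFile("records.txt")
      .map(_.split(","))
      .map(p => Row(p(0).trim.toInt, p(1).trim))

    val df = sqlContext.createDataFrame(rowRDD, schema)
    df.registerTempTable("records")
    ```

    The key constraint is that each Row must line up field-for-field, type-for-type,
    with the StructType, since nothing checks this at compile time.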

    Thanks
    Best Regards

    On Tue, Feb 24, 2015 at 3:39 PM, anu <anamika.guo...@gmail.com> wrote:

        My issue is posted here on Stack Overflow. What am I doing
        wrong here?

        http://stackoverflow.com/questions/28689186/facing-error-while-extending-scala-class-with-product-interface-to-overcome-limi

        ------------------------------------------------------------------------
        View this message in context: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell
        <http://apache-spark-user-list.1001560.n3.nabble.com/Facing-error-while-extending-scala-class-with-Product-interface-to-overcome-limit-of-22-fields-in-spl-tp21787.html>
        Sent from the Apache Spark User List mailing list archive at Nabble.com.



