The link has proved helpful. I have been able to load the data, register it
as a table, and run simple queries. Thanks, Akhil!
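
For the archives, here is roughly the pattern I ended up with, adapted from
that guide. The file path and column names below are made-up placeholders,
and this uses the Spark 1.2 API (applySchema was renamed createDataFrame in
1.3):

import org.apache.spark.sql._

val sqlContext = new SQLContext(sc)   // sc is provided by spark-shell

// Build the schema as a StructType instead of a case class, so there is
// no 22-field ceiling. Column names here are only placeholders.
val schemaString = "col1 col2 col3"   // the real file has many more columns
val schema = StructType(
  schemaString.split(" ").map(fieldName => StructField(fieldName, StringType, true)))

// Parse each input line into a generic Row that matches the schema.
val rowRDD = sc.textFile("hdfs:///path/to/data.csv")   // hypothetical path
  .map(_.split(","))
  .map(p => Row(p(0), p(1), p(2)))

// Attach the schema and register the result as a table.
val schemaRDD = sqlContext.applySchema(rowRDD, schema)
schemaRDD.registerTempTable("records")
sqlContext.sql("SELECT col1 FROM records LIMIT 10").collect().foreach(println)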

That said, I would still like to know where I was going wrong with my
previous technique of extending the Product interface to get around the
22-field limit on case classes.
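
For anyone who hits this later: the general shape of that technique (not my
exact code, which is in the Stack Overflow post quoted below, and shown here
with only three hypothetical fields for brevity, whereas the real class needs
23+) is a plain class that implements the three Product members by hand:

class WideRecord(val name: String, val age: Int, val score: Double)
  extends Product with Serializable {   // Serializable so Spark can ship it

  // Total number of fields; must match the cases in productElement.
  def productArity: Int = 3

  // 0-based accessor over the fields, in declaration order.
  def productElement(n: Int): Any = n match {
    case 0 => name
    case 1 => age
    case 2 => score
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }

  def canEqual(that: Any): Boolean = that.isInstanceOf[WideRecord]
}

If the three members drift out of sync (say, productArity not matching the
number of cases in productElement), everything still compiles and only fails
at runtime, so that is one of the things I will be double-checking in my own
version.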

On Wed, Feb 25, 2015 at 9:45 AM, anamika gupta <anamika.guo...@gmail.com>
wrote:

> Hi Akhil
>
> I guess it slipped my attention. I will definitely give it a try.
>
> That said, I would still like to know: what is the issue with the way I
> have created the schema?
>
> On Tue, Feb 24, 2015 at 4:35 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Did you happen to have a look at
>> https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema
>>
>> Thanks
>> Best Regards
>>
>> On Tue, Feb 24, 2015 at 3:39 PM, anu <anamika.guo...@gmail.com> wrote:
>>
>>> My issue is posted on Stack Overflow. What am I doing wrong here?
>>>
>>>
>>> http://stackoverflow.com/questions/28689186/facing-error-while-extending-scala-class-with-product-interface-to-overcome-limi
>>>
>>>
>>
>>
>
