Upgraded to 1.1 and the issue is resolved. Thanks.

I still wonder if there is a better way to handle a dataset with a large
number of attributes.
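
For context, here is a minimal sketch of the nested-case-class workaround I
am using for now, to stay under Scala 2.10's 22-field case class limit. The
field names, delimiter, and split points below are illustrative, not from my
actual dataset:

// Illustrative only: group a wide (70+ column) record into nested case
// classes so that no single case class exceeds the 22-field limit.
case class ColumnSet(c1: Double, c2: Double, c3: Double)
case class InputData(eqn: String, ts: String, set1: ColumnSet, set2: ColumnSet)

// Parse one comma-separated line into the nested structure.
def parseLine(line: String): InputData = {
  val f = line.split(",")
  InputData(
    f(0),
    f(1),
    ColumnSet(f(2).toDouble, f(3).toDouble, f(4).toDouble),
    ColumnSet(f(5).toDouble, f(6).toDouble, f(7).toDouble)
  )
}

// In the Spark shell (sc is the SparkContext):
// val rows = sc.textFile("wide_data.csv").map(parseLine)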

On Fri, Sep 12, 2014 at 12:20 PM, Prashant Sharma <scrapco...@gmail.com>
wrote:

> What is your Spark version? This was fixed, I believe. Can you try it
> with the latest release?
>
> Prashant Sharma
>
>
>
> On Fri, Sep 12, 2014 at 9:47 PM, Ramaraju Indukuri <iramar...@gmail.com>
> wrote:
>
>> This is only a problem in the shell; it works fine in batch mode. I am
>> also interested in how others are working around the case class limit
>> on the number of fields.
>>
>> Regards
>> Ram
>>
>> On Fri, Sep 12, 2014 at 12:12 PM, iramaraju <iramar...@gmail.com> wrote:
>>
>>> I think this is a common issue, but I need help finding a way around it
>>> if it remains unresolved. I have a dataset with more than 70 columns. To
>>> fit all the columns into my RDD, I am experimenting with the following.
>>> (I intend to use InputData to parse the file, with 3 or 4 column sets to
>>> accommodate the full list of variables.)
>>>
>>> case class ColumnSet(C1: Double, C2: Double, C3: Double)
>>> case class InputData(EQN: String, ts: String, Set1: ColumnSet, Set2: ColumnSet)
>>>
>>> val set1 = ColumnSet(1, 2, 3)
>>> val a = InputData("a","a",set1,set1)
>>>
>>> returns the following
>>>
>>> <console>:16: error: type mismatch;
>>>  found   : ColumnSet
>>>  required: ColumnSet
>>>        val a = InputData("a","a",set1,set1)
>>>
>>> Whereas the same code works fine in my Scala console.
>>>
>>> Is there a workaround for my problem?
>>>
>>> Regards
>>> Ram
>>>
>>>
>>>
>>>
>>
>>
>> --
>> ------------------------------------------
>> Ramaraju Indukuri
>>
>
>


-- 
------------------------------------------
Ramaraju Indukuri
