Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-26 Thread Patrick Varilly
By the way, the limitation of case classes to 22 parameters was removed in Scala 2.11 (https://issues.scala-lang.org/browse/SI-7296, https://issues.scala-lang.org/browse/SI-7098); there's some technical rough edge past 22 (https://github.com/scala/scala/pull/2305) that you most likely will never run
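[Editor's note: a minimal sketch of what the Scala 2.11 change permits. The class and field names below are made up for illustration; the thread's real class (sdp_d) is not shown here.]

```scala
// On Scala 2.11+, a case class may have more than 22 parameters,
// so the workaround of hand-extending Product becomes unnecessary.
case class Wide(
  f1: Int, f2: Int, f3: Int, f4: Int, f5: Int, f6: Int,
  f7: Int, f8: Int, f9: Int, f10: Int, f11: Int, f12: Int,
  f13: Int, f14: Int, f15: Int, f16: Int, f17: Int, f18: Int,
  f19: Int, f20: Int, f21: Int, f22: Int, f23: Int)

object WideDemo {
  def main(args: Array[String]): Unit = {
    val w = Wide(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
                 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23)
    println(w.productArity) // 23
  }
}
```

The "rough edge" mentioned above is that some tuple-related conveniences still stop at 22, but ordinary field access and Product methods work fine.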

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-26 Thread anamika gupta
Hi Patrick, Thanks a ton for your in-depth answer. The compilation error is now resolved. Thanks a lot again! On Thu, Feb 26, 2015 at 2:40 PM, Patrick Varilly patrick.vari...@dataminded.be wrote: Hi, Akhil, In your definition of sdp_d

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
I am now getting the following error. I cross-checked my types and corrected three of them, i.e. r26: String, r27: Timestamp, r28: Timestamp. This error still persists. scala> sc.textFile("/home/cdhuser/Desktop/Sdp_d.csv").map(_.split(",")).map { r => | val upto_time = sdf.parse(r(23).trim); |

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
The link has proved helpful. I have been able to load data, register it as a table and perform simple queries. Thanks Akhil! Though I still look forward to knowing where I was going wrong with my previous technique of extending the Product interface to overcome the case class limit of 22 fields.

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread Akhil Das
It says sdp_d not found; since it is a class, you need to instantiate it once, like: sc.textFile("derby.log").map(_.split(",")).map( r => { val upto_time = sdf.parse(r(23).trim); calendar.setTime(upto_time); val r23 = new java.sql.Timestamp(upto_time.getTime);

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread Petar Zecevic
I believe your class needs to be defined as a case class (as I answered on SO). On 25.2.2015. 5:15, anamika gupta wrote: Hi Akhil I guess it skipped my attention. I would definitely give it a try. While I would still like to know what is the issue with the way I have created schema?
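[Editor's note: for context, the Product-extension workaround discussed in this thread means implementing the trait's members by hand. A minimal two-field sketch, assuming this is roughly what the original sdp_d class (which has 28+ fields) attempted; names here are illustrative.]

```scala
// Hand-rolled Product implementation: everything a case class would
// generate automatically must be written out manually.
class SdpRow(val id: Int, val name: String) extends Product with Serializable {
  def productArity: Int = 2
  def productElement(n: Int): Any = n match {
    case 0 => id
    case 1 => name
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
  def canEqual(that: Any): Boolean = that.isInstanceOf[SdpRow]
}

object ProductDemo {
  def main(args: Array[String]): Unit = {
    val row = new SdpRow(1, "abc")
    println(row.productArity)      // 2
    println(row.productElement(1)) // abc
  }
}
```

This compiles, but Spark's schema inference by reflection may still not pick up the fields of a plain class the way it does for a case class, which may be why Petar suggests a case class (or the programmatic-schema route below in the thread) instead.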

Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread anu
My issue is posted here on stack-overflow. What am I doing wrong here? http://stackoverflow.com/questions/28689186/facing-error-while-extending-scala-class-with-product-interface-to-overcome-limi

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread Akhil Das
Did you happen to have a look at https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema ? Thanks, Best Regards On Tue, Feb 24, 2015 at 3:39 PM, anu anamika.guo...@gmail.com wrote: My issue is posted here on stack-overflow. What am I doing wrong
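[Editor's note: a minimal sketch of the programmatic-schema approach from the linked guide, which avoids both case classes and the Product workaround. Shown with the Spark 1.3+ API (createDataFrame and org.apache.spark.sql.types); on the Spark 1.2 of this thread the equivalent call was sqlContext.applySchema. Column names, the file path, and local master are illustrative; this needs a Spark installation to run.]

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.types.{StructField, StructType, StringType}

object SchemaDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("schema-demo").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // Build the schema from a list of column names: there is no
    // 22-field limit here, since no case class is involved.
    val cols = (1 to 28).map(i => s"c$i")
    val schema = StructType(cols.map(StructField(_, StringType, nullable = true)))

    // Convert each CSV line to a Row and apply the schema.
    val rowRDD = sc.textFile("Sdp_d.csv").map(_.split(",")).map(Row.fromSeq(_))
    val df = sqlContext.createDataFrame(rowRDD, schema)
    df.registerTempTable("sdp_d")
    sqlContext.sql("SELECT c1 FROM sdp_d LIMIT 5").show()
  }
}
```

This is the route anamika reports success with earlier in the thread ("load data, register it as a table and perform simple queries").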

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread anamika gupta
Hi Akhil, I guess it skipped my attention. I would definitely give it a try. While I would still like to know what is the issue with the way I have created the schema? On Tue, Feb 24, 2015 at 4:35 PM, Akhil Das ak...@sigmoidanalytics.com wrote: Did you happen to have a look at