Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-26 Thread Patrick Varilly
By the way, the limitation of case classes to 22 parameters was removed in
Scala 2.11 (https://issues.scala-lang.org/browse/SI-7296,
https://issues.scala-lang.org/browse/SI-7098). There's some technical
rough edge (https://github.com/scala/scala/pull/2305) past 22 that you most
likely will never run into, but past 255 you run into underlying
limitations of the JVM (https://issues.scala-lang.org/browse/SI-7324).


Best,

Patrick

On Thu, Feb 26, 2015 at 11:58 AM, anamika gupta anamika.guo...@gmail.com
wrote:

 Hi Patrick

 Thanks a ton for your in-depth answer. The compilation error is now
 resolved.

 Thanks a lot again !!

 On Thu, Feb 26, 2015 at 2:40 PM, Patrick Varilly 
 patrick.vari...@dataminded.be wrote:

 Hi, Akhil,

 In your definition of sdp_d
 http://stackoverflow.com/questions/28689186/facing-error-while-extending-scala-class-with-product-interface-to-overcome-limi,
 all your fields are of type Option[X].  In Scala, a value of type Option[X]
 can hold one of two things:

 1. None
 2. Some(x), where x is of type X

 So to fix your immediate problem, wrap all your parameters to the sdp_d
 constructor in Some(...), as follows:

new sdp_d(Some(r(0).trim.toInt), Some(r(1).trim.toInt),
 Some(r(2).trim), ...
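
 As a minimal sketch of how the two Option cases behave (the values and the
 match below are purely illustrative, not the actual sdp_d fields):

    val present: Option[Int] = Some(42)          // holds a value
    val missing: Option[Int] = None              // holds nothing
    // Option(x) is a null-safe alternative to Some(x): it yields None for null
    val fromNull: Option[String] = Option(null)  // evaluates to None
    // Reading the value back out is typically done via pattern matching
    present match {
      case Some(v) => println(s"got $v")
      case None    => println("got nothing")
    }

 Some(x) does not check for null, which is fine here because already-parsed
 values like r(0).trim.toInt can never be null.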

 As for your earlier question of why writing sdp_d(...) works for a case
 class but you need to write new sdp_d(...) for an explicit class, there's a
 simple answer.  When you create a case class X in Scala, Scala also makes a
 companion object X behind the scenes with an apply method that calls new
 (see below).  Scala's rules will call this apply method automatically.  So,
 when you write X(...), you're really calling X.apply(...), which in turn
 calls new X(...).  (This is the same trick behind writing things like
 List(1,2,3).)  If you don't use a case class, you'd have to make the
 companion object yourself explicitly.

 For reference, this statement:

case class X(a: A, b: B)

 is conceptually equivalent to

class X(val a: A, val b: B) extends ... {

   override def toString: String = // Auto-generated
   override def hashCode: Int = // Auto-generated
   override def equals(that: Any): Boolean = // Auto-generated

   ... more convenience methods ...
}

object X {
   def apply(a: A, b: B) = new X(a, b)
   ... more convenience methods ...
}
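
 For instance, assuming A = Int and B = Double just for the sake of a concrete
 sketch, all three of the following calls are equivalent:

    val x1 = X(1, 2.0)         // syntactic sugar the compiler rewrites to...
    val x2 = X.apply(1, 2.0)   // ...this explicit call on the companion object,
    val x3 = new X(1, 2.0)     // ...which itself just invokes the constructor
    // and the generated equals makes them compare equal: x1 == x2 && x2 == x3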

 If you want to peek under the hood, try compiling a simple X.scala file
 with the line case class X(a: Int, b: Double), then taking apart the
 generated X.class and X$.class (e.g., javap X.class).

 More info here
 http://docs.scala-lang.org/tutorials/tour/case-classes.html, here
 http://www.scala-lang.org/docu/files/ScalaReference.pdf and in Programming
 in Scala http://www.artima.com/shop/programming_in_scala_2ed ch 15.

 Hope that helps!

 Best,

 Patrick

 On Thu, Feb 26, 2015 at 6:37 AM, anamika gupta anamika.guo...@gmail.com
 wrote:

 I am now getting the following error. I cross-checked my types and
 corrected three of them, i.e. r26 -> String, r27 -> Timestamp,
 r28 -> Timestamp. This error still persists.

 scala> sc.textFile("/home/cdhuser/Desktop/Sdp_d.csv").map(_.split(",")).map { r =>
  | val upto_time = sdf.parse(r(23).trim);
  | calendar.setTime(upto_time);
  | val r23 = new java.sql.Timestamp(upto_time.getTime)
  | val insert_time = sdf.parse(r(27).trim)
  | calendar.setTime(insert_time)
  | val r27 = new java.sql.Timestamp(insert_time.getTime)
  | val last_upd_time = sdf.parse(r(28).trim)
  | calendar.setTime(last_upd_time)
  | val r28 = new java.sql.Timestamp(last_upd_time.getTime)
  | new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
 r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
 r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
 r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
 r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
 r(25).trim, r(26).trim, r27, r28)
  | }.registerAsTable("sdp_d")

 <console>:26: error: type mismatch;
  found   : Int
  required: Option[Int]
   new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
 r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
 r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
 r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
 r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
 r(25).trim, r(26).trim, r27, r28)

 On Wed, Feb 25, 2015 at 2:32 PM, Akhil Das ak...@sigmoidanalytics.com
 wrote:

 It says sdp_d not found, since it is a class you need to instantiate it
 once. like:

 sc.textFile("derby.log").map(_.split(",")).map( r => {
   val upto_time = sdf.parse(r(23).trim);
   calendar.setTime(upto_time);
   val r23 = new java.sql.Timestamp(upto_time.getTime);

   val insert_time = sdf.parse(r(26).trim);
   calendar.setTime(insert_time);
   val r26 = new 

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-26 Thread anamika gupta
Hi Patrick

Thanks a ton for your in-depth answer. The compilation error is now
resolved.

Thanks a lot again !!


Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
I am now getting the following error. I cross-checked my types and
corrected three of them, i.e. r26 -> String, r27 -> Timestamp,
r28 -> Timestamp. This error still persists.

scala> sc.textFile("/home/cdhuser/Desktop/Sdp_d.csv").map(_.split(",")).map { r =>
 | val upto_time = sdf.parse(r(23).trim);
 | calendar.setTime(upto_time);
 | val r23 = new java.sql.Timestamp(upto_time.getTime)
 | val insert_time = sdf.parse(r(27).trim)
 | calendar.setTime(insert_time)
 | val r27 = new java.sql.Timestamp(insert_time.getTime)
 | val last_upd_time = sdf.parse(r(28).trim)
 | calendar.setTime(last_upd_time)
 | val r28 = new java.sql.Timestamp(last_upd_time.getTime)
 | new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
r(25).trim, r(26).trim, r27, r28)
 | }.registerAsTable("sdp_d")

<console>:26: error: type mismatch;
 found   : Int
 required: Option[Int]
  new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
r(25).trim, r(26).trim, r27, r28)


Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
The link has proved helpful. I have been able to load data, register it as
a table and perform simple queries. Thanks Akhil !!

Though, I still look forward to knowing where I was going wrong with my
previous technique of extending the Product Interface to overcome case
class's limit of 22 fields.
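
For reference, the programmatic-schema route from that guide looks roughly like
this with the Spark 1.2-era API (only two made-up columns are shown here; the
real sdp_d row has 29 fields, and applySchema/registerTempTable are assumptions
based on that guide rather than the exact code from this thread):

    import org.apache.spark.sql._

    val sqlContext = new SQLContext(sc)

    // Describe the columns explicitly instead of relying on a case class,
    // so the 22-field limit never comes into play
    val schema = StructType(Seq(
      StructField("sdp_id",   IntegerType, nullable = true),
      StructField("sdp_name", StringType,  nullable = true)))

    // Turn each CSV line into a Row whose positions match the schema
    val rowRDD = sc.textFile("/home/cdhuser/Desktop/Sdp_d.csv")
      .map(_.split(","))
      .map(r => Row(r(0).trim.toInt, r(2).trim))

    val sdpSchemaRDD = sqlContext.applySchema(rowRDD, schema)
    sdpSchemaRDD.registerTempTable("sdp_d")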


Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread Akhil Das
It says sdp_d not found; since it is a class, you need to instantiate it
once, like:

sc.textFile("derby.log").map(_.split(",")).map( r => {
  val upto_time = sdf.parse(r(23).trim);
  calendar.setTime(upto_time);
  val r23 = new java.sql.Timestamp(upto_time.getTime);

  val insert_time = sdf.parse(r(26).trim);
  calendar.setTime(insert_time);
  val r26 = new java.sql.Timestamp(insert_time.getTime);

  val last_upd_time = sdf.parse(r(27).trim);
  calendar.setTime(last_upd_time);
  val r27 = new java.sql.Timestamp(last_upd_time.getTime);

  new sdp_d(r(0).trim.toInt, r(1).trim.toInt, r(2).trim,
r(3).trim.toInt, r(4).trim.toInt, r(5).trim, r(6).trim.toInt, r(7).trim,
r(8).trim.toDouble, r(9).trim.toDouble, r(10).trim, r(11).trim, r(12).trim,
r(13).trim, r(14).trim, r(15).trim, r(16).trim, r(17).trim, r(18).trim,
r(19).trim, r(20).trim, r(21).trim.toInt, r(22).trim, r23, r(24).trim,
r(25).trim, r26, r27, r(28).trim)
  }).registerAsTable("sdp")

Thanks
Best Regards


Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread Petar Zecevic


I believe your class needs to be defined as a case class (as I answered
on SO).
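
For context, extending Product by hand means supplying the members the compiler
would otherwise generate for a case class. A minimal sketch with two illustrative
fields (the class and field names are made up, not the actual 29-column sdp_d):

    class SdpLike(val id: Option[Int], val name: Option[String])
        extends Product with Serializable {

      // What a case class would auto-generate:
      def canEqual(that: Any): Boolean = that.isInstanceOf[SdpLike]
      def productArity: Int = 2
      def productElement(n: Int): Any = n match {
        case 0 => id
        case 1 => name
        case _ => throw new IndexOutOfBoundsException(n.toString)
      }
    }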




Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread Akhil Das
Did you happen to have a look at
https://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema

Thanks
Best Regards

On Tue, Feb 24, 2015 at 3:39 PM, anu anamika.guo...@gmail.com wrote:

 My issue is posted here on stack-overflow. What am I doing wrong here?


 http://stackoverflow.com/questions/28689186/facing-error-while-extending-scala-class-with-product-interface-to-overcome-limi

 --
 View this message in context: Facing error while extending scala class
 with Product interface to overcome limit of 22 fields in spark-shell
 http://apache-spark-user-list.1001560.n3.nabble.com/Facing-error-while-extending-scala-class-with-Product-interface-to-overcome-limit-of-22-fields-in-spl-tp21787.html
 Sent from the Apache Spark User List mailing list archive
 http://apache-spark-user-list.1001560.n3.nabble.com/ at Nabble.com.



Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread anamika gupta
Hi Akhil

I guess it skipped my attention. I would definitely give it a try.

While I would still like to know what is the issue with the way I have
created schema?
