)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Thanks
Ravi Aggarwal
[mailto:ianoconn...@gmail.com] On Behalf Of Ian O'Connell
Sent: Wednesday, July 20, 2016 11:05 PM
To: Ravi Aggarwal <raagg...@adobe.com>
Cc: Ted Yu <yuzhih...@gmail.com>; user <user@spark.apache.org>
Subject: Re: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2
…sort-merge join.
Can we deduce anything from this?
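(For readers hitting the same symptom: one common first check, not something established anywhere in this thread, is whether Spark 2.0 picked a BroadcastHashJoin where 1.5.2 used a sort-merge join. Calling `explain()` on the joined DataFrame shows the chosen strategy, and automatic broadcasting can be turned off to force sort-merge joins:)

```properties
# spark-defaults.conf (or set at runtime via spark.conf.set):
# a negative threshold disables automatic broadcast joins entirely
spark.sql.autoBroadcastJoinThreshold  -1
```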
Thanks
Ravi
From: Ravi Aggarwal
Sent: Friday, June 10, 2016 12:31 PM
To: 'Ted Yu' <yuzhih...@gmail.com>
Cc: user <user@spark.apache.org>
Subject: RE: OutOfMemory when doing joins in spark 2.0 while same code runs fine in spark 1.5.2
Hi Ted,
        CatalystTypeConverters.convertToScala(
          Cast(Literal(value._2), colDataType).eval(),
          colDataType)
      }).toArray
      Row(recordFields: _*)
    }
    rowRdd
  }
}
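(The snippet above is only a fragment of the row-building code. For context, here is a minimal, Spark-free sketch of the same idea: cast each raw string field to its column's type before assembling the record. Everything in it — `ColType`, `castField`, the sample schema — is hypothetical and merely stands in for Catalyst's `Cast`/`Literal`/`CatalystTypeConverters` machinery:)

```scala
// Hypothetical stand-ins for Catalyst's DataType hierarchy.
sealed trait ColType
case object IntCol    extends ColType
case object DoubleCol extends ColType
case object StringCol extends ColType

// Plays the role of Cast(Literal(raw), colDataType).eval() followed by
// CatalystTypeConverters.convertToScala: turn a raw string into a typed value.
def castField(raw: String, t: ColType): Any = t match {
  case IntCol    => raw.toInt
  case DoubleCol => raw.toDouble
  case StringCol => raw
}

val schema = Seq("id" -> IntCol, "score" -> DoubleCol, "name" -> StringCol)
val raw    = Map("id" -> "7", "score" -> "3.5", "name" -> "ravi")

// Analogue of the `recordFields` array that feeds Row(recordFields: _*).
val recordFields: Array[Any] =
  schema.map { case (col, t) => castField(raw(col), t) }.toArray

println(recordFields.mkString(", "))
```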
Thanks
Ravi
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Thursday, June 9, 2016 7:56 PM
To: Ravi Aggarwal <ra