Re: Measure Bytes Read and Peak Memory Usage for Query

2015-03-24 Thread anamika gupta
Yeah, thanks, I can now see the memory usage. Could you also verify whether bytes read == combined size of all RDDs? All my RDDs are fully cached in memory, so the combined size of my RDDs equals the memory used (verified from the Web UI). On Fri, Mar 20, 2015 at 12:07 PM, Akhil Das

Re: Optimizing SQL Query

2015-03-09 Thread anamika gupta
Please find the query plan: scala> sqlContext.sql("SELECT dw.DAY_OF_WEEK, dw.HOUR, avg(dw.SDP_USAGE) AS AVG_SDP_USAGE FROM (SELECT sdp.WID, DAY_OF_WEEK, HOUR, SUM(INTERVAL_VALUE) AS SDP_USAGE FROM (SELECT * FROM date_d AS dd JOIN interval_f AS intf ON intf.DATE_WID = dd.WID WHERE intf.DATE_WID =
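The nested query above is a two-stage aggregation: an inner SUM of INTERVAL_VALUE per (WID, DAY_OF_WEEK, HOUR), then an outer AVG of those sums per (DAY_OF_WEEK, HOUR). A minimal pure-Scala sketch of the same logic (field names mirror the query, but the `Interval` class and sample data are made up for illustration):

```scala
// Sketch of the query's two-stage aggregation, outside Spark.
// Inner stage: SUM(INTERVAL_VALUE) per (WID, DAY_OF_WEEK, HOUR).
// Outer stage: AVG of those sums per (DAY_OF_WEEK, HOUR).
case class Interval(wid: Int, dayOfWeek: String, hour: Int, value: Double)

def avgSdpUsage(rows: Seq[Interval]): Map[(String, Int), Double] = {
  val perSdp = rows
    .groupBy(r => (r.wid, r.dayOfWeek, r.hour))
    .map { case (k, rs) => (k, rs.map(_.value).sum) }   // inner SUM
  perSdp
    .groupBy { case ((_, dow, h), _) => (dow, h) }
    .map { case (k, m) => (k, m.values.sum / m.size) }  // outer AVG
}

val data = Seq(
  Interval(1, "Mon", 9, 2.0), Interval(1, "Mon", 9, 4.0),
  Interval(2, "Mon", 9, 10.0))
println(avgSdpUsage(data)(("Mon", 9))) // 8.0  ((2+4) and 10, averaged)
```

Seeing the intent this way can also help when comparing the SQL against the physical plan Spark produces.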

Re: Facing error: java.lang.ArrayIndexOutOfBoundsException while executing SparkSQL join query

2015-02-28 Thread anamika gupta
The issue is now resolved. One of the CSV files had an incorrect record at the end. On Fri, Feb 27, 2015 at 4:24 PM, anamika gupta anamika.guo...@gmail.com wrote: I have three tables with the following schema: case class date_d(WID: Int, CALENDAR_DATE: java.sql.Timestamp, DATE_STRING
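For readers hitting the same error: a record with the wrong number of fields makes positional access on the split array throw ArrayIndexOutOfBoundsException. A minimal sketch of filtering such records before mapping into a schema class, assuming a hypothetical 3-field layout rather than the actual date_d schema:

```scala
// Sketch: drop malformed CSV records (e.g. a bad trailing line) before
// mapping fields into a schema class. The 3-field layout is illustrative.
case class Row3(wid: Int, date: String, dow: String)

def parseValid(lines: Seq[String]): Seq[Row3] =
  lines
    .map(_.split(",", -1))   // limit -1 keeps trailing empty fields
    .filter(_.length == 3)   // discard records with the wrong arity
    .map(a => Row3(a(0).trim.toInt, a(1).trim, a(2).trim))

val lines = Seq("1,2015-02-27,Friday", "2,2015-02-28,Saturday", "garbage")
println(parseValid(lines).length) // 2 -- the bad trailing record is dropped
```

In Spark the same filter can be applied to the RDD of lines before the map that builds the case-class objects, so one bad record does not fail the whole job.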

Facing error: java.lang.ArrayIndexOutOfBoundsException while executing SparkSQL join query

2015-02-27 Thread anamika gupta
I have three tables with the following schema: case class date_d(WID: Int, CALENDAR_DATE: java.sql.Timestamp, DATE_STRING: String, DAY_OF_WEEK: String, DAY_OF_MONTH: Int, DAY_OF_YEAR: Int, END_OF_MONTH_FLAG: String, YEARWEEK: Int, CALENDAR_MONTH: String, MONTH_NUM: Int, YEARMONTH: Int, QUARTER:

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-26 Thread anamika gupta
http://www.scala-lang.org/docu/files/ScalaReference.pdf and in Programming in Scala http://www.artima.com/shop/programming_in_scala_2ed ch 15. Hope that helps! Best, Patrick On Thu, Feb 26, 2015 at 6:37 AM, anamika gupta anamika.guo...@gmail.com wrote: I am now getting the following
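For reference, the Product-based workaround discussed in this thread can be sketched as follows. This is a minimal illustration, not the poster's actual class: the field names are made up, and a real schema class would carry all 22+ fields (the limit that Scala 2.10 case classes cannot exceed).

```scala
// Sketch: a plain class mixing in Product to get past the 22-field
// case-class limit in Scala 2.10. Field names are illustrative; a real
// class would list every column of the table.
class WideRecord(val wid: Int, val dayOfWeek: String, val hour: Int
                 /* ...more than 22 fields in the real class... */)
    extends Product with Serializable {
  private val fields = Array[Any](wid, dayOfWeek, hour)

  override def productArity: Int = fields.length          // number of fields
  override def productElement(n: Int): Any = fields(n)    // 0-based access
  override def canEqual(that: Any): Boolean = that.isInstanceOf[WideRecord]
}

val r = new WideRecord(1, "Mon", 9)
println(r.productArity)      // 3 in this sketch
println(r.productElement(1)) // Mon
```

Spark SQL's schema inference walks productArity/productElement, so the elements must appear in the same order as the declared fields.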

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
) }).registerAsTable(sdp) Thanks Best Regards On Wed, Feb 25, 2015 at 2:14 PM, anamika gupta anamika.guo...@gmail.com wrote: The link has proved helpful. I have been able to load data, register it as a table and perform simple queries. Thanks Akhil !! Though, I still look forward to knowing where I

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-25 Thread anamika gupta
On Wed, Feb 25, 2015 at 9:45 AM, anamika gupta anamika.guo...@gmail.com wrote: Hi Akhil I guess it skipped my attention. I would definitely give it a try. While I would still like to know what is the issue with the way I have created schema? On Tue, Feb 24, 2015 at 4:35 PM, Akhil Das ak

Re: Facing error while extending scala class with Product interface to overcome limit of 22 fields in spark-shell

2015-02-24 Thread anamika gupta
Hi Akhil, I guess it skipped my attention. I will definitely give it a try. Still, I would like to know what the issue is with the way I have created the schema. On Tue, Feb 24, 2015 at 4:35 PM, Akhil Das ak...@sigmoidanalytics.com wrote: Did you happen to have a look at