Hi guys,
I was trying to find counters in Spark for the amount of CPU or memory used (in some metric) by a task/stage/job, but I could not find any.
Is there any such counter available?
Thank you,
Robert
Guys,
Do you have any thoughts on this?
Thanks,
Robert
On Sunday, April 12, 2015 5:35 PM, Grandl Robert rgra...@yahoo.com.INVALID wrote:
Hi guys,
I was trying to find counters in Spark for the amount of CPU or memory used (in some metric) by a task/stage/job, but I could not find any.
Is there any such counter available?
Thank you,
Robert
The transformations and actions are:
http://spark.apache.org/docs/1.2.0/programming-guide.html#transformations
http://spark.apache.org/docs/1.2.0/programming-guide.html#actions
Cheers,
Sean
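[Editor's note: a common reason an accumulator reads 0 in the driver is that transformations like map are lazy; nothing executes until an action forces evaluation. Plain Python's lazy map gives a minimal non-Spark analogy of this, matching the "should be 9, prints 0" symptom described below:]

```python
counter = 0

def bump(x):
    # Stand-in for a counter update inside a Spark map function.
    global counter
    counter += 1
    return x

mapped = map(bump, range(9))  # lazy, like an RDD transformation: nothing has run yet
print(counter)                # 0 -- the driver-side read happens before any work
list(mapped)                  # consuming the iterator plays the role of an action
print(counter)                # 9
```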
On Feb 13, 2015, at 9:50 AM, nitinkak001 nitinkak...@gmail.com wrote:
I am trying to implement counters in Spark and I guess Accumulators are the way to do it.
My motive is to update a counter in a map function and access/reset it in the driver code. However, the println statement at the end still yields value 0 (it should be 9). Am I doing something wrong?
def main
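[Editor's note: the code was truncated here, but the pitfall it describes is general: each worker mutates its own copy of a closed-over variable, which is why Spark provides accumulators to merge updates back to the driver. A minimal non-Spark sketch of the copy problem using Python worker processes; the names here are illustrative, not from the original post:]

```python
from multiprocessing import Pool

counter = 0

def bump(x):
    # Runs in a worker process: increments that process's own copy of counter.
    global counter
    counter += 1
    return x

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        pool.map(bump, range(9))
    # Each worker incremented a private copy; the parent's value is unchanged.
    print(counter)  # 0
```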