Under core/src/test/scala/org/apache/spark, you will find many
examples of the map function.
FYI
On Mon, Oct 19, 2015 at 10:35 AM, Shepherd wrote:
> Hi all, I am new to Spark and Scala. I have a question about doing a
> calculation. I am using "groupBy" to generate key-value
Hi all, I am new to Spark and Scala. I have a question about doing a
calculation. I am using "groupBy" to generate key-value pairs, where each
value points to a subset of the original RDD. The RDD has four columns, and
each subset RDD may have a different number of rows. For example, the
original code is like
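Since the original code was not included, here is a minimal Spark-free sketch of the pattern described, using plain Scala collections (which share `groupBy` semantics with RDDs); the four-column rows and the per-subset average are invented for illustration:

```scala
// Invented four-column rows (key, a, b, c); the real schema was not shown.
val rows = Seq(("x", 1, 2, 3), ("x", 4, 5, 6), ("y", 7, 8, 9))

// groupBy: key -> all rows sharing that key. The subsets can have
// different sizes, as described -- here "x" has two rows, "y" has one.
val grouped = rows.groupBy(_._1)

// A per-subset calculation, e.g. the average of the second column.
val avgOfSecond = grouped.map { case (k, rs) =>
  k -> rs.map(_._2).sum.toDouble / rs.size
}
```

On an RDD the same `groupBy` call shuffles every full row to its key's partition, which is why an aggregation-only job is usually better served by `reduceByKey`.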
Are you by any chance looking for reduceByKey? If you’re trying to collapse all
the values in V into an aggregate, that’s what you should be looking at.
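A sketch of the difference, emulated with plain Scala collections so it runs without Spark (the pairs are invented; Spark's `reduceByKey` additionally combines partial results map-side before the shuffle, so it never materializes a key's full group the way `groupBy` does):

```scala
// Invented (key, value) pairs standing in for the RDD's pairs.
val pairs = Seq(("x", 1), ("x", 4), ("y", 7))

// groupBy-style: collect every value for a key, then aggregate the group.
val viaGroup = pairs.groupBy(_._1).map { case (k, vs) => k -> vs.map(_._2).sum }

// reduceByKey-style: fold values in with an associative function, one at a
// time -- the associativity is what lets Spark pre-combine per partition.
val viaReduce = pairs.foldLeft(Map.empty[String, Int]) {
  case (acc, (k, v)) => acc.updated(k, acc.getOrElse(k, 0) + v)
}
// Both yield the same per-key sums.
```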
-adrian
From: Ted Yu
Date: Monday, October 19, 2015 at 9:16 PM
To: Shepherd
Cc: user
Subject: Re: How to calculate row by row and output