Hi Felix,
Just ran your code, and it prints:
Pi is roughly 4.0
Here is the code I used; since you didn't show what `random` is, I used
`nextInt()`:
val n = math.min(10L * slices, Int.MaxValue).toInt // avoid overflow
val count = context.sparkContext.parallelize(1 until n, slices).map
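For comparison, here is a minimal sketch of the shape of the classic SparkPi example (assuming `spark` is an active `SparkSession`; the variable names are illustrative). In that example, `random` is `scala.math.random`, which returns a `Double` in [0, 1), so `random * 2 - 1` lands in the unit square. Substituting `Random.nextInt()`, which returns integers over the whole `Int` range, breaks that sampling and skews the estimate:

```scala
import scala.math.random // Double in [0.0, 1.0)

val slices = 2
val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
val count = spark.sparkContext.parallelize(1 until n, slices).map { _ =>
  val x = random * 2 - 1 // uniform in [-1, 1)
  val y = random * 2 - 1
  if (x * x + y * y <= 1) 1 else 0 // inside the unit circle?
}.reduce(_ + _)
println(s"Pi is roughly ${4.0 * count / (n - 1)}")
```

The estimate is 4 times the fraction of points that fall inside the unit circle, which only works if the points are uniform over the square.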
Dear Dirceu,
Below is our testing code. As you can see, we have used the "reduce" action to
trigger evaluation. However, execution still did not stop at breakpoint-1 (as
shown in the code snippet) when debugging.
We are using IntelliJ IDEA 14.0.3 to debug. This is very strange to us. Please
help.
Sorry, it wasn't the count, it was the reduce method that retrieves
information from the RDD.
It has to go through all the RDD values to return the result.
2016-09-16 11:18 GMT-03:00 chen yong :
Also, I wonder what the right way to debug a Spark program is. If I use ten
anonymous functions in one Spark program, then to debug each of them I have to
place a COUNT action in advance and remove it after debugging. Is that the
right way?
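The pattern under discussion can be sketched as follows (a minimal example, assuming a standard `SparkSession` named `spark`; the names are illustrative). Transformations like `map` are lazy, so a breakpoint inside the closure is only hit once some action runs; any action (`count`, `reduce`, `collect`, ...) will force evaluation, which is why a temporary `count()` works, if clunkily:

```scala
val rdd = spark.sparkContext.parallelize(1 to 100)

val mapped = rdd.map { x =>
  val y = x * 2 // a breakpoint here is only hit once an action runs
  y
}

// Nothing above has executed yet: transformations are lazy.
// Any action forces evaluation of the lineage:
mapped.count() // temporary action, added just to drive the debugger
// or, equivalently for triggering purposes:
val sum = mapped.reduce(_ + _)
```

Note that each action re-runs the lineage (unless the RDD is cached), so the breakpoint may be hit once per action.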
From: Dirceu
Dear Dirceu,
I am totally confused. In your reply you mentioned "...the count does that,
...". However, in the code snippet shown in the attachment file
FelixProblem.png of your previous mail, I cannot find any 'count' ACTION being
called. Would you please clearly show me the line where it is