I fixed the bug, but I kept the parameter "i" instead of "_" since that (1)
keeps it more parallel to the Python and Java versions, which also use
functions with a named variable, and (2) doesn't require readers to know
this particular use of the "_" syntax in Scala.
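Presumably the corrected example on the page now reads something like the following: the brace form of the closure (as Mark suggested) plus the named parameter. This is a sketch only; it assumes a SparkContext bound to `spark` and a `NUM_SAMPLES` constant already defined, as on the examples page.

```scala
// Sketch of the fixed Pi-estimation example, assuming `spark` is a
// SparkContext and NUM_SAMPLES is defined elsewhere on the page.
// Braces (rather than parentheses) are required around a multi-statement
// closure body, and the parameter is named "i" instead of "_".
val count = spark.parallelize(1 to NUM_SAMPLES).map { i =>
  val x = Math.random()
  val y = Math.random()
  if (x*x + y*y < 1) 1 else 0
}.reduce(_ + _)
println("Pi is roughly " + 4.0 * count / NUM_SAMPLES)
```

The named parameter `i` is unused in the body, so `_` would also compile; the choice here is purely for readability across the language variants of the example.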

Thanks for catching this, Glenn.

Andy


On Fri, May 16, 2014 at 12:38 PM, Mark Hamstra <m...@clearstorydata.com> wrote:

> Sorry, looks like an extra line got inserted in there.  One more try:
>
> val count = spark.parallelize(1 to NUM_SAMPLES).map { _ =>
>   val x = Math.random()
>   val y = Math.random()
>   if (x*x + y*y < 1) 1 else 0
> }.reduce(_ + _)
>
>
>
> On Fri, May 16, 2014 at 12:36 PM, Mark Hamstra <m...@clearstorydata.com> wrote:
>
> > Actually, the better way to write the multi-line closure would be:
> >
> > val count = spark.parallelize(1 to NUM_SAMPLES).map { _ =>
> >
> >   val x = Math.random()
> >   val y = Math.random()
> >   if (x*x + y*y < 1) 1 else 0
> > }.reduce(_ + _)
> >
> >
> > On Fri, May 16, 2014 at 9:41 AM, GlennStrycker <glenn.stryc...@gmail.com> wrote:
> >
> >> On the webpage http://spark.apache.org/examples.html, there is an
> >> example written as
> >>
> >> val count = spark.parallelize(1 to NUM_SAMPLES).map(i =>
> >>   val x = Math.random()
> >>   val y = Math.random()
> >>   if (x*x + y*y < 1) 1 else 0
> >> ).reduce(_ + _)
> >> println("Pi is roughly " + 4.0 * count / NUM_SAMPLES)
> >>
> >> This does not execute in Spark, which gives me an error:
> >> <console>:2: error: illegal start of simple expression
> >>          val x = Math.random()
> >>          ^
> >>
> >> If I rewrite the query slightly, adding in {}, it works:
> >>
> >> val count = spark.parallelize(1 to 10000).map(i =>
> >>    {
> >>    val x = Math.random()
> >>    val y = Math.random()
> >>    if (x*x + y*y < 1) 1 else 0
> >>    }
> >> ).reduce(_ + _)
> >> println("Pi is roughly " + 4.0 * count / 10000.0)
> >>
> >>
> >>
> >>
> >>
> >> --
> >> View this message in context:
> >> http://apache-spark-developers-list.1001551.n3.nabble.com/Scala-examples-for-Spark-do-not-work-as-written-in-documentation-tp6593.html
> >> Sent from the Apache Spark Developers List mailing list archive at
> >> Nabble.com.
> >>
> >
> >
>
