I forgot… it does the same thing (the same ambiguity error) with the reducer:

    int dartsInCircle = dotsDs.reduce((x, y) -> x + y);
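For what it's worth, the usual Spark 3 workaround short of a named class is an explicit cast to the Java functional interface, i.e. casting the lambdas to `ReduceFunction<Integer>` and `MapFunction<Row, Integer>`, so the compiler stops considering the Scala `Function1` overload. Here is a minimal, Spark-free sketch of the mechanism; the interfaces and `FakeDataset` below are stand-in stubs I wrote for illustration, not the real Spark classes:

```java
import java.io.Serializable;

// Stand-ins (NOT real Spark classes) for the two functional interfaces
// that Dataset.map accepts: the Java-friendly MapFunction and Scala's
// Function1. With one overload per interface in scope, a bare Java
// lambda matches both, so the compiler reports the call as ambiguous.
interface MapFunction<T, U> extends Serializable {
  U call(T value) throws Exception;
}

interface Function1<T, U> {
  U apply(T value);
}

class FakeDataset<T> {
  private final T value;

  FakeDataset(T value) { this.value = value; }

  // Two overloads mirroring Dataset.map's Java and Scala variants.
  <U> FakeDataset<U> map(MapFunction<T, U> f) throws Exception {
    return new FakeDataset<>(f.call(value));
  }

  <U> FakeDataset<U> map(Function1<T, U> f) {
    return new FakeDataset<>(f.apply(value));
  }

  T first() { return value; }
}

public class CastFixDemo {
  public static void main(String[] args) throws Exception {
    FakeDataset<Integer> ds = new FakeDataset<>(3);
    // ds.map(v -> v + 1);  // would NOT compile: ambiguous between overloads
    // An explicit cast picks one overload and resolves the ambiguity:
    FakeDataset<Integer> out = ds.map((MapFunction<Integer, Integer>) v -> v + 1);
    System.out.println(out.first()); // prints 4
  }
}
```

Applied to the snippets in this thread, that would mean `.map((MapFunction<Row, Integer>) status -> { … }, Encoders.INT())` and `.reduce((ReduceFunction<Integer>) (x, y) -> x + y)`; the named-class version works for the same reason, since it nails down which interface is meant.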

jg

> On Dec 28, 2019, at 12:38 PM, Jean-Georges Perrin <j...@jgp.net> wrote:
> 
> Hey guys,
> 
> This code:
> 
>     Dataset<Row> incrementalDf = spark
>         .createDataset(l, Encoders.INT())
>         .toDF();
>     Dataset<Integer> dotsDs = incrementalDf
>         .map(status -> {
>           double x = Math.random() * 2 - 1;
>           double y = Math.random() * 2 - 1;
>           counter++;
>           if (counter % 100000 == 0) {
>             System.out.println("" + counter + " darts thrown so far");
>           }
>           return (x * x + y * y <= 1) ? 1 : 0;
>         }, Encoders.INT());
> 
> used to work with Spark 2.x, but in the two most recent releases it says:
> 
> The method map(Function1<Row,Integer>, Encoder<Integer>) is ambiguous for the 
> type Dataset<Row>
> 
> If I define my mapping function as a class, it works fine. Here is the class:
> 
>   private final class DartMapper
>       implements MapFunction<Row, Integer> {
>     private static final long serialVersionUID = 38446L;
> 
>     @Override
>     public Integer call(Row r) throws Exception {
>       double x = Math.random() * 2 - 1;
>       double y = Math.random() * 2 - 1;
>       counter++;
>       if (counter % 1000 == 0) {
>         System.out.println("" + counter + " operations done so far");
>       }
>       return (x * x + y * y <= 1) ? 1 : 0;
>     }
>   }
> 
> Any hint on what I did wrong, if anything? 
> 
> jg
> 
> 
> 
