[ https://issues.apache.org/jira/browse/SPARK-32051?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17146722#comment-17146722 ]

Jungtaek Lim edited comment on SPARK-32051 at 6/27/20, 1:42 AM:
----------------------------------------------------------------

OK, so the lambda needs to explicitly return Unit. Thanks for the information.
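
For reference, here is a minimal sketch of what "explicitly return Unit" could look like for the reported snippet (assuming Spark 3.0.x on Scala 2.12; the type ascription below is just one possible workaround, not necessarily the one we would document):

{code:scala}
// Sketch of a workaround: ascribing the function type Iterator[java.lang.Long] => Unit
// makes the result type explicit, so overload resolution picks the Scala
// foreachPartition(f: Iterator[T] => Unit) overload instead of the Java
// ForeachPartitionFunction[T] one, and `part` is no longer inferred as Object.
spark.range(100)
  .repartition(10)
  .foreachPartition((part => println(part.toList)): Iterator[java.lang.Long] => Unit)

// Explicitly typing the parameter should work as well:
// .foreachPartition((part: Iterator[java.lang.Long]) => println(part.toList))
{code}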

[~srowen] Do you have any idea about this? My guess is that Scala 2.12 allows a lambda 
to be converted to the Java functional interface overload, and that is what leads to 
the confusion. I haven't played with the Java side, so I'm not sure whether removing 
one of the two methods would work, and even if it works it may break backward 
compatibility. I also have no experience with Scala 2.13, so I'm a bit hesitant to 
try out changes.
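
In case it helps the discussion, this is roughly the pair of overloads I mean; the trait below is only an illustration of their shape, not the actual Dataset source:

{code:scala}
import org.apache.spark.api.java.function.ForeachPartitionFunction

// Hypothetical stand-in for Dataset[T], showing only the two overloads in question.
trait DatasetLike[T] {
  def foreachPartition(f: Iterator[T] => Unit): Unit            // Scala-facing overload
  def foreachPartition(func: ForeachPartitionFunction[T]): Unit // Java-facing overload
}

// With Scala 2.12, a bare lambda like `part => println(part.toList)` can also be
// SAM-converted to ForeachPartitionFunction[T], so the call no longer resolves
// cleanly to the Scala overload and `part` seems to end up typed as Object,
// which matches the "value map is not a member of Object" error in the report.
{code}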



> Dataset.foreachPartition returns object
> ---------------------------------------
>
>                 Key: SPARK-32051
>                 URL: https://issues.apache.org/jira/browse/SPARK-32051
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Frank Oosterhuis
>            Priority: Major
>
> I'm trying to map values from a Dataset[Row], but since 3.0.0 this fails.
> In 3.0.0 I get the error: "Error:(28, 38) value map is not a member of Object"
>
> This is the simplest code that works in 2.4.x, but fails in 3.0.0:
> {code:scala}
> spark.range(100)
>   .repartition(10)
>   .foreachPartition(part => println(part.toList))
> {code}


