[jira] [Commented] (SPARK-33077) Spark 3 / Cats 2.2.0 classpath issue

2021-06-11 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33077?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17362005#comment-17362005
 ] 

Guillaume Martres commented on SPARK-33077:
---

Breeze 1.2 was recently released and includes the upgrade to a non-milestone 
version of Spire.

> Spark 3 / Cats 2.2.0 classpath issue
> 
>
> Key: SPARK-33077
> URL: https://issues.apache.org/jira/browse/SPARK-33077
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core, Spark Shell
>Affects Versions: 3.0.1
>Reporter: Sami Dalouche
>Priority: Major
>
> A small project with minimal dependencies as well as instructions on how to 
> reproduce the issue is available at:
> [https://github.com/samidalouche/spark3-cats220]
> Executing this code works fine with cats 2.1.1 but fails with cats 2.2.0, 
> which is quite surprising since the spark and cats dependencies are pretty 
> much distinct from each other.
>  
> {code:java}
> java.lang.NoSuchMethodError: 'void 
> cats.kernel.CommutativeSemigroup.$init$(cats.kernel.CommutativeSemigroup)'
>  at cats.UnorderedFoldable$$anon$1.<init>(UnorderedFoldable.scala:78)
>  at cats.UnorderedFoldable$.<init>(UnorderedFoldable.scala:78)
>  at cats.UnorderedFoldable$.<clinit>(UnorderedFoldable.scala)
>  at cats.data.NonEmptyListInstances$$anon$2.<init>(NonEmptyList.scala:539)
>  at cats.data.NonEmptyListInstances.<init>(NonEmptyList.scala:539)
>  at cats.data.NonEmptyList$.<init>(NonEmptyList.scala:458)
>  at cats.data.NonEmptyList$.<clinit>(NonEmptyList.scala)
>  at catsspark.Boom$.assumeValid_$bang(boom.scala:19)
>  at catsspark.Boom$.boom(boom.scala:14)
>  ... 47 elided{code}
> Thanks in advance for looking into this.
> I submitted the same issue to cats' bug tracker: 
> https://github.com/typelevel/cats/issues/3628
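
A quick way to check whether two incompatible cats-kernel versions end up on 
the classpath is sbt's eviction report. The build.sbt fragment below is a 
minimal sketch (versions taken from the reproducer above; the override is an 
illustration of the diagnosis, not a confirmed fix, since Spark's own runtime 
classpath can still win):

{code:java}
// build.sbt -- sketch: surface and pin competing cats-kernel versions.
// Running `sbt evicted` reports which cats-kernel versions were in play.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.0.1" % "provided",
  "org.typelevel"    %% "cats-core"  % "2.2.0"
)
// Force a single cats-kernel across the whole dependency graph.
dependencyOverrides += "org.typelevel" %% "cats-kernel" % "2.2.0"
{code}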



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2021-05-17 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17346063#comment-17346063
 ] 

Guillaume Martres commented on SPARK-25075:
---

Note that the Scala 2.13 snapshots of Spark are usable from Scala 3 just like 
other Scala 2.13 libraries 
([https://scalacenter.github.io/scala-3-migration-guide/docs/compatibility/classpath.html]).
 I don't think there's a JIRA issue about building Spark against Scala 3, but 
there was a thread on the mailing list: 
[http://apache-spark-developers-list.1001551.n3.nabble.com/Scala-3-support-approach-td30316.html]
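
For reference, a minimal build.sbt sketch of that setup, assuming sbt 1.5+ 
(where CrossVersion.for3Use2_13 plays the role of sbt-dotty's withDottyCompat; 
the version numbers are placeholders):

{code:java}
// build.sbt -- consume the Scala 2.13 build of Spark from a Scala 3 project.
scalaVersion := "3.0.0"
libraryDependencies +=
  ("org.apache.spark" %% "spark-core" % "3.2.0-SNAPSHOT")
    .cross(CrossVersion.for3Use2_13) // resolves spark-core_2.13
{code}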

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-34507) Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12

2021-03-10 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17299003#comment-17299003
 ] 

Guillaume Martres commented on SPARK-34507:
---

> In this release, which also has 2.11 vs 2.12 versions, the 2.12 version has 
> the same issue. I can't figure out whether a) that actually should work for 
> some reason or b) that actually doesn't work!

I think it's always been wrong, but sbt has some logic to align the versions of 
the scala-library/reflect/compiler dependencies which can mask the problem in 
most situations (see 
https://github.com/sbt/librarymanagement/blob/develop/ivy/src/main/scala/sbt/internal/librarymanagement/IvyScalaUtil.scala).
 However, this logic doesn't kick in when using sbt-dotty as 
[https://github.com/vincenzobaz/spark-scala3] does. I don't know what happens 
in other build tools.
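
For the curious, that alignment logic appears to hang off sbt's 
overrideScalaVersion flag; a one-line sketch of the setting involved, shown 
only to illustrate the masking behaviour described above:

{code:java}
// build.sbt -- when overrideScalaVersion is on, sbt rewrites
// transitively-resolved scala-library/reflect/compiler dependencies to the
// project's own scalaVersion, which is why the wrong <scala.version> in the
// published POMs usually goes unnoticed.
scalaModuleInfo := scalaModuleInfo.value.map(_.withOverrideScalaVersion(true))
{code}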

> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -
>
> Key: SPARK-34507
> URL: https://issues.apache.org/jira/browse/SPARK-34507
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Guillaume Martres
>Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/],
>  but they seem to depend on Scala 2.12. Specifically if I look at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
>  I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>{code}
> It looks like 
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
>  needs to be updated to also change the `scala.version` and not just the 
> `scala.binary.version`.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-34507) Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12

2021-03-10 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17298970#comment-17298970
 ] 

Guillaume Martres commented on SPARK-34507:
---

[~srowen] This issue is about the artifacts published in 
[https://repository.apache.org/content/repositories/snapshots/org/apache/spark/], 
see 
[http://apache-spark-developers-list.1001551.n3.nabble.com/FYI-Scala-2-13-Maven-Artifacts-td30616.html]

These appear to be published after running the script that modifies the POMs, 
but my contention is that this script is incomplete and should also update 
`scala.version`.

> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -
>
> Key: SPARK-34507
> URL: https://issues.apache.org/jira/browse/SPARK-34507
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Guillaume Martres
>Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/],
>  but they seem to depend on Scala 2.12. Specifically if I look at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
>  I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>{code}
> It looks like 
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
>  needs to be updated to also change the `scala.version` and not just the 
> `scala.binary.version`.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Reopened] (SPARK-34507) Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12

2021-03-10 Thread Guillaume Martres (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Guillaume Martres reopened SPARK-34507:
---

Reopened now that we have an sbt project demonstrating the problem.

> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -
>
> Key: SPARK-34507
> URL: https://issues.apache.org/jira/browse/SPARK-34507
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Guillaume Martres
>Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/],
>  but they seem to depend on Scala 2.12. Specifically if I look at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
>  I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>{code}
> It looks like 
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
>  needs to be updated to also change the `scala.version` and not just the 
> `scala.binary.version`.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-34507) Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12

2021-02-25 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-34507?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17291311#comment-17291311
 ] 

Guillaume Martres commented on SPARK-34507:
---

> I think scala-2.13 profile should override this property:

 * What decides which profile gets used?
 * If the profile is indeed used, then why does 
[https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
 also change the default value of scala.binary.version but not scala.version? 
Shouldn't they match?

> are you really facing some issues when you are pulling Scala 2.13 artifacts 
> from Maven?

Yes, here's a simple reproducer: using [coursier|https://get-coursier.io/] to 
fetch Spark, I get the following jars:
{code:java}
% cs fetch -r https://repository.apache.org/content/repositories/snapshots 
org.apache.spark:spark-tags_2.13:3.2.0-SNAPSHOT
/home/smarter/.cache/coursier/v1/https/repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-tags_2.13/3.2.0-SNAPSHOT/spark-tags_2.13-3.2.0-20210225.010548-31.jar
/home/smarter/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.12.10/scala-library-2.12.10.jar
/home/smarter/.cache/coursier/v1/https/repo1.maven.org/maven2/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar
 {code}
Coursier is used by sbt to resolve dependencies, so I assume this affects at 
least all sbt projects, but I just found out this is a known issue in 
coursier: [https://github.com/coursier/coursier/issues/973]


> Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12
> -
>
> Key: SPARK-34507
> URL: https://issues.apache.org/jira/browse/SPARK-34507
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.2.0
>Reporter: Guillaume Martres
>Priority: Major
>
> Snapshots of Spark 3.2 built against Scala 2.13 are available at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/],
>  but they seem to depend on Scala 2.12. Specifically if I look at 
> [https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
>  I see:
> {code:java}
> <scala.version>2.12.10</scala.version>
> <scala.binary.version>2.13</scala.binary.version>{code}
> It looks like 
> [https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
>  needs to be updated to also change the `scala.version` and not just the 
> `scala.binary.version`.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-34507) Spark artefacts built against Scala 2.13 incorrectly depend on Scala 2.12

2021-02-23 Thread Guillaume Martres (Jira)
Guillaume Martres created SPARK-34507:
-

 Summary: Spark artefacts built against Scala 2.13 incorrectly 
depend on Scala 2.12
 Key: SPARK-34507
 URL: https://issues.apache.org/jira/browse/SPARK-34507
 Project: Spark
  Issue Type: Sub-task
  Components: Build
Affects Versions: 3.2.0
Reporter: Guillaume Martres


Snapshots of Spark 3.2 built against Scala 2.13 are available at 
[https://repository.apache.org/content/repositories/snapshots/org/apache/spark/],
 but they seem to depend on Scala 2.12. Specifically if I look at 
[https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210223.010629-29.pom]
 I see:
{code:java}
<scala.version>2.12.10</scala.version>
<scala.binary.version>2.13</scala.binary.version>{code}
It looks like 
[https://github.com/apache/spark/blob/8f994cbb4a18558c2e81516ef1e339d9c8fa0d41/dev/change-scala-version.sh#L65]
 needs to be updated to also change the `scala.version` and not just the 
`scala.binary.version`.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2021-02-23 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17289104#comment-17289104
 ] 

Guillaume Martres commented on SPARK-25075:
---

I've opened https://issues.apache.org/jira/browse/SPARK-34507.

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2021-02-19 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17287109#comment-17287109
 ] 

Guillaume Martres commented on SPARK-25075:
---

[~dongjoon] I think something is wrong with the published snapshots: they seem 
to depend on both Scala 2.12 and Scala 2.13 artifacts, leading to crashes at 
runtime. Indeed, if I look at 
[https://repository.apache.org/content/repositories/snapshots/org/apache/spark/spark-parent_2.13/3.2.0-SNAPSHOT/spark-parent_2.13-3.2.0-20210219.011324-25.pom]
 I see:

<scala.version>2.12.10</scala.version>

So I assume a config file wasn't updated somewhere.

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2021-02-11 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17283059#comment-17283059
 ] 

Guillaume Martres commented on SPARK-25075:
---

Looks like 2.13 snapshot builds are now available: 
http://apache-spark-developers-list.1001551.n3.nabble.com/FYI-Scala-2-13-Maven-Artifacts-td30616.html,
 thanks [~dongjoon]!

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2021-01-04 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17258218#comment-17258218
 ] 

Guillaume Martres commented on SPARK-25075:
---

Now that 2.13 support is basically complete, would it be possible to publish a 
preview release of Spark 3.1 built against Scala 2.13 on Maven for testing 
purposes? Thanks!

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-33285) Too many "Auto-application to `()` is deprecated." related compilation warnings

2020-12-07 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33285?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17245370#comment-17245370
 ] 

Guillaume Martres commented on SPARK-33285:
---

> But it seems that we can't suppress all compilation warnings, such as 
> auto-application, which is different from eta-zero and eta-sam

For warnings which don't have their own category, you can suppress them by 
matching the warning message with a regexp; `scalac -Wconf:help` gives the 
details of the syntax.
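
For example, a sketch of such a filter (the regex is an assumption, matched 
against the warning text quoted in this issue):

{code:java}
// build.sbt -- silence only the auto-application warnings by message regex
// (supported since Scala 2.13.2); everything else is still reported.
scalacOptions += "-Wconf:msg=Auto-application:s"
{code}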

 

Also note that there is a scalafix rule to automatically fix this warning: 
`ExplicitNonNullaryApply` in https://github.com/scala/scala-rewrites

> Too many "Auto-application to `()` is deprecated."  related compilation 
> warnings
> 
>
> Key: SPARK-33285
> URL: https://issues.apache.org/jira/browse/SPARK-33285
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Priority: Minor
>
> There are too many "Auto-application to `()` is deprecated." related 
> compilation warnings when compiling with Scala 2.13, like
> {code:java}
> [WARNING] [Warn] 
> /spark-src/core/src/test/scala/org/apache/spark/PartitioningSuite.scala:246: 
> Auto-application to `()` is deprecated. Supply the empty argument list `()` 
> explicitly to invoke method stdev,
> or remove the empty argument list from its definition (Java-defined methods 
> are exempt).
> In Scala 3, an unapplied method like this will be eta-expanded into a 
> function.
> {code}
> A lot of them, but it's easy to fix.
> If there is a definition as follows:
> {code:java}
> class Foo {
>   def bar(): Unit = {}
> }
> val foo = new Foo{code}
> Should be
> {code:java}
> foo.bar()
> {code}
> not
> {code:java}
> foo.bar {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-33344) Fix Compilation warnings of "multiarg infix syntax looks like a tuple and will be deprecated" in Scala 2.13

2020-11-15 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33344?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17232414#comment-17232414
 ] 

Guillaume Martres commented on SPARK-33344:
---

Note that this warning will actually not be displayed by default anymore 
starting with Scala 2.13.4 (which should be released soon): 
https://github.com/scala/scala/pull/9089
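
For reference, a minimal sketch of the warned-about pattern (the Logger class 
is hypothetical, it only illustrates the syntax):

{code:java}
class Logger {
  def log(level: String, msg: String): Unit = println(s"[$level] $msg")
}
val l = new Logger
l log ("INFO", "started") // multiarg infix: the argument list reads like a tuple
l.log("INFO", "started")  // explicit-dot form: no warning
{code}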

> Fix Compilation warnings of "multiarg infix syntax looks like a tuple and will 
> be deprecated" in Scala 2.13
> --
>
> Key: SPARK-33344
> URL: https://issues.apache.org/jira/browse/SPARK-33344
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Priority: Minor
>
> There is a batch of compilation warnings in Scala 2.13 as follows:
> {code:java}
> [WARNING] [Warn] 
> /spark/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala:656: 
> multiarg infix syntax looks like a tuple and will be deprecated
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-33285) Too many "Auto-application to `()` is deprecated." related compilation warnings

2020-11-04 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33285?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17226251#comment-17226251
 ] 

Guillaume Martres commented on SPARK-33285:
---

Note that Scala 2.13 has a configurable warning mechanism, making it possible 
to hide some warnings ([https://github.com/scala/scala/pull/8373]). This can 
be combined with {{-Xfatal-warnings}} to enforce a warning-free build without 
actually fixing all warnings.
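
A sketch of that combination (the silenced category is an arbitrary example, 
not a recommendation for Spark's build):

{code:java}
// build.sbt -- hide one chosen warning category but fail the build on the rest.
scalacOptions ++= Seq(
  "-Wconf:cat=deprecation:s", // silence deprecation warnings wholesale
  "-Xfatal-warnings"          // any warning still emitted becomes an error
)
{code}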

> Too many "Auto-application to `()` is deprecated."  related compilation 
> warnings
> 
>
> Key: SPARK-33285
> URL: https://issues.apache.org/jira/browse/SPARK-33285
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Priority: Minor
>
> There are too many "Auto-application to `()` is deprecated." related 
> compilation warnings when compiling with Scala 2.13, like
> {code:java}
> [WARNING] [Warn] 
> /spark-src/core/src/test/scala/org/apache/spark/PartitioningSuite.scala:246: 
> Auto-application to `()` is deprecated. Supply the empty argument list `()` 
> explicitly to invoke method stdev,
> or remove the empty argument list from its definition (Java-defined methods 
> are exempt).
> In Scala 3, an unapplied method like this will be eta-expanded into a 
> function.
> {code}
> A lot of them, but it's easy to fix.
> If there is a definition as follows:
> {code:java}
> class Foo {
>   def bar(): Unit = {}
> }
> val foo = new Foo{code}
> Should be
> {code:java}
> foo.bar()
> {code}
> not
> {code:java}
> foo.bar {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-33285) Too many "Auto-application to `()` is deprecated." related compilation warnings

2020-11-04 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-33285?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17226040#comment-17226040
 ] 

Guillaume Martres commented on SPARK-33285:
---

{quote}
Similarly, there are many "symbol literal is degraded" warnings too, but this 
can only be fixed after Scala 2.12 is no longer supported
{quote}

 

Replacing {{'foo}} by {{Symbol("foo")}} will get rid of the warning and is 
compatible with all Scala versions.
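
A one-line sketch of that rewrite:

{code:java}
val s1 = 'foo           // symbol literal: deprecated in 2.13, dropped in Scala 3
val s2 = Symbol("foo")  // equivalent value, compiles cleanly on 2.12 and 2.13
{code}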

> Too many "Auto-application to `()` is deprecated."  related compilation 
> warnings
> 
>
> Key: SPARK-33285
> URL: https://issues.apache.org/jira/browse/SPARK-33285
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 3.1.0
>Reporter: Yang Jie
>Priority: Minor
>
> There are too many "Auto-application to `()` is deprecated." related 
> compilation warnings when compiling with Scala 2.13, like
> {code:java}
> [WARNING] [Warn] 
> /spark-src/core/src/test/scala/org/apache/spark/PartitioningSuite.scala:246: 
> Auto-application to `()` is deprecated. Supply the empty argument list `()` 
> explicitly to invoke method stdev,
> or remove the empty argument list from its definition (Java-defined methods 
> are exempt).
> In Scala 3, an unapplied method like this will be eta-expanded into a 
> function.
> {code}
> A lot of them, but it's easy to fix.
> If there is a definition as follows:
> {code:java}
> class Foo {
>   def bar(): Unit = {}
> }
> val foo = new Foo{code}
> Should be
> {code:java}
> foo.bar()
> {code}
> not
> {code:java}
> foo.bar {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2020-06-27 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17146941#comment-17146941
 ] 

Guillaume Martres commented on SPARK-25075:
---

2.13.3 has just been released with a fix for the miscompilation issue I 
mentioned above.

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2020-05-15 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17108310#comment-17108310
 ] 

Guillaume Martres commented on SPARK-25075:
---

I've found what seems to be a miscompilation issue in Scala 2.13 with my branch 
of Spark; see https://github.com/scala/bug/issues/12002

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-05-03 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17098459#comment-17098459
 ] 

Guillaume Martres commented on SPARK-29292:
---

Right, so ideally you'd change Seq to collection.Seq for parameter types but 
not for result types; that might be tricky to manage, though, and it looks 
like it was already considered and dismissed in the discussion you linked to.

Going back to the issue with copying when using toSeq everywhere, one way to 
reduce that would be to use builders when possible: instead of creating an 
ArrayBuffer, filling it, then copying it using toSeq, one can create a 
`Seq.newBuilder`, fill it, then call `result` on it to get back an immutable 
Seq without copying. (This might be worse than using ArrayBuffer in 2.12, 
because the default builder will construct a List; ideally one would use 
immutable.ArraySeq.newBuilder to get something backed by a plain Array, but 
that one doesn't exist in 2.12.) If most of the collections being copied are 
small, then this might not make a significant difference anyway.
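
A small sketch of the two approaches side by side:

{code:java}
import scala.collection.mutable.ArrayBuffer

// 2.12-style: fill a mutable buffer, then toSeq copies it on 2.13.
val buf = ArrayBuffer[Int]()
buf += 1
buf += 2
val copied: Seq[Int] = buf.toSeq // O(n) copy on 2.13

// Builder-style: result() hands back an immutable Seq without an extra copy
// (the default Seq builder produces a List).
val builder = Seq.newBuilder[Int]
builder += 1
builder += 2
val direct: Seq[Int] = builder.result()
{code}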

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-05-02 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17098141#comment-17098141
 ] 

Guillaume Martres commented on SPARK-29292:
---

Seq in 2.13 is scala.collection.immutable.Seq, which is a subtype of 
scala.collection.Seq, so if a method takes a scala.collection.Seq as a 
parameter then it will accept both a 2.12 and a 2.13 plain Seq as input (on 
the other hand, if the result type of a method is scala.collection.Seq, then 
it can't be assigned to a plain Seq in 2.13).
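
A compact sketch of that asymmetry on 2.13:

{code:java}
import scala.collection.mutable.ArrayBuffer

def takesAny(xs: scala.collection.Seq[Int]): Int = xs.sum

takesAny(List(1, 2, 3))        // immutable Seq: fine on 2.12 and 2.13
takesAny(ArrayBuffer(1, 2, 3)) // mutable Seq: also fine, it's a subtype

def returnsAny: scala.collection.Seq[Int] = ArrayBuffer(1, 2, 3)
// val s: Seq[Int] = returnsAny // error on 2.13: Seq there means immutable.Seq
{code}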

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-05-02 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17098074#comment-17098074
 ] 

Guillaume Martres commented on SPARK-29292:
---

(by the way, adding "import scala.collection.Seq" at the top of a file is a 
simple way to change all the occurrences of "Seq" in that file to behave the 
same in 2.12 and 2.13)

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-05-02 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17098066#comment-17098066
 ] 

Guillaume Martres commented on SPARK-29292:
---

> as the change will otherwise be very hard to manage across 2.12 vs 2.13

It's not trivial, but it might not be that bad: in 2.12, "scala.Seq" and 
"scala.collection.Seq" are the same type, so you can replace one with the 
other without breaking anything, while changing the behavior on Scala 2.13 to 
be more in line with 2.12.

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-04-30 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17096893#comment-17096893
 ] 

Guillaume Martres commented on SPARK-29292:
---

If it's immutable it's fine, yeah, but it seems that Spark internally uses a 
bunch of ArrayBuffers, which do need to be copied to be made into a scala.Seq 
now. On top of that, users of Spark might also have to add a bunch of 
potentially-copying .toSeq calls to call Spark methods. For example, I had 
some code that did `sparkContext.parallelize(rdd.take(1000))`, which still 
compiles but with a deprecation warning because take returns an Array:

> warning: method copyArrayToImmutableIndexedSeq in class LowPriorityImplicits2 
> is deprecated (since 2.13.0): Implicit conversions from Array to 
> immutable.IndexedSeq are implemented by copying; Use the more efficient 
> non-copying ArraySeq.unsafeWrapArray or an explicit toIndexedSeq call

So depending on how common this sort of thing is, it might make sense to 
change SparkContext#parallelize and others to take a scala.collection.Seq 
instead of a Seq.
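
A sketch of the non-copying alternative the warning points at (a 2.13-only 
API, so it only becomes an option once 2.12 support is dropped):

{code:java}
import scala.collection.immutable.ArraySeq

val arr: Array[Int] = Array(1, 2, 3)
val copied: IndexedSeq[Int] = arr.toIndexedSeq // copies the array on 2.13
// Wraps the array without copying; unsafe if it is mutated afterwards.
val wrapped: IndexedSeq[Int] = ArraySeq.unsafeWrapArray(arr)
{code}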

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-04-30 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17096876#comment-17096876
 ] 

Guillaume Martres commented on SPARK-29292:
---

Thanks [~srowen], I got spark to compile with Scala 2.13.2 based on this 
branch, cf 
https://issues.apache.org/jira/browse/SPARK-25075?focusedCommentId=17096870&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17096870.
 But I think that the use of .toSeq should be reconsidered since it means 
copying on Scala 2.13. Instead, I think that usages of scala.Seq should be 
replaced by scala.collection.Seq when it makes sense to do so.

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Comment Edited] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-04-30 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17096876#comment-17096876
 ] 

Guillaume Martres edited comment on SPARK-29292 at 4/30/20, 6:47 PM:
-

Thanks [~srowen], I got spark to compile with Scala 2.13.2 based on this 
branch, cf 
https://issues.apache.org/jira/browse/SPARK-25075?focusedCommentId=17096870&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17096870.
 But I think that the use of .toSeq should be reconsidered since it means 
copying on Scala 2.13 for mutable collections. Instead, I think that usages of 
scala.Seq should be replaced by scala.collection.Seq when it makes sense to do 
so.


was (Author: smarter):
Thanks [~srowen], I got spark to compile with Scala 2.13.2 based on this 
branch, cf 
https://issues.apache.org/jira/browse/SPARK-25075?focusedCommentId=17096870&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-17096870.
 But I think that the use of .toSeq should be reconsidered since it means 
copying on Scala 2.13. Instead, I think that usages of scala.Seq should be 
replaced by scala.collection.Seq when it makes sense to do so.

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2020-04-30 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17096870#comment-17096870
 ] 

Guillaume Martres commented on SPARK-25075:
---

Based on [~srowen]'s branch from SPARK-29292, I was able to get spark-core and 
its dependencies to compile with Scala 2.13.2 after making a few changes. The 
result is at https://github.com/smarter/spark/tree/scala-2.13, I've published 
it at https://bintray.com/smarter/maven for my own needs but I don't intend to 
update it or work on it further.

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25075) Build and test Spark against Scala 2.13

2020-04-23 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-25075?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17090582#comment-17090582
 ] 

Guillaume Martres commented on SPARK-25075:
---

> SPARK-30132 which is blocked on Scala 2.13.2

2.13.2 is out now.

> SPARK-27683 and SPARK-30090 sound like non-negligible effort as well

The first one isn't actually a blocker: as noted in that issue, TraversableOnce 
is still there (as an alias) in 2.13.

> Build and test Spark against Scala 2.13
> ---
>
> Key: SPARK-25075
> URL: https://issues.apache.org/jira/browse/SPARK-25075
> Project: Spark
>  Issue Type: Umbrella
>  Components: Build, MLlib, Project Infra, Spark Core, SQL
>Affects Versions: 3.0.0
>Reporter: Guillaume Massé
>Priority: Major
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.13 milestone.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-9621) Closure inside RDD doesn't properly close over environment

2020-04-14 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-9621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17083639#comment-17083639
 ] 

Guillaume Martres commented on SPARK-9621:
--

> unfixed as of scala 2.12.10 / 2.13.1

... but fixed in Dotty, so it's really not a great idea to rely on that.

> Closure inside RDD doesn't properly close over environment
> --
>
> Key: SPARK-9621
> URL: https://issues.apache.org/jira/browse/SPARK-9621
> Project: Spark
>  Issue Type: Bug
>Affects Versions: 1.4.1
> Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
>Reporter: Joe Near
>Priority: Major
>
> I expect the following:
> case class MyTest(i: Int)
> val tv = MyTest(1)
> val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)
> to be "true." It is "false," when I type this into spark-shell. It seems the 
> closure is changed somehow when it's serialized and deserialized.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-29292) Fix internal usages of mutable collection for Seq in 2.13

2020-02-07 Thread Guillaume Martres (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-29292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17032671#comment-17032671
 ] 

Guillaume Martres commented on SPARK-29292:
---

> I can try updating my fork and pushing the commits to a branch, for 
> evaluation in _2.12_ at least. 

I'd be interested in seeing this branch too.

>  I just wasn't bothering until it seemed like 2.13 support blockers were 
> removed

What are the other blockers for 2.13 support?

> Fix internal usages of mutable collection for Seq in 2.13
> -
>
> Key: SPARK-29292
> URL: https://issues.apache.org/jira/browse/SPARK-29292
> Project: Spark
>  Issue Type: Sub-task
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: Sean R. Owen
>Assignee: Sean R. Owen
>Priority: Minor
>
> Kind of related to https://issues.apache.org/jira/browse/SPARK-27681, but a 
> simpler subset. 
> In 2.13, a mutable collection can't be returned as a 
> {{scala.collection.Seq}}. It's easy enough to call .toSeq on these as that 
> still works on 2.12.
> {code}
> [ERROR] [Error] 
> /Users/seanowen/Documents/spark_2.13/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:467:
>  type mismatch;
>  found   : Seq[String] (in scala.collection) 
>  required: Seq[String] (in scala.collection.immutable) 
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org