[ https://issues.apache.org/jira/browse/SPARK-19810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15900548#comment-15900548 ]
Min Shen commented on SPARK-19810:
----------------------------------

[~srowen],

I want to get an idea of the timeline for removing Scala 2.10. We have heavy usage of Spark at LinkedIn, and we are still deploying Spark built with Scala 2.10 because various other systems we depend on still rely on Scala 2.10. We do plan to move our internal systems to Scala 2.11, but that will take a while. In the meantime, if support for Scala 2.10 is removed in Spark 2.2, it could block us from upgrading to Spark 2.2+ before we have fully moved off Scala 2.10. I want to raise this concern here and also understand the timeline for removing Scala 2.10 from Spark.

> Remove support for Scala 2.10
> -----------------------------
>
>                 Key: SPARK-19810
>                 URL: https://issues.apache.org/jira/browse/SPARK-19810
>             Project: Spark
>          Issue Type: Task
>          Components: ML, Spark Core, SQL
>    Affects Versions: 2.1.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Critical
>
> This tracks the removal of Scala 2.10 support, as discussed in
> http://apache-spark-developers-list.1001551.n3.nabble.com/Straw-poll-dropping-support-for-things-like-Scala-2-10-td19553.html
> and other lists.
> The primary motivations are to simplify the code and build, and to enable
> Scala 2.12 support later.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
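
Supplementary note (not part of the original message): the concern above comes down to Scala's lack of binary compatibility across major versions. Spark artifacts are published per Scala binary version (spark-core_2.10 vs spark-core_2.11), so a downstream build pinned to Scala 2.10 can only consume Spark releases that still publish _2.10 artifacts. A minimal, hypothetical build.sbt sketch (illustrative project and version numbers, not LinkedIn's actual build) showing where that coupling appears:

    // build.sbt -- hypothetical downstream project still pinned to Scala 2.10
    scalaVersion := "2.10.6"

    libraryDependencies ++= Seq(
      // %% appends the Scala binary-version suffix, so these resolve to
      // spark-core_2.10 / spark-sql_2.10. A Spark release that drops Scala 2.10
      // support would no longer publish _2.10 artifacts for these modules.
      "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
    )

Once Spark stops publishing _2.10 artifacts, a project like this would have to move scalaVersion to a 2.11.x release, along with all of its own Scala dependencies, before it could pick up the newer Spark version.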