There are actually some code conditionalizations for 2.11 vs. 2.12 that 
could go away if we drop 2.11, which would be an added benefit.
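
To make that concrete, here is a minimal sketch of the kind of cross-version 
conditionalization an sbt build typically carries (the setting and directory 
names are illustrative assumptions, not Daffodil's actual build):

    // build.sbt (illustrative sketch): select a version-specific source
    // directory depending on which Scala version is being cross-built.
    Compile / unmanagedSourceDirectories += {
      val src = (Compile / sourceDirectory).value
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, 11)) => src / "scala-2.11"  // removable once 2.11 is dropped
        case _             => src / "scala-2.12+"
      }
    }

Once 2.11 is gone, the whole match can collapse to a single source tree.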

I would support getting rid of 2.11.

I did use an Apache Zeppelin notebook with Daffodil that needed 2.11. It was 
required to use 2.11 because it was integrated with an older revision of 
Apache Spark that was still restricted to 2.11. So it is not enough for 
Apache Spark itself to support 2.12; things like Zeppelin that use Apache 
Spark must also have moved from the 2.11 to the 2.12 version of Spark.

I just checked, however, and the Apache Zeppelin source tree's spark folder 
has 2.10, 2.11, and 2.12 subdirectories, so I think this is not an issue.

________________________________
From: Interrante, John A (GE Research, US) <inter...@research.ge.com>
Sent: Wednesday, September 30, 2020 2:20 PM
To: dev@daffodil.apache.org <dev@daffodil.apache.org>
Subject: Can Daffodil drop support for Scala 2.11?

I'd like to ask the Apache Daffodil developers to weigh in on this question:

                Does Daffodil need to run on Scala 2.11 anymore?

I've been told the only reason we're still publishing Scala 2.11 builds is 
for Apache Spark, which kept working only on Scala 2.11 for a long time even 
after Scala 2.12 came out.  However, the Spark 2.4 releases have been 
running on both 2.11 and 2.12 since November 2018, and Scala 2.12 has become 
the default language for the Spark 3.0 releases.  In fact, the Spark 3.0 
releases have removed support for 2.11, although they have not yet completed 
all of the changes needed to support Scala 2.13.

Does anyone know of any reason why Daffodil needs to continue building on 
Scala 2.11?  My motivation for asking is that my pull request uses an open 
source library called os-lib in the runtime2 backend, but os-lib has not 
published any new Scala 2.11 builds since March 2019.
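
For what it's worth, the build change would amount to something like the 
following sketch (the version numbers here are illustrative assumptions, not 
necessarily what Daffodil would pin):

    // build.sbt (illustrative sketch): cross-build without 2.11, which
    // also unblocks depending on os-lib (no 2.11 artifacts since March 2019).
    crossScalaVersions := Seq("2.12.12")  // previously Seq("2.12.12", "2.11.12")
    libraryDependencies += "com.lihaoyi" %% "os-lib" % "0.7.1"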

John
