As Scala 2.11 is the default for 2.4.x, we currently include _2.11 artifacts in 
our release.  Our Python library depends on the Scala artifacts.
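
For context, here is a rough sbt sketch (artifact names and versions are 
illustrative, not our actual build) of how the Scala binary version ends up 
baked into the artifact names: the %% operator appends the suffix derived from 
scalaVersion when resolving and publishing.

    // build.sbt -- illustrative only
    scalaVersion := "2.11.12"

    libraryDependencies ++= Seq(
      // %% resolves these to spark-core_2.11 and spark-sql_2.11 here;
      // with scalaVersion set to 2.12.x it would pick the _2.12 artifacts instead
      "org.apache.spark" %% "spark-core" % "2.4.1" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.4.1" % "provided"
    )

Our published jars carry the same _2.11 suffix, so they have to match the 
Scala version the Spark distribution itself was built against.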

With Homebrew, we have -submit and -shell wrapper scripts around spark-submit 
and spark-shell, and those will break at runtime if our _2.11 artifacts are run 
on a Spark built for Scala 2.12.
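
As a quick way to see which Scala line a given Spark binary was built for (a 
small sketch, assuming you have both distributions on hand), the Scala standard 
library on spark-shell's classpath reports the version Spark shipped with:

    // paste into spark-shell from each distribution;
    // prints the Scala version bundled with that Spark build,
    // e.g. 2.11.x for the 2.4.1 binaries vs 2.12.x for the 2.4.2 binaries
    println(scala.util.Properties.versionNumberString)

Because Scala 2.11 and 2.12 are not binary compatible, a _2.11 jar loaded into 
a 2.12 build typically fails with linkage errors (NoSuchMethodError and the 
like) only once the mismatched code paths are hit at runtime.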


> On Apr 26, 2019, at 11:12 AM, Sean Owen <sro...@gmail.com> wrote:
> 
> Yeah I don't think the pyspark change was intentional; I'm trying to help 
> assess what the impact is though. 
> 
> It may be a dumb question, but what problem does the change cause? Is it 
> beyond what I mentioned below? Do you have a project with interdependent 
> Python and Scala components?
> 
> On Fri, Apr 26, 2019 at 11:02 AM Michael Heuer <heue...@gmail.com> wrote:
> We certainly can't be the only project downstream of Spark that includes 
> Scala-versioned artifacts in our release.  Our Python library on PyPI depends 
> on pyspark, our Bioconda recipe depends on the pyspark Conda recipe, and our 
> Homebrew formula depends on the apache-spark Homebrew formula.
> 
> Using Scala 2.12 in the binary distribution for Spark 2.4.2 was unintentional 
> and never voted on.  There was a successful vote to default to Scala 2.12 in 
> Spark version 3.0.
> 
>    michael
> 
> 
>> On Apr 26, 2019, at 9:52 AM, Sean Owen <sro...@gmail.com> wrote:
>> 
>> To be clear, what's the nature of the problem there... just PySpark apps 
>> that are using a Scala-based library? Trying to make sure we understand what 
>> is and isn't a problem here.
> 
