To be clear, what's the nature of the problem there... just PySpark apps
that use a Scala-based library? Trying to make sure we understand what is
and isn't a problem here.
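
For context, here's a minimal sketch of the failure mode as I understand
it. The library coordinate org.example:somelib_2.11 below is hypothetical;
the point is the Scala-version suffix on the jar artifact:

from pyspark.sql import SparkSession

# Pure-Python app that pulls in a Scala-based library by Maven coordinate.
spark = (
    SparkSession.builder
    .appName("scala-binary-compat-check")
    # The _2.11 suffix means the jar was compiled against Scala 2.11. If
    # the Spark binary distribution was built with Scala 2.12 instead, the
    # jar can fail to load or throw NoSuchMethodError at runtime, even
    # though the application code itself is pure Python.
    .config("spark.jars.packages", "org.example:somelib_2.11:1.0.0")
    .getOrCreate()
)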

On Fri, Apr 26, 2019 at 9:44 AM Michael Heuer <heue...@gmail.com> wrote:

> This will also cause problems in Conda builds that depend on pyspark
>
> https://anaconda.org/conda-forge/pyspark
>
> and Homebrew builds that depend on apache-spark, as that also uses the
> binary distribution.
>
> https://formulae.brew.sh/formula/apache-spark#default
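>
> A quick way to check which Scala version a given pyspark or apache-spark
> install was built with (note that SparkContext._jvm is an internal py4j
> handle, so treat this as a diagnostic sketch only):
>
> from pyspark.sql import SparkSession
>
> spark = SparkSession.builder.getOrCreate()
> # scala.util.Properties.versionString() reports the Scala version the
> # running Spark JVM was built with, e.g. "version 2.11.12".
> print(spark.sparkContext._jvm.scala.util.Properties.versionString())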
>
> +1 (non-binding) to cutting a 2.4.3 release immediately.
>
>    michael
>
>
> On Apr 26, 2019, at 2:05 AM, Reynold Xin <r...@databricks.com> wrote:
>
> I do feel it'd be better not to switch default Scala versions in a minor
> release. I don't know how much downstream impact this has; .NET for
> Apache Spark is a good data point. Has anybody else hit this issue?
>
>
>
>
> On Thu, Apr 25, 2019 at 11:36 PM, Terry Kim <yumin...@gmail.com> wrote:
>
>> Very much interested in hearing what you folks decide. We currently have
>> a couple of users asking us questions at
>> https://github.com/dotnet/spark/issues.
>>
>> Thanks,
>> Terry
>>
