+1, but adding 2.13 and dropping 2.11 should happen at the same time, so
that we always have a Spark connector available.
—————————————————
Jialin Qiao
Apache IoTDB PMC

Christofer Dutz <christofer.d...@c-ware.de> wrote on Tue, Oct 17, 2023 at 15:36:
>
> Hi all,
>
> while working on the dependency cleanup, I noticed that we were stuck with 
> some pretty old Spark versions in the spark_2.11 module.
> The reason is that the Scala 2.11 releases have been end-of-life for quite 
> some time. Spark now generally works with Scala 2.12 and, since version 
> 3.2, also with 2.13.
>
> We have one reported CVE in the scala_2.11 module that we actually can’t 
> get rid of, as there will never be a release with a fix.
>
> I therefore propose we drop the Scala 2.11 Spark plugin and possibly add a 
> 2.13 version instead.
>
> What do you folks think?
>
> Chris
>
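
For context, swapping the Scala 2.11 module for a 2.13 one is commonly handled in Maven builds with profiles that pin the Scala binary version and a matching Spark version. The sketch below is purely illustrative — the profile ids, property names, and version numbers are assumptions, not IoTDB's actual build configuration:

```xml
<!-- Hypothetical Maven profile sketch for selecting the Scala line.
     Property names and versions are illustrative only. -->
<profiles>
  <profile>
    <id>scala-2.12</id>
    <!-- default profile: build against Scala 2.12 -->
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <scala.binary.version>2.12</scala.binary.version>
      <spark.version>3.2.4</spark.version>
    </properties>
  </profile>
  <profile>
    <id>scala-2.13</id>
    <!-- opt-in profile: build against Scala 2.13 (supported by Spark since 3.2) -->
    <properties>
      <scala.binary.version>2.13</scala.binary.version>
      <spark.version>3.2.4</spark.version>
    </properties>
  </profile>
</profiles>
```

With such a setup, a 2.13 build would be selected at the command line, e.g. `mvn -P scala-2.13 package`.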
