The best option would certainly be to recompile the Spark connector for
MS SQL Server against the Spark 3.0.1/Scala 2.12 dependencies, and just
fix the compiler errors as you go. The code is open source on GitHub
(https://github.com/microsoft/sql-spark-connector). Looks like this
connector is
I would suggest asking Microsoft and Databricks; this forum is for Apache
Spark.
If you are interested, please drop me a note separately, as I'm keen to
understand the issue since we use the same setup.
Ayan
On Mon, 26 Oct 2020 at 11:53 pm, wrote:
Hi,
In a project where I work with Databricks, we use this connector to read/write
data to Azure SQL Database, currently with Spark 2.4.5 and Scala 2.11.
But those setups are getting old. What happens if we update to Spark 3.0.1 or
higher and Scala 2.12?
This connector does not work.
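For context, here is a minimal sketch of how the connector is typically invoked from Spark (server, database, table names, and credentials below are placeholders, not our real setup); this is the code path that stops working once the connector's data source is unavailable for Spark 3.x/Scala 2.12:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("azure-sql-connector-example")
  .getOrCreate()

// Placeholder connection string for an Azure SQL Database
val url = "jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>"

// Read a table through the MS SQL Spark connector
val df = spark.read
  .format("com.microsoft.sqlserver.jdbc.spark")
  .option("url", url)
  .option("dbtable", "dbo.SomeTable")   // placeholder table
  .option("user", "<user>")
  .option("password", "<password>")
  .load()

// Write the result back to another table
df.write
  .format("com.microsoft.sqlserver.jdbc.spark")
  .mode("append")
  .option("url", url)
  .option("dbtable", "dbo.SomeTableCopy")
  .option("user", "<user>")
  .option("password", "<password>")
  .save()
```

With Spark 2.4.5/Scala 2.11 this resolves the `com.microsoft.sqlserver.jdbc.spark` data source from the connector JAR; on Spark 3.0.1/Scala 2.12 the published artifact is not binary-compatible, which is what prompts the recompile-from-source suggestion above.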