So, it turns out this was a Spark version mismatch between the Beam
JobServer and the Spark platform. Thanks for the pointer!
I'm running both Beam and Spark in Docker; the Spark image [0] provides
Spark 3.3.0 with Scala 2.12, but I used
apache/beam_spark_job_server:2.41.0 [1], which was built against a
different Spark version, hence the mismatch.
This suggests you have mixed two versions of Spark libraries. Did you
perhaps package Spark itself in your Spark app?
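For anyone landing on this thread later: the fix that follows from the above is to pin the standalone Spark image to the same Spark (and Scala) line the job server release was built against, rather than pulling the newest Spark tag. A minimal docker-compose sketch under that assumption; the bitnami/spark image, the 3.1.3 tag, and the exposed ports are illustrative placeholders, not values from this thread, so check the Beam release notes for the Spark version your job server actually targets:

```yaml
# Illustrative compose file -- image names and tags are assumptions.
# Pin the Spark image to the Spark/Scala line your Beam release's
# job server was compiled against (see the Beam release notes).
services:
  spark:
    image: bitnami/spark:3.1.3        # assumed match for Beam 2.41.0; NOT the latest tag
    environment:
      - SPARK_MODE=master
  beam-jobserver:
    image: apache/beam_spark_job_server:2.41.0
    command: ["--spark-master-url=spark://spark:7077"]
    ports:
      - "8099:8099"   # job service endpoint
      - "8098:8098"   # artifact staging endpoint
    depends_on:
      - spark
```

The point of the sketch is only that both containers are pinned together: bumping the Spark image tag without bumping the job server (or vice versa) reintroduces exactly the mismatch described above.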
On Thu, Aug 25, 2022 at 4:56 PM Elliot Metsger wrote:
> Howdy folks,
>
> Relative newbie to Spark, and super new to Beam. (I've asked this
> question on Beam lists, but this seems like a Spark-related issue so I'm
> trying my query here, too). I'm attempting to get a simple Beam pipeline
> (using the Go SDK)