Re: Java object serialization error, java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible

2022-08-26 Thread Elliot Metsger
Yep! Thanks, Robert, for engaging on Slack! Just had to dig in a bit - I
ended up building the 3.3.0 job server from scratch (the Docker build env is
very slick, despite a couple of hiccups), before I realized the image was
already available on Docker Hub :facepalm: ...



Re: Java object serialization error, java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible

2022-08-26 Thread Robert Burke
Woohoo! Glad that was figured out.



RE: Java object serialization error, java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible

2022-08-26 Thread Elliot Metsger
So, it turns out this was a Spark version mismatch between the Beam
JobServer and the Spark platform.

I'm running both Beam and Spark on Docker; the Spark image [0] provides
Spark 3.3.0 with Scala 2.12, but I used
apache/beam_spark_job_server:2.41.0 [1], which bundles Spark 2.4.x
libraries built against Scala 2.11. Instead, I needed the
apache/beam_spark3_job_server:2.41.0 image [2], which provides Spark 3.3.x.

[0] https://hub.docker.com/r/apache/spark/tags
[1] https://hub.docker.com/r/apache/beam_spark_job_server/tags
[2] https://hub.docker.com/r/apache/beam_spark3_job_server/tags
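
For reference, a compose pairing that avoids the mismatch might look
roughly like the sketch below. This is an illustrative assumption, not the
repo's actual docker-compose.yml: the service names, port mappings, and the
--spark-master-url value are made up here, and the Spark master's startup
command is omitted.

```yaml
# Sketch only: pair a Spark 3.3.x image with the Spark-3 flavor of the
# Beam job server so both sides load the same Spark/Scala libraries.
services:
  spark-master:
    image: apache/spark:3.3.0            # Spark 3.3.x / Scala 2.12
    ports:
      - "7077:7077"                      # Spark master RPC (assumed mapping)
  beam-jobserver:
    # Spark-3 flavor of the job server -- NOT apache/beam_spark_job_server,
    # which bundles Spark 2.4.x / Scala 2.11 client libraries.
    image: apache/beam_spark3_job_server:2.41.0
    command: ["--spark-master-url=spark://spark-master:7077"]
    ports:
      - "8099:8099"                      # job endpoint the Go SDK submits to
```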



Java object serialization error, java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible

2022-08-25 Thread Elliot Metsger
Howdy folks, super-new to Beam, and attempting to get a simple example
working with Go, using the portable runner and Spark. There seems to be an
incompatibility between the Java components, and I’m not quite sure where
the disconnect is, but at the root it appears to be a Java
object-serialization incompatibility.

When I submit the job via the Go SDK, it errors out on the Spark side with:

22/08/25 12:45:59 ERROR TransportRequestHandler: Error while
invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException:
org.apache.spark.deploy.ApplicationDescription; local class incompatible:
stream classdesc serialVersionUID = 6543101073799644159, local class
serialVersionUID = 1574364215946805297

I’m using apache/beam_spark_job_server:2.41.0 and apache/spark:latest
(docker-compose[0], hello world wordcount example pipeline[1]).
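
One way to chase a mismatch like this is to compare the serialVersionUID
that each side computes from its own copy of the class: if the two jars
report different values, you've found the mismatched dependency. A minimal
sketch of the technique (the class name and wrapper below are mine, not
from the repo; it defaults to java.util.Date so it runs without the Spark
jars, but you'd point it at org.apache.spark.deploy.ApplicationDescription
on each image's classpath):

```java
import java.io.ObjectStreamClass;

// Print the serialVersionUID the local JVM computes for a Serializable
// class. Run it once against the Spark jars bundled in the job-server
// image and once against the Spark image's jars; differing values mean
// the two sides hold incompatible versions of the class.
public class SerialVersionCheck {
    public static void main(String[] args) throws Exception {
        String name = args.length > 0 ? args[0] : "java.util.Date";
        Class<?> cls = Class.forName(name);
        ObjectStreamClass osc = ObjectStreamClass.lookup(cls);
        if (osc == null) {
            System.out.println(name + " is not Serializable");
        } else {
            System.out.println(name + " serialVersionUID = "
                    + osc.getSerialVersionUID());
        }
    }
}
```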

Any ideas on where to look? It looks like the Beam JobService is using
Java 8 (?) and Spark is using Java 11. I’ve tried downgrading Spark from
3.3.0 to 3.1.3 (the earliest version for which Docker images are
available) and downgrading to Beam 2.40.0, with no luck.

This simple repo[2] should demonstrate the issue.  Any pointers would be
appreciated!

[0]: https://github.com/emetsger/beam-test/blob/develop/docker-compose.yml
[1]:
https://github.com/emetsger/beam-test/blob/develop/debugging_wordcount.go
[2]: https://github.com/emetsger/beam-test