I was able to get it working. The program needed a SparkSession to be instantiated
and to wait for a termination signal from the user. In my case I used a
StreamingContext -
https://spark.apache.org/docs/2.2.0/api/java/org/apache/spark/streaming/StreamingContext.html
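
A minimal sketch of that structure (not the exact code from my repo: the Pub/Sub
subscriber setup is elided behind a comment, and the trivial queue stream is only
a placeholder so that start() is legal on an otherwise empty streaming context):

import java.util.LinkedList;
import java.util.Queue;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class PubsubSparkDriver {
    public static void main(String[] args) {
        // Master is supplied by spark-submit; add .setMaster("local[*]") for a standalone run.
        SparkConf conf = new SparkConf().setAppName("pubsub-spark");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // Register a trivial DStream with an output operation so start() is valid.
        Queue<JavaRDD<String>> queue = new LinkedList<>();
        JavaDStream<String> placeholder = jssc.queueStream(queue);
        placeholder.foreachRDD(rdd -> { });

        // The Pub/Sub subscriber (credentials, subscription, a MessageReceiver that
        // prints each message) would be started here; that code lives in the linked repo.

        jssc.start();            // start the streaming context
        jssc.awaitTermination(); // block the driver until it is asked to terminate
    }
}

The key point is the awaitTermination() call: without it the driver's main method
returns and spark-submit exits before any Pub/Sub callbacks fire.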

Pramod Biligiri

On Sun, Aug 7, 2022 at 9:59 AM Pramod Biligiri <pramodbilig...@gmail.com>
wrote:

> Hi,
>
> I have a simple Java program that reads messages from a Google Cloud
> Pubsub topic and prints them. It works correctly when I run the program
> standalone, but it fails to receive messages when run using spark-submit.
> It connects to the subscription using my authentication credentials, but
> doesn't receive any messages after that.
>
>    1. Do programs launched using spark-submit have to follow a different
>    structure in general? My program doesn't do anything Spark-related as of
>    now, but I'll be adding that later.
>    2. Are there working examples of Spark + Cloud Pubsub integration? I
>    came across a library called Apache Bahir, but is it necessary to use a
>    library like that?
>
> The code for my example can be found here:
> https://github.com/pramodbiligiri/pubsub-spark
>
> Pramod Biligiri
>
