Hi,

On Mon, May 9, 2022 at 5:57 PM Shay Elbaz <shay.el...@gm.com> wrote:

> Hi all,
>
>
>
> I apologize for reposting this from Stack Overflow, but it got very little
> attention and no comments.
>
>
>
> I'm using a Spark 3.2.1 image that was built from the official distribution
> via `docker-image-tool.sh`, on a Kubernetes 1.18 cluster.
>
> Everything works fine, except for this error message on stdout every 90
> seconds:
>

Wild guess: K8S API polling?

https://spark.apache.org/docs/latest/running-on-kubernetes.html#spark-properties

- spark.kubernetes.executor.apiPollingInterval
- spark.kubernetes.executor.missingPodDetectDelta

but for both settings the default is 30s, not 90s.
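One way to test that guess would be to bump both intervals and see whether the
90-second cadence of the warning changes with them. A minimal sketch (the
property names are from the docs linked above; the 120s values are arbitrary,
and the rest of the submit command is elided):

    spark-submit \
      --conf spark.kubernetes.executor.apiPollingInterval=120s \
      --conf spark.kubernetes.executor.missingPodDetectDelta=120s \
      ...

If the warning interval tracks those settings, that would point at the
executor-pod watch; if it stays at 90s, the EOF is probably coming from
somewhere else (e.g. an idle timeout on the watch connection).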



>
>
> WARN WatcherWebSocketListener: Exec Failure
>
> java.io.EOFException
>
>     at okio.RealBufferedSource.require(RealBufferedSource.java:61)
>
>     at okio.RealBufferedSource.readByte(RealBufferedSource.java:74)
>
>     at
> okhttp3.internal.ws.WebSocketReader.readHeader(WebSocketReader.java:117)
>
>     at
> okhttp3.internal.ws.WebSocketReader.processNextFrame(WebSocketReader.java:101)
>
>     at okhttp3.internal.ws.RealWebSocket.loopReader(RealWebSocket.java:274)
>
>     at
> okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:214)
>
>     at okhttp3.RealCall$AsyncCall.execute(RealCall.java:203)
>
>     at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
>
>     at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>
>     at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>
>     at java.lang.Thread.run(Thread.java:748)
>
>
>
> This message does not affect the application, but it's really annoying,
> especially for Jupyter users. The lack of detail makes it very hard to
> debug.
>
> It appears with every submit variant - spark-submit, pyspark, spark-shell.
>
> I've found traces of it on the internet, but all occurrences were from
> older versions of Spark and were resolved by upgrading to a newer version
> of the fabric8 client (4.x).
>
> Spark 3.2.1 already uses fabric8 version 5.4.1.
>
> I wonder if anyone else still sees this error in Spark 3.x, and has a
> resolution.
>
>
>
> Thanks,
>
> Shay.
>
