Re: Spark 3 pod template for the driver

2020-06-29 Thread edeesis
If I could hazard a guess, you still need to specify the executor image. As
is, this will only set the driver image.

You can specify it with --conf spark.kubernetes.container.image or --conf
spark.kubernetes.executor.container.image.
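
For completeness, a rough spark-submit sketch showing both together with a
driver pod template; the master URL, image names, template path, and example
class are placeholders, not from this thread:

  # placeholders: adjust master URL, image, and template path for your cluster
  spark-submit \
    --master k8s://https://<k8s-apiserver>:443 \
    --deploy-mode cluster \
    --name spark-pi \
    --conf spark.kubernetes.driver.podTemplateFile=/path/to/driver-template.yaml \
    --conf spark.kubernetes.container.image=myrepo/spark:3.0.0 \
    --conf spark.kubernetes.executor.container.image=myrepo/spark:3.0.0 \
    --class org.apache.spark.examples.SparkPi \
    local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0.jar

Note that spark.kubernetes.container.image sets the default for both driver
and executor, and the driver/executor-specific properties override it.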






Re: Getting the ball started on a 2.4.6 release

2020-04-24 Thread edeesis
Yes, watching the pod yaml could work for this. I just need to set something
up to do that; thanks for cluing me in.
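
A minimal sketch of what that watch could look like; the label selector is an
assumption based on the spark-role label the K8s backend puts on its pods:

  # stream executor pod specs as they change (spark-role label assumed)
  kubectl get pods -l spark-role=executor -o yaml --watch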

And sounds great re: Spark 2.5. Having a transitional release makes sense, I
think.






Re: Getting the ball started on a 2.4.6 release

2020-04-23 Thread edeesis
There's other information you can obtain from the pod metadata via a describe
beyond what's in the logs, which are typically just what the application
itself prints.
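
For example (the pod name is a placeholder and the exact output varies by
cluster), a describe surfaces the container's last state, exit code, and
events, none of which show up in the executor log:

  kubectl describe pod <executor-pod-name>
  # relevant output looks roughly like:
  #   Last State:     Terminated
  #     Reason:       OOMKilled
  #     Exit Code:    137
  #   Events:
  #     ... scheduling, eviction, and kill messages ...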

I've also found that Spark has some trouble obtaining the reason for a K8s
executor death (as evidenced by the
spark.kubernetes.executor.lostCheck.maxAttempts config property).

I admittedly don't know what should qualify for a backport, but considering
3.0 is a major upgrade (Scala version, et al.), is there any room for being
more generous with backporting to 2.4?






Re: Getting the ball started on a 2.4.6 release

2020-04-21 Thread edeesis
I'd like to advocate for:

https://issues.apache.org/jira/browse/SPARK-25515
and
https://issues.apache.org/jira/browse/SPARK-29865

Two small QOL changes that make production use of Spark with Kubernetes much
easier.


