RE: Conda Python Env in K8S

2021-12-03 Thread Bode, Meikel, NMA-CFD
Hi Mich, sure, that's possible. But distributing the complete env would be more practical. Our current workaround is that we build different environments and store them on a PV, then mount it into the pods and refer to the desired env from the SparkApplication resource. But actually
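The PV-based workaround described above might look roughly like this in a SparkApplication manifest (a hedged sketch: the PVC name `conda-envs-pvc`, the mount path `/opt/envs`, and the env name `myenv` are illustrative assumptions, not details from the thread):

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: pyspark-conda-pv-example   # illustrative name
spec:
  type: Python
  mode: cluster
  sparkConf:
    # point the interpreter at an env mounted from the PV (assumed path)
    "spark.pyspark.python": "/opt/envs/myenv/bin/python"
  volumes:
    - name: conda-envs
      persistentVolumeClaim:
        claimName: conda-envs-pvc   # assumed PVC holding the prebuilt envs
  driver:
    volumeMounts:
      - name: conda-envs
        mountPath: /opt/envs
  executor:
    volumeMounts:
      - name: conda-envs
        mountPath: /opt/envs
```

Mounting the same volume under both `driver:` and `executor:` is what makes the env visible on every pod, which is the point of this workaround.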

Re: Conda Python Env in K8S

2021-12-03 Thread Mich Talebzadeh
Build the Python packages into the Docker image itself with pip install, e.g. RUN pip install pandas ... --no-cache-dir HTH On Fri, 3 Dec 2021 at 11:58, Bode, Meikel, NMA-CFD < meikel.b...@bertelsmann.de> wrote: > Hello, > > > > I am trying to run Spark jobs using the Spark Kubernetes Operator. > > But when
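Mich's suggestion of baking the dependencies into the image could be sketched as a Dockerfile fragment (hedged: the base image tag and package list are assumptions; note that pip's cache-disabling flag is spelled `--no-cache-dir`):

```dockerfile
# start from the Spark Python image the cluster already uses (tag is an assumption)
FROM apache/spark-py:v3.2.0

# install the Python dependencies into the image itself, skipping pip's
# download cache so the layer stays small; list your real requirements here
RUN pip install --no-cache-dir pandas numpy
```

Because the same image is used for driver and executor pods, this sidesteps the problem of shipping an environment at submit time entirely.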

Re: Exploding huge array elements in spark

2021-12-03 Thread Gourav Sengupta
Hi Srikanth, what Spark version are you using? Can you share the data dictionary and the primary key? Also, if possible, the data volumes that you are dealing with? Thanks and Regards, Gourav Sengupta On Thu, Dec 2, 2021 at 4:33 PM Shrikanth J R wrote: > Hi, > > I am facing an issue when

Conda Python Env in K8S

2021-12-03 Thread Bode, Meikel, NMA-CFD
Hello, I am trying to run Spark jobs using the Spark Kubernetes Operator. But when I try to bundle a conda Python environment using the following resource description, the Python interpreter is only unpacked on the driver and not on the executors. apiVersion: "sparkoperator.k8s.io/v1beta2" kind:
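One documented way to get a bundled env unpacked on the executors as well is Spark's `spark.archives` option (Spark 3.1+), where the tarball is typically produced beforehand with conda-pack (`conda pack -o pyspark_conda_env.tar.gz`). A hedged sketch of how that could be wired into a SparkApplication; the archive URI, the `#environment` alias, and the manifest name are illustrative assumptions, not taken from this truncated message:

```yaml
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: pyspark-conda-archive-example   # illustrative name
spec:
  type: Python
  mode: cluster
  sparkConf:
    # spark.archives (Spark 3.1+) distributes and unpacks the archive on the
    # driver AND the executors; the URI below is an assumed location
    "spark.archives": "s3a://my-bucket/pyspark_conda_env.tar.gz#environment"
  driver:
    envVars:
      # use the interpreter from the unpacked archive (alias 'environment')
      PYSPARK_PYTHON: "./environment/bin/python"
  executor:
    envVars:
      PYSPARK_PYTHON: "./environment/bin/python"
```

The `#environment` suffix names the directory the archive is unpacked into on each pod, which is why `PYSPARK_PYTHON` can use a relative path.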