Re: Spark 3.2.1 in Google Kubernetes Version 1.19 or 1.21 - SparkSubmit Error

2022-02-18 Thread Mich Talebzadeh
Hi, I need to arrange a class for members using GCP with Dataproc or GCP with Kubernetes, I think. OK, it is good practice to create a namespace "spark" for this purpose rather than using the default namespace: kubectl create namespace spark. Tell me exactly what you are trying to do. Are you running
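A minimal sketch of how that namespace suggestion would look end to end, assuming a typical spark-submit against GKE in cluster mode; the master URL and container image below are placeholders, not values from this thread:

# Create a dedicated namespace instead of using "default"
kubectl create namespace spark

# Run the driver and executors in that namespace (placeholders marked with <>)
spark-submit \
  --master k8s://https://<GKE-MASTER-IP>:443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.namespace=spark \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar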

Re: Spark 3.2.1 in Google Kubernetes Version 1.19 or 1.21 - SparkSubmit Error

2022-02-18 Thread Gnana Kumar
Hi Mich, I'm running Spark on the GCP platform and this is the error: Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: Operation: [create] for kind: [Pod] with name: [null] in namespace: [default] failed. Thanks, GK
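The "name: [null]" part of that exception means the client never got a usable driver pod name. One thing worth double-checking (an assumption on my part, not something confirmed in the thread) is that --name is set, or that the driver pod name is given explicitly; the values below are placeholders:

# The driver pod name is normally derived from --name in cluster mode;
# setting spark.kubernetes.driver.pod.name explicitly is a cheap way to rule this out
spark-submit \
  --master k8s://https://<GKE-MASTER-IP>:443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.driver.pod.name=spark-pi-driver \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar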

Re: Spark 3.2.1 in Google Kubernetes Version 1.19 or 1.21 - SparkSubmit Error

2022-02-17 Thread Mich Talebzadeh
Just create a directory as below on a GCP storage bucket: CODE_DIRECTORY_CLOUD="gs://spark-on-k8s/codes/" Put your jar file there: gsutil cp /opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar $CODE_DIRECTORY_CLOUD --conf spark.kubernetes.file.upload.path=file:///tmp \
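A rough sketch of how those pieces could fit into one submission, reusing the bucket name from the message above; the master URL, image and service account are placeholders, and reading the gs:// JAR from the driver assumes the GCS connector is available in the Spark image:

# Stage the example JAR on the bucket so every node can reach it
CODE_DIRECTORY_CLOUD="gs://spark-on-k8s/codes/"
gsutil cp /opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar $CODE_DIRECTORY_CLOUD

# Submit against the staged copy (placeholders marked with <>)
spark-submit \
  --master k8s://https://<GKE-MASTER-IP>:443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.container.image=<your-spark-image> \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  --conf spark.kubernetes.file.upload.path=file:///tmp \
  ${CODE_DIRECTORY_CLOUD}spark-examples_2.12-3.2.1.jar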

Re: Spark 3.2.1 in Google Kubernetes Version 1.19 or 1.21 - SparkSubmit Error

2022-02-17 Thread Gnana Kumar
Though I have created the Kubernetes RBAC as per the Spark site in my GKE cluster, I'm getting the POD NAME null error. kubectl create serviceaccount spark; kubectl create clusterrolebinding spark-role --clusterrole=edit --serviceaccount=default:spark --namespace=default
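For what it's worth, the service account created above usually also has to be named at submit time so the driver actually runs under it; the conf shown in the comment is the standard one from the Spark on Kubernetes docs, and the default namespace is taken from the commands in the message:

# Grant the "spark" service account edit rights, as in the message above
kubectl create serviceaccount spark --namespace=default
kubectl create clusterrolebinding spark-role --clusterrole=edit \
  --serviceaccount=default:spark --namespace=default

# Then, at submit time, make the driver pod use that service account:
#   --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark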

Re: Spark 3.2.1 in Google Kubernetes Version 1.19 or 1.21 - SparkSubmit Error

2022-02-17 Thread Gnana Kumar
Hi Mich, this is the latest error I'm stuck with. Please help me resolve this issue. Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: Operation: [create] for kind: [Pod] with name: [null] in namespace: [default] failed.

Re: Spark 3.2.1 in Google Kubernetes Version 1.19 or 1.21 - SparkSubmit Error

2022-02-17 Thread Mich Talebzadeh
Hi Gnana, That JAR file, /home/gnana_kumar123/spark/spark-3.2.1-bin-hadoop3.2/examples/jars/spark-examples_2.12-3.2.1.jar, is not visible to the GKE cluster in a way that all nodes can read it. I suggest that you put it on a gs:// bucket in GCP and access it from there. HTH
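In concrete terms, that suggestion amounts to staging the JAR on a bucket and pointing spark-submit at the gs:// URI instead of the local home-directory path; the bucket name below is reused from earlier in the thread, and fetching gs:// URIs assumes the GCS connector is on the Spark image:

# Copy the local JAR to a bucket that every node (and the driver pod) can read
gsutil cp /home/gnana_kumar123/spark/spark-3.2.1-bin-hadoop3.2/examples/jars/spark-examples_2.12-3.2.1.jar gs://spark-on-k8s/codes/
# Confirm it is there
gsutil ls gs://spark-on-k8s/codes/
# Then reference gs://spark-on-k8s/codes/spark-examples_2.12-3.2.1.jar as the application JAR in spark-submit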