[ https://issues.apache.org/jira/browse/SPARK-26742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16786543#comment-16786543 ]

Stavros Kontopoulos edited comment on SPARK-26742 at 3/7/19 9:47 AM:
---------------------------------------------------------------------

You need a Spark distribution tarball to pass to `--spark-tgz`.
This is one way to run things:
{noformat}
./dev/make-distribution.sh --name test --r --tgz -Psparkr -Phadoop-2.7 -Pkubernetes -Phive
tar -zxvf spark-3.0.0-SNAPSHOT-bin-test.tgz
cd spark-3.0.0-SNAPSHOT-bin-test
./bin/docker-image-tool.sh -n -p $(pwd)/kubernetes/dockerfiles/spark/bindings/python/Dockerfile \
  -R $(pwd)/kubernetes/dockerfiles/spark/bindings/R/Dockerfile \
  -r $DOCKER_USERNAME -t $SPARK_K8S_IMAGE_TAG build
# Push either to a local registry and start minikube with it, or run eval $(minikube docker-env)
# and then use the minikube docker daemon, or push to an external registry.

./bin/docker-image-tool.sh -n -p $(pwd)/kubernetes/dockerfiles/spark/bindings/python/Dockerfile \
  -R $(pwd)/kubernetes/dockerfiles/spark/bindings/R/Dockerfile \
  -r $DOCKER_USERNAME -t $SPARK_K8S_IMAGE_TAG push
cd ../resource-managers/kubernetes/integration-tests
kubectl create -f dev/spark-rbac.yaml
./dev/dev-run-integration-tests.sh --service-account spark-sa --namespace spark \
  --image-tag $SPARK_K8S_IMAGE_TAG --spark-tgz $TGZ_PATH --image-repo $DOCKER_USERNAME
{noformat}
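For reference, the commands above rely on a few environment variables and on the RBAC objects created by dev/spark-rbac.yaml. Below is a minimal sketch of those pieces; the example values and the clusterrolebinding name are illustrative assumptions of mine, not taken from the repo, and the YAML file remains the authoritative definition of the namespace, service account, and permissions:
{noformat}
# Illustrative values; substitute your own registry user/repo, image tag, and tarball path.
export DOCKER_USERNAME=my-registry-user
export SPARK_K8S_IMAGE_TAG=dev
export TGZ_PATH=/absolute/path/to/spark-3.0.0-SNAPSHOT-bin-test.tgz

# Roughly what dev/spark-rbac.yaml provides: a namespace, a service account,
# and a binding that lets the test driver manage resources in that namespace.
kubectl create namespace spark
kubectl create serviceaccount spark-sa --namespace spark
kubectl create clusterrolebinding spark-sa-edit --clusterrole=edit \
  --serviceaccount=spark:spark-sa
{noformat}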



> Bump Kubernetes Client Version to 4.1.2
> ---------------------------------------
>
>                 Key: SPARK-26742
>                 URL: https://issues.apache.org/jira/browse/SPARK-26742
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: Kubernetes
>    Affects Versions: 2.4.0, 3.0.0
>            Reporter: Steve Davids
>            Priority: Major
>              Labels: easyfix
>             Fix For: 3.0.0
>
>
> Spark 2.x is using Kubernetes Client 3.x, which is quite old; the master
> branch has 4.0. The client should be upgraded to 4.1.1 to have the broadest
> Kubernetes compatibility support for newer clusters:
> https://github.com/fabric8io/kubernetes-client#compatibility-matrix
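To confirm which fabric8 kubernetes-client version a given checkout actually resolves, something along these lines should work (a sketch; it assumes the module still lives under resource-managers/kubernetes/core):
{noformat}
# List the kubernetes-client version resolved for the Spark Kubernetes module.
./build/mvn -Pkubernetes dependency:tree -pl resource-managers/kubernetes/core | grep 'io.fabric8:kubernetes-client'
{noformat}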


