[ https://issues.apache.org/jira/browse/SPARK-26742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16786543#comment-16786543 ]

Stavros Kontopoulos edited comment on SPARK-26742 at 3/7/19 10:11 AM:
----------------------------------------------------------------------

You need a Spark distribution to pass to `--spark-tgz`.
This is one way to run things:
{noformat}
./dev/make-distribution.sh --name test --r --tgz -Psparkr -Phadoop-2.7 -Pkubernetes -Phive
tar -zxvf spark-3.0.0-SNAPSHOT-bin-test.tgz
TGZ_PATH=$(pwd)/spark-3.0.0-SNAPSHOT-bin-test.tgz
cd spark-3.0.0-SNAPSHOT-bin-test
./bin/docker-image-tool.sh -n \
  -p $(pwd)/kubernetes/dockerfiles/spark/bindings/python/Dockerfile \
  -R $(pwd)/kubernetes/dockerfiles/spark/bindings/R/Dockerfile \
  -r $DOCKER_REPO -t $SPARK_K8S_IMAGE_TAG build

# Either push to a local registry and start minikube with it, or run $(minikube docker-env)
# and then use minikube's Docker daemon, or push to an external registry as follows:
./bin/docker-image-tool.sh -n \
  -p $(pwd)/kubernetes/dockerfiles/spark/bindings/python/Dockerfile \
  -R $(pwd)/kubernetes/dockerfiles/spark/bindings/R/Dockerfile \
  -r $DOCKER_REPO -t $SPARK_K8S_IMAGE_TAG push

cd ../resource-managers/kubernetes/integration-tests
kubectl create -f dev/spark-rbac.yaml
./dev/dev-run-integration-tests.sh --service-account spark-sa --namespace spark \
  --image-tag $SPARK_K8S_IMAGE_TAG --spark-tgz $TGZ_PATH --image-repo $DOCKER_REPO
{noformat}
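If you go the minikube-daemon route mentioned in the comments above, the variant might look like this (a sketch, assuming minikube is already running; repo and tag variables are the same illustrative ones as above, and no push step is needed):

{noformat}
# Point the local docker CLI at minikube's Docker daemon so that images
# built here are immediately visible to pods in the minikube cluster.
eval $(minikube docker-env)

# Build the JVM, Python, and R images directly into minikube's daemon.
./bin/docker-image-tool.sh -n \
  -p $(pwd)/kubernetes/dockerfiles/spark/bindings/python/Dockerfile \
  -R $(pwd)/kubernetes/dockerfiles/spark/bindings/R/Dockerfile \
  -r $DOCKER_REPO -t $SPARK_K8S_IMAGE_TAG build
{noformat}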

Options for running the tests are listed [here|https://github.com/apache/spark/blob/master/resource-managers/kubernetes/integration-tests/scripts/setup-integration-test-env.sh#L101].

You can even pass --deploy-mode minikube without an image tag, and the script will take care of the rest for you.
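That shortcut might look like this (a sketch; the service account and namespace match the earlier commands, and the script derives the image details itself):

{noformat}
cd resource-managers/kubernetes/integration-tests
# No --image-tag / --image-repo needed; the setup script handles images
# when deploying against minikube.
./dev/dev-run-integration-tests.sh --deploy-mode minikube \
  --service-account spark-sa --namespace spark --spark-tgz $TGZ_PATH
{noformat}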



> Bump Kubernetes Client Version to 4.1.2
> ---------------------------------------
>
>                 Key: SPARK-26742
>                 URL: https://issues.apache.org/jira/browse/SPARK-26742
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: Kubernetes
>    Affects Versions: 2.4.0, 3.0.0
>            Reporter: Steve Davids
>            Priority: Major
>              Labels: easyfix
>             Fix For: 3.0.0
>
>
> Spark 2.x is using Kubernetes Client 3.x which is pretty old, the master 
> branch has 4.0, the client should be upgraded to 4.1.1 to have the broadest 
> Kubernetes compatibility support for newer clusters: 
> https://github.com/fabric8io/kubernetes-client#compatibility-matrix


