[ 
https://issues.apache.org/jira/browse/SPARK-34738?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17312517#comment-17312517
 ] 

Shane Knapp edited comment on SPARK-34738 at 3/31/21, 4:20 PM:
---------------------------------------------------------------

alright, sometimes these things go smoothly, sometimes not.

this is firmly in the 'not' camp.

after upgrading minikube and k8s, i was unable to mount a persistent volume 
when using the kvm2 driver.  much debugging ensued.  no progress was made.

so, i decided to randomly try the docker minikube driver.  voila!  i'm now able 
to happily mount persistent volumes.

however, when running the k8s integration test, everything passes *except* the 
PVs w/local storage.

from [https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-k8s-clone]:
{code:java}
- PVs with local storage *** FAILED ***
  The code passed to eventually never returned normally. Attempted 179 times over 3.002424470466666 minutes. Last failure message: container not found ("spark-kubernetes-driver"). (PVTestsSuite.scala:117)
{code}
i've never seen this error before, and apparently there isn't much out there about it, either.

here's how we launch minikube and create the mount:
{code:java}
minikube --vm-driver=docker start --memory 6000 --cpus 8
minikube mount ${PVC_TESTS_HOST_PATH}:${PVC_TESTS_VM_PATH} --9p-version=9p2000.L &
{code}
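as a sanity check (just a sketch, assuming the same shell variables as above), the 9p mount can be verified from inside the minikube node:
{code:bash}
# rough sanity check that the 9p mount is actually visible inside the node;
# PVC_TESTS_VM_PATH is the same variable used in the mount command above
minikube ssh "mount | grep 9p"
minikube ssh "ls -la ${PVC_TESTS_VM_PATH}"
{code}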
we're using ZFS on the bare metal, and minikube is complaining:
{code:java}
! docker is currently using the zfs storage driver, consider switching to overlay2 for better performance
{code}
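not a known fix for the PV test, but for completeness, a rough sketch of the switch minikube is hinting at (this assumes daemon.json holds nothing else, and whether overlay2 is even viable on a ZFS-backed /var/lib/docker depends on the kernel and zfs versions):
{code:bash}
# hedged sketch only: changing the docker storage driver hides existing
# images/containers until you switch back, so this is disruptive on a CI worker
docker info --format '{{.Driver}}'      # currently prints: zfs
sudo tee /etc/docker/daemon.json <<'EOF'
{
  "storage-driver": "overlay2"
}
EOF
sudo systemctl restart docker
docker info --format '{{.Driver}}'      # should now print: overlay2
{code}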
i'll continue to dig into this today, but i'm currently blocked...


> Upgrade Minikube and kubernetes cluster version on Jenkins
> ----------------------------------------------------------
>
>                 Key: SPARK-34738
>                 URL: https://issues.apache.org/jira/browse/SPARK-34738
>             Project: Spark
>          Issue Type: Task
>          Components: jenkins, Kubernetes
>    Affects Versions: 3.2.0
>            Reporter: Attila Zsolt Piros
>            Assignee: Shane Knapp
>            Priority: Major
>
> [~shaneknapp], as we discussed [on the mailing 
> list|http://apache-spark-developers-list.1001551.n3.nabble.com/minikube-and-kubernetes-cluster-versions-for-integration-testing-td30856.html],
> Minikube can be upgraded to the latest release (v1.18.1) and the kubernetes 
> version should be set to v1.17.3 (`minikube config set kubernetes-version v1.17.3`).
> [Here|https://github.com/apache/spark/pull/31829] is my PR, which uses a new 
> method to configure the kubernetes client. It would be great if you could use 
> it for testing on Jenkins once the Minikube version is updated. Thanks in advance.
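for the record, a minimal sketch of the upgrade described above; the download URL follows minikube's usual release layout, and the install path is an assumption about the Jenkins workers rather than something from this thread:
{code:bash}
# sketch of the minikube v1.18.1 + kubernetes v1.17.3 setup described above
curl -LO https://storage.googleapis.com/minikube/releases/v1.18.1/minikube-linux-amd64
sudo install minikube-linux-amd64 /usr/local/bin/minikube
minikube config set kubernetes-version v1.17.3
minikube version   # confirm v1.18.1
{code}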


