dongjoon-hyun commented on a change in pull request #25748: [SPARK-28904][KUBERNETES] Create mount for PVTestsSuite
URL: https://github.com/apache/spark/pull/25748#discussion_r324375006
 
 

 ##########
 File path: resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PVTestsSuite.scala
 ##########
 @@ -42,23 +48,14 @@ private[spark] trait PVTestsSuite { k8sSuite: KubernetesSuite =>
       .withKind("PersistentVolume")
       .withApiVersion("v1")
       .withNewMetadata()
-        .withName("test-local-pv")
+        .withName(PV_NAME)
       .endMetadata()
       .withNewSpec()
        .withCapacity(Map("storage" -> new QuantityBuilder().withAmount("1Gi").build()).asJava)
         .withAccessModes("ReadWriteOnce")
         .withPersistentVolumeReclaimPolicy("Retain")
-        .withStorageClassName("test-local-storage")
-        .withLocal(new LocalVolumeSourceBuilder().withPath(VM_PATH).build())
-          .withNewNodeAffinity()
-            .withNewRequired()
-              .withNodeSelectorTerms(new NodeSelectorTermBuilder()
-                .withMatchExpressions(new NodeSelectorRequirementBuilder()
-                  .withKey("kubernetes.io/hostname")
-                  .withOperator("In")
-                  .withValues("minikube", "docker-for-desktop", "docker-desktop").build()).build())
 
 Review comment:
   It seems that I wasn't clear enough.
   
   What I meant above is that the following claim is wrong:
   > I'm saying this test is already excluded from non-minikube tests
   
   The following is the result of running the `DockerDesktop` backend on the master branch as of today. Please try it in your environment.
   
   ```
   $ kubectl cluster-info
   Kubernetes master is running at https://kubernetes.docker.internal:6443
   KubeDNS is running at https://kubernetes.docker.internal:6443/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy
   
   $ ./dev/make-distribution.sh --pip --tgz -Phadoop-2.7 -Pkubernetes
   
   $ resource-managers/kubernetes/integration-tests/dev/dev-run-integration-tests.sh --deploy-mode docker-for-desktop --spark-tgz $PWD/spark-*.tgz
   Run starting. Expected test count is: 18
   KubernetesSuite:
   - Run SparkPi with no resources
   - Run SparkPi with a very long application name.
   - Use SparkLauncher.NO_RESOURCE
   - Run SparkPi with a master URL without a scheme.
   - Run SparkPi with an argument.
   - Run SparkPi with custom labels, annotations, and environment variables.
   - All pods have the same service account by default
   - Run extraJVMOptions check on driver
   - Run SparkRemoteFileTest using a remote data file
   - Run SparkPi with env and mount secrets.
   - Run PySpark on simple pi.py example
   - Run PySpark with Python2 to test a pyfiles example
   - Run PySpark with Python3 to test a pyfiles example
   - Run PySpark with memory customization
   - Run in client mode.
   - Start pod creation from template
   - PVs with local storage
   - Launcher client dependencies *** FAILED ***
   Tests: succeeded 17, failed 1, canceled 0, ignored 0, pending 0
   *** 1 TEST FAILED ***
   ```
   
   I didn't test this on this PR's branch. Please try running it; that was my initial request, @holdenk.
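   
   As an aside, one common way a suite can be limited to a specific backend is ScalaTest tagging; a minimal sketch follows (the tag name and wiring are hypothetical, not Spark's actual exclusion mechanism):
   
   ```scala
   import org.scalatest.Tag
   import org.scalatest.funsuite.AnyFunSuite
   
   // Hypothetical tag; illustrative only.
   object MinikubeOnlyTag extends Tag("org.example.MinikubeOnly")
   
   class TaggedExampleSuite extends AnyFunSuite {
     // A runner can exclude this test via its tag, e.g. with
     // `testOnly -- -l org.example.MinikubeOnly` in sbt.
     test("PVs with local storage", MinikubeOnlyTag) {
       assert(true)
     }
   }
   ```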
