[ https://issues.apache.org/jira/browse/SPARK-34732?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-34732.
-----------------------------------
    Fix Version/s: 3.2.0
       Resolution: Fixed

Issue resolved by pull request 31824
[https://github.com/apache/spark/pull/31824]

> The logForFailedTest throws an exception when driver is not started
> -------------------------------------------------------------------
>
>                 Key: SPARK-34732
>                 URL: https://issues.apache.org/jira/browse/SPARK-34732
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Tests
>    Affects Versions: 3.2.0
>            Reporter: Attila Zsolt Piros
>            Assignee: Attila Zsolt Piros
>            Priority: Minor
>             Fix For: 3.2.0
>
>
> In SPARK-34426 the logForFailedTest method was introduced to add the driver
> and executor logs to integration-tests.log. However, when the driver fails to
> start, an exception is thrown: the list of pods is empty, yet we still try to
> access its first item:
> {noformat}
> - PVs with local storage *** FAILED ***
>   java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
>   at java.util.ArrayList.rangeCheck(ArrayList.java:659)
>   at java.util.ArrayList.get(ArrayList.java:435)
>   at org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite.logForFailedTest(KubernetesSuite.scala:83)
>   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:181)
>   at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
>   at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
>   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>   at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
>   at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
>   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:61)
> {noformat}
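> A minimal sketch of the kind of guard that avoids this failure, assuming a
> hypothetical helper (logDriverIfStarted) and an illustrative pod-list
> parameter rather than the actual KubernetesSuite code:
> {noformat}
> // Hypothetical sketch: only read the first pod when the list returned by the
> // Kubernetes client is non-empty; otherwise skip driver log collection.
> // Names (driverPods, logPodLogs) are illustrative, not the actual test code.
> def logDriverIfStarted[P](driverPods: java.util.List[P])(logPodLogs: P => Unit): Unit = {
>   if (!driverPods.isEmpty) {
>     logPodLogs(driverPods.get(0)) // safe: the list is known to be non-empty
>   } else {
>     // The driver never started, so there are no driver logs to collect.
>     println("No driver pod found; skipping driver log collection")
>   }
> }
> {noformat}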