[ https://issues.apache.org/jira/browse/HUDI-1211?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17370315#comment-17370315 ]
Raymond Xu commented on HUDI-1211:
----------------------------------

[~shivnarayan] is this still an issue? I couldn't reproduce it locally.

> Test failures w/ some index tests (TestHoodieIndex)
> ---------------------------------------------------
>
>                 Key: HUDI-1211
>                 URL: https://issues.apache.org/jira/browse/HUDI-1211
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: Testing
>    Affects Versions: 0.8.0
>            Reporter: sivabalan narayanan
>            Assignee: Raymond Xu
>            Priority: Major
>              Labels: pull-request-available, sev:high
>
> When I was running the tests for the whole index package, one test failed with a
> connection issue, after which all subsequent tests failed as well. Once the
> connection issue starts, I see the error below for each successive test.
>
> Command: "mvn '-Dtest=org.apache.hudi.index.**' -DfailIfNoTests=false test"
>
> <<< ERROR!
> org.apache.spark.SparkException:
> Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore
> this error, set spark.driver.allowMultipleContexts = true. The currently
> running SparkContext was created at:
> org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
> org.apache.hudi.testutils.FunctionalTestHarness.runBeforeEach(FunctionalTestHarness.java:132)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> Here is the full stack trace:
> [https://gist.github.com/nsivabalan/0643202bf1af2d85cdc0d35dd6a68d36]

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
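For context on the error above: the workaround it mentions is a Spark configuration flag, not a fix in Hudi. A minimal sketch of setting it on a `SparkConf` (the names `local[2]` and `hudi-index-tests` are illustrative, not from this ticket; note that allowing multiple contexts generally just masks a SparkContext leaked by a test harness that never calls `stop()`):

```java
import org.apache.spark.SparkConf;

// Sketch only, not the Hudi fix: this is the escape hatch named in the
// SparkException message. The cleaner remedy is stopping the shared
// SparkContext in test teardown so the next test's getOrCreate() builds
// a fresh one.
SparkConf conf = new SparkConf()
    .setMaster("local[2]")              // illustrative local master
    .setAppName("hudi-index-tests")     // illustrative app name
    .set("spark.driver.allowMultipleContexts", "true"); // hides leaks
```

In a JUnit-style harness such as `FunctionalTestHarness`, the usual alternative is an after-each hook that stops the context created in `runBeforeEach`, which avoids the "Only one SparkContext" failure without this flag.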