xubo245 opened a new issue, #4316:
URL: https://github.com/apache/carbondata/issues/4316

   There are some errors when running the test cases with Spark 2.3:
   ```
   - Test restructured array<timestamp> as index column on SI with compaction
   2023-04-10 03:13:52 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:13:53 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - Test restructured array<string> and string columns as index columns on SI with compaction
   2023-04-10 03:13:56 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:13:56 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - test array<string> on secondary index with compaction
   2023-04-10 03:14:00 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:14:00 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - test array<string> and string as index columns on secondary index with compaction
   - test load data with array<string> on secondary index
   - test SI global sort with si segment merge enabled for complex data types
   - test SI global sort with si segment merge enabled for newly added complex column
   - test SI global sort with si segment merge enabled for primitive data types
   - test SI global sort with si segment merge complex data types by rebuild command
   - test SI global sort with si segment merge primitive data types by rebuild command
   - test si creation with struct and map type
   - test si creation with array
   2023-04-10 03:14:26 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:14:26 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - test complex with null and empty data
   - test array<date> on secondary index
   - test array<timestamp> on secondary index
   2023-04-10 03:14:31 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:14:31 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - test array<varchar> and varchar as index columns on secondary index
   2023-04-10 03:14:34 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:14:34 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - test multiple SI with array and primitive type
   2023-04-10 03:14:40 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   2023-04-10 03:14:40 ERROR CarbonInternalMetastore$:254 - Adding/Modifying tableProperties operation failed: Recursive load
   - test SI complex with multiple array contains
   TestCarbonInternalMetastore:
   - test delete index silent
   2023-04-10 03:14:43 ERROR CarbonInternalMetastore$:118 - Exception occurred while drop index table for : Some(test).unknown : Table or view 'unknown' not found in database 'test';
   2023-04-10 03:14:43 ERROR CarbonInternalMetastore$:131 - Exception occurred while drop index table for : Some(test).index1 : Table or view 'index1' not found in database 'test';
   - test delete index table silently when exception occur
   org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'index1' not found in database 'test';
        at org.apache.spark.sql.hive.client.HiveClient$$anonfun$getTable$1.apply(HiveClient.scala:81)
        at org.apache.spark.sql.hive.client.HiveClient$$anonfun$getTable$1.apply(HiveClient.scala:81)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.hive.client.HiveClient$class.getTable(HiveClient.scala:81)
        at org.apache.spark.sql.hive.client.HiveClientImpl.getTable(HiveClientImpl.scala:83)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getRawTable$1.apply(HiveExternalCatalog.scala:118)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getRawTable$1.apply(HiveExternalCatalog.scala:118)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
        at org.apache.spark.sql.hive.HiveExternalCatalog.getRawTable(HiveExternalCatalog.scala:117)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:684)
        at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:684)
        at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
        at org.apache.spark.sql.hive.HiveExternalCatalog.getTable(HiveExternalCatalog.scala:683)
        at org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupRelation(SessionCatalog.scala:674)
        at org.apache.spark.sql.hive.CarbonFileMetastore.lookupRelation(CarbonFileMetastore.scala:197)
        at org.apache.spark.sql.hive.CarbonFileMetastore.lookupRelation(CarbonFileMetastore.scala:191)
        at org.apache.spark.sql.secondaryindex.events.SIDropEventListener$$anonfun$onEvent$1.apply(SIDropEventListener.scala:69)
        at org.apache.spark.sql.secondaryindex.events.SIDropEventListener$$anonfun$onEvent$1.apply(SIDropEventListener.scala:65)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at org.apache.spark.sql.secondaryindex.events.SIDropEventListener.onEvent(SIDropEventListener.scala:65)
        at org.apache.carbondata.events.OperationListenerBus.fireEvent(OperationListenerBus.java:83)
        at org.apache.carbondata.events.package$.withEvents(package.scala:26)
        at org.apache.carbondata.events.package$.withEvents(package.scala:22)
        at org.apache.spark.sql.execution.command.table.CarbonDropTableCommand.processMetadata(CarbonDropTableCommand.scala:93)
        at org.apache.spark.sql.execution.command.AtomicRunnableCommand$$anonfun$run$3.apply(package.scala:160)
        at org.apache.spark.sql.execution.command.AtomicRunnableCommand$$anonfun$run$3.apply(package.scala:159)
        at org.apache.spark.sql.execution.command.Auditable$class.runWithAudit(package.scala:118)
        at org.apache.spark.sql.execution.command.AtomicRunnableCommand.runWithAudit(package.scala:155)
        at org.apache.spark.sql.execution.command.AtomicRunnableCommand.run(package.scala:159)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
        at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
        at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
        at org.apache.spark.sql.Dataset$$anonfun$51.apply(Dataset.scala:3265)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
        at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3264)
        at org.apache.spark.sql.Dataset.<init>(Dataset.scala:190)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
        at org.apache.spark.sql.test.SparkTestQueryExecutor.sql(SparkTestQueryExecutor.scala:37)
        at org.apache.spark.sql.test.util.QueryTest.sql(QueryTest.scala:123)
        at org.apache.carbondata.spark.testsuite.secondaryindex.TestCarbonInternalMetastore.beforeEach(TestCarbonInternalMetastore.scala:49)
        at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:220)
        at org.apache.carbondata.spark.testsuite.secondaryindex.TestCarbonInternalMetastore.runTest(TestCarbonInternalMetastore.scala:33)
        at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
        at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
        at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
        at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
        at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
        at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
        at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
        at org.scalatest.Suite$class.run(Suite.scala:1147)
        at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
        at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
        at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
        at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
        at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
        at org.apache.carbondata.spark.testsuite.secondaryindex.TestCarbonInternalMetastore.org$scalatest$BeforeAndAfterAll$$super$run(TestCarbonInternalMetastore.scala:33)
        at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
        at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
        at org.apache.carbondata.spark.testsuite.secondaryindex.TestCarbonInternalMetastore.run(TestCarbonInternalMetastore.scala:33)
        at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1210)
        at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1257)
        at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1255)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.scalatest.Suite$class.runNestedSuites(Suite.scala:1255)
        at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
        at org.scalatest.Suite$class.run(Suite.scala:1144)
        at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
        at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
        at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1340)
        at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1334)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1334)
        at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1011)
        at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1010)
        at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1500)
        at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1010)
        at org.scalatest.tools.Runner$.main(Runner.scala:827)
        at org.scalatest.tools.Runner.main(Runner.scala)
   - test show index when SI were created before the change CARBONDATA-3765
   - test refresh index with different value of isIndexTableExists
   - test refresh index with indexExists as false and empty index table
   - test refresh index with indexExists as null
   Run completed in 15 minutes, 36 seconds.
   Total number of tests run: 283
   Suites: completed 32, aborted 0
   Tests: succeeded 282, failed 1, canceled 0, ignored 1, pending 0
   *** 1 TEST FAILED ***
   [INFO] ------------------------------------------------------------------------
   [INFO] Reactor Summary:
   [INFO] 
   [INFO] Apache CarbonData :: Parent ........................ SUCCESS [  2.509 s]
   [INFO] Apache CarbonData :: Common ........................ SUCCESS [ 15.990 s]
   [INFO] Apache CarbonData :: Format ........................ SUCCESS [ 32.657 s]
   [INFO] Apache CarbonData :: Core .......................... SUCCESS [01:32 min]
   [INFO] Apache CarbonData :: Processing .................... SUCCESS [ 33.903 s]
   [INFO] Apache CarbonData :: Hadoop ........................ SUCCESS [ 22.800 s]
   [INFO] Apache CarbonData :: Materialized View Plan ........ SUCCESS [01:15 min]
   [INFO] Apache CarbonData :: Hive .......................... SUCCESS [02:05 min]
   [INFO] Apache CarbonData :: SDK ........................... SUCCESS [02:03 min]
   [INFO] Apache CarbonData :: CLI ........................... SUCCESS [05:03 min]
   [INFO] Apache CarbonData :: Lucene Index .................. SUCCESS [ 22.601 s]
   [INFO] Apache CarbonData :: Bloom Index ................... SUCCESS [ 12.992 s]
   [INFO] Apache CarbonData :: Geo ........................... SUCCESS [ 23.719 s]
   [INFO] Apache CarbonData :: Streaming ..................... SUCCESS [ 33.608 s]
   [INFO] Apache CarbonData :: Spark ......................... FAILURE [  01:27 h]
   [INFO] Apache CarbonData :: Secondary Index ............... FAILURE [16:28 min]
   [INFO] Apache CarbonData :: Index Examples ................ SUCCESS [ 11.280 s]
   [INFO] Apache CarbonData :: Flink Proxy ................... SUCCESS [ 15.864 s]
   [INFO] Apache CarbonData :: Flink ......................... SUCCESS [05:29 min]
   [INFO] Apache CarbonData :: Flink Build ................... SUCCESS [  5.949 s]
   [INFO] Apache CarbonData :: Presto ........................ SUCCESS [02:37 min]
   [INFO] Apache CarbonData :: Examples ...................... SUCCESS [02:54 min]
   [INFO] Apache CarbonData :: Flink Examples ................ SUCCESS [  8.313 s]
   [INFO] Apache CarbonData :: Assembly ...................... FAILURE [ 14.763 s]
   [INFO] ------------------------------------------------------------------------
   [INFO] BUILD FAILURE
   [INFO] ------------------------------------------------------------------------
   [INFO] Total time: 01:54 h (Wall Clock)
   [INFO] Finished at: 2023-04-10T03:14:53+08:00
   [INFO] Final Memory: 245M/2221M
   [INFO] ------------------------------------------------------------------------
   [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project carbondata-spark_2.3: There are test failures -> [Help 1]
   [ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:2.4.3:shade (default) on project carbondata-assembly: Error creating shaded jar: /Users/xubo/Desktop/xubo/git/carbondata1/integration/spark/target/classes (Is a directory) -> [Help 2]
   [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project carbondata-secondary-index: There are test failures -> [Help 1]
   [ERROR] 
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR] 
   [ERROR] For more information about the errors and possible solutions, please read the following articles:
   [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
   [ERROR] [Help 2] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
   [ERROR] 
   [ERROR] After correcting the problems, you can resume the build with the command
   [ERROR]   mvn <goals> -rf :carbondata-spark_2.3
   [INFO] Build failures were ignored.
   
   Process finished with exit code 0
   
   ```
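   The stack trace points at the `sql(...)` call in `TestCarbonInternalMetastore.beforeEach` (TestCarbonInternalMetastore.scala:49), so the failing suite can be iterated on in isolation before retrying the full build. The commands below are a sketch, not a verified recipe: the `-rf` target and `-e`/`-X` switches come straight from the Maven output above, while `wildcardSuites` is scalatest-maven-plugin's standard suite filter and the `-pl` module path is an assumption about the local checkout layout.

   ```shell
   # Re-run only the failing ScalaTest suite in the secondary-index module.
   # NOTE: the module path given to -pl is assumed; adjust it to match your checkout.
   mvn test -pl index/secondary-index \
     -DwildcardSuites=org.apache.carbondata.spark.testsuite.secondaryindex.TestCarbonInternalMetastore

   # Once fixed, resume the full reactor from the first failing module,
   # with full stack traces enabled, as the log itself suggests:
   mvn install -e -rf :carbondata-spark_2.3
   ```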

