Yang Jie created SPARK-36019:
--------------------------------

             Summary: Cannot run leveldb related UTs on Mac OS of M1 architecture
                 Key: SPARK-36019
                 URL: https://issues.apache.org/jira/browse/SPARK-36019
             Project: Spark
          Issue Type: Bug
          Components: Build
    Affects Versions: 3.3.0
            Reporter: Yang Jie
When running the leveldb-related UTs on macOS with the M1 (Apple Silicon) architecture, some tests fail as follows:

{code:java}
[INFO] Running org.apache.spark.util.kvstore.LevelDBSuite
[ERROR] Tests run: 10, Failures: 0, Errors: 10, Skipped: 0, Time elapsed: 0.18 s <<< FAILURE! - in org.apache.spark.util.kvstore.LevelDBSuite
[ERROR] org.apache.spark.util.kvstore.LevelDBSuite.testMultipleTypesWriteReadDelete  Time elapsed: 0.146 s  <<< ERROR!
java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in java.library.path, no leveldbjni in java.library.path, /Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8: dlopen(/Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8, 1): no suitable image found. Did find: /Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8: no matching architecture in universal wrapper /Users/yangjie01/SourceCode/git/spark-mine-12/common/kvstore/target/tmp/libleveldbjni-64-1-7259526109351494242.8: no matching architecture in universal wrapper]
	at org.apache.spark.util.kvstore.LevelDBSuite.setup(LevelDBSuite.java:55)

[ERROR] org.apache.spark.util.kvstore.LevelDBSuite.testObjectWriteReadDelete  Time elapsed: 0 s  <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.fusesource.leveldbjni.JniDBFactory
	at org.apache.spark.util.kvstore.LevelDBSuite.setup(LevelDBSuite.java:55)

....

[ERROR] Tests run: 105, Failures: 0, Errors: 48, Skipped: 0
{code}

The "no matching architecture in universal wrapper" message suggests that the leveldbjni-1.8 native library bundles no arm64 binary, so JNI support appears to be missing on Apple Silicon.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
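A quick way to confirm the architecture mismatch described in the issue is to print the JVM's platform properties, which native-library loaders such as leveldbjni's consult when resolving the library name. A minimal sketch (the class name is illustrative, not part of Spark):

```java
public class ArchCheck {
    public static void main(String[] args) {
        // On an Apple Silicon Mac with an arm64 JVM this reports
        // os.arch = aarch64 -- an architecture for which the leveldbjni 1.8
        // artifact ships no native binary, hence the UnsatisfiedLinkError.
        System.out.println("os.name = " + System.getProperty("os.name"));
        System.out.println("os.arch = " + System.getProperty("os.arch"));
    }
}
```

If `os.arch` is `x86_64` instead (e.g. an Intel JVM running under Rosetta 2), the bundled x86_64 native can load, which is one commonly used workaround until an arm64-compatible leveldbjni is available.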