Hi all,
Recently I ran './build/mvn test' for Spark on aarch64, and both master and
branch-2.4 failed. Log excerpt below:

......

[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.spark.util.kvstore.LevelDBTypeInfoSuite
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.081 s - in org.apache.spark.util.kvstore.LevelDBTypeInfoSuite
[INFO] Running org.apache.spark.util.kvstore.InMemoryStoreSuite
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.001 s - in org.apache.spark.util.kvstore.InMemoryStoreSuite
[INFO] Running org.apache.spark.util.kvstore.InMemoryIteratorSuite
[INFO] Tests run: 38, Failures: 0, Errors: 0, Skipped: 0, Time elapsed:
0.219 s - in org.apache.spark.util.kvstore.InMemoryIteratorSuite
[INFO] Running org.apache.spark.util.kvstore.LevelDBIteratorSuite
[ERROR] Tests run: 38, Failures: 0, Errors: 38, Skipped: 0, Time elapsed:
0.23 s <<< FAILURE! - in org.apache.spark.util.kvstore.LevelDBIteratorSuite
[ERROR] 
copyIndexDescendingWithStart(org.apache.spark.util.kvstore.LevelDBIteratorSuite)
Time elapsed: 0.2 s <<< ERROR!
java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no
leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in
java.library.path, no leveldbjni in java.library.path,
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
cannot open shared object file: No such file or directory (Possible cause:
can't load AMD 64-bit .so on a AARCH64-bit platform)]
at
org.apache.spark.util.kvstore.LevelDBIteratorSuite.createStore(LevelDBIteratorSuite.java:44)

......
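For anyone reproducing this, a quick way to confirm the mismatch the error hints at ("can't load AMD 64-bit .so on a AARCH64-bit platform") is to print the architecture the JVM reports, since the native-library loader picks its bundled .so based on it. A minimal sketch (ArchCheck is just an illustrative class name):

```java
public class ArchCheck {
    public static void main(String[] args) {
        // The JVM exposes the CPU architecture via the os.arch property;
        // on an aarch64 box this prints "aarch64", while leveldbjni-1.8
        // only ships x86/amd64 native libraries, hence UnsatisfiedLinkError.
        System.out.println("os.arch = " + System.getProperty("os.arch"));
        // The directories searched for native libraries:
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
    }
}
```

If this prints "aarch64", the leveldbjni-1.8 jar has no matching native library to extract and load.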

There is a dependency on leveldbjni-all, but the leveldbjni-1.8 (all) jar
contains no native library for aarch64. I found that aarch64 support was
added by https://github.com/fusesource/leveldbjni/pull/82, but it was not
included in the 1.8 release, and unfortunately the repo has not been updated
for almost two years.

So I have a question: does Spark support aarch64? If yes, how can this
problem be fixed? If not, what is the plan for it? Thank you all!
