[ https://issues.apache.org/jira/browse/SPARK-29106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16954195#comment-16954195 ]

huangtianhua commented on SPARK-29106:
--------------------------------------

[~shaneknapp], thanks for working on this :)

Yes, I agree with [~bzhaoopenstack], maybe we can provide a shell script that 
does the following (a rough sketch is below):
 # update the spark repo to master? Or is there another way you could do this?
 # call the mvn build and mvn test, e.g. './build/mvn clean package -DskipTests 
-Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl -Pmesos' and 
'./build/mvn test -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver -Pkinesis-asl 
-Pmesos'
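
For example, a minimal sketch of such a script could be as below (the checkout 
path and the plain reset to upstream master are my assumptions, not a final 
design):

#!/usr/bin/env bash
set -euo pipefail

# refresh the spark source tree to the latest upstream master
cd /usr/local/src/spark   # assumed checkout location
git fetch origin
git reset --hard origin/master

# build first without running the tests
./build/mvn clean package -DskipTests -Phadoop-2.7 -Pyarn -Phive \
  -Phive-thriftserver -Pkinesis-asl -Pmesos

# then run the test suite with the same profiles
./build/mvn test -Phadoop-2.7 -Pyarn -Phive -Phive-thriftserver \
  -Pkinesis-asl -Pmesos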

And one important thing: we install our leveldbjni-all 1.8 locally. As I 
mentioned before, we released a new jar that supports the aarch64 platform.
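
For reference, the local install is roughly the following (just a sketch from 
my side: the download URL only follows the standard Maven Central layout for 
our org.openlabtesting coordinates, and the jar is installed under the original 
org.fusesource coordinates so spark and hadoop resolve the aarch64 build 
without any pom changes):

# fetch the aarch64-capable leveldbjni build published under org.openlabtesting.leveldbjni
wget https://repo1.maven.org/maven2/org/openlabtesting/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar

# install it into the local maven repo under the coordinates spark/hadoop already depend on
mvn install:install-file -Dfile=leveldbjni-all-1.8.jar \
  -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni-all \
  -Dversion=1.8 -Dpackaging=jar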

Now the demo ansible script has three steps to run the tests:
 # build spark
 # run the tests without installing our leveldbjni jar package; an error will 
be raised like "[no leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in 
java.library.path, no leveldbjni in java.library.path, 
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
 
/usr/local/src/spark/common/kvstore/target/tmp/libleveldbjni-64-1-610267671268036503.8:
 cannot open shared object file: No such file or directory (Possible cause: 
can't load AMD 64-bit .so on a AARCH64-bit platform)]"
 # run the tests with our leveldbjni jar, which supports the aarch64 platform, 
and then the tests will pass.

I think we don't have to run step 2 anymore on the community jenkins, right? We 
will remove that step from the script.

> Add jenkins arm test for spark
> ------------------------------
>
>                 Key: SPARK-29106
>                 URL: https://issues.apache.org/jira/browse/SPARK-29106
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 3.0.0
>            Reporter: huangtianhua
>            Priority: Minor
>
> Add arm test jobs to amplab jenkins for spark.
> So far we have made two periodic ARM test jobs for spark in OpenLab: one is 
> based on master with hadoop 2.7 (similar to the QA test of amplab jenkins), 
> the other is based on a new branch which we created on 09-09, see 
> [http://status.openlabtesting.org/builds/job/spark-master-unit-test-hadoop-2.7-arm64]
>  and 
> [http://status.openlabtesting.org/builds/job/spark-unchanged-branch-unit-test-hadoop-2.7-arm64].
>  We only have to care about the first one when integrating the arm test with 
> amplab jenkins.
> About the k8s test on arm, we have already tried it, see 
> [https://github.com/theopenlab/spark/pull/17]; maybe we can integrate it 
> later.
> We also plan to test other stable branches, and we can integrate them into 
> amplab jenkins when they are ready.
> We have offered an arm instance and sent the details to shane knapp; thanks 
> shane for adding the first arm job to amplab jenkins :)
> The other important thing is about leveldbjni 
> [https://github.com/fusesource/leveldbjni] (see 
> [https://github.com/fusesource/leveldbjni/issues/80]): spark depends on 
> leveldbjni-all-1.8 
> [https://mvnrepository.com/artifact/org.fusesource.leveldbjni/leveldbjni-all/1.8],
>  which has no arm64 support. So we built an arm64-supporting release of 
> leveldbjni, see 
> [https://mvnrepository.com/artifact/org.openlabtesting.leveldbjni/leveldbjni-all/1.8],
>  but we can't modify the spark pom.xml directly with something like a 
> 'property'/'profile' to choose the correct jar on an arm or x86 platform, 
> because spark depends on some hadoop packages like hadoop-hdfs, and those 
> packages depend on leveldbjni-all-1.8 too, unless hadoop releases with the 
> new arm-supporting leveldbjni jar. For now we download the leveldbjni-all-1.8 
> of openlabtesting and 'mvn install' it when running the arm tests for spark.
> PS: The issues found and fixed:
>  SPARK-28770 [https://github.com/apache/spark/pull/25673]
>  SPARK-28519 [https://github.com/apache/spark/pull/25279]
>  SPARK-28433 [https://github.com/apache/spark/pull/25186]
>  SPARK-28467 [https://github.com/apache/spark/pull/25864]
>  SPARK-29286 [https://github.com/apache/spark/pull/26021]


