Hi Spark team,

Any ideas about the above email? Thank you.

BR

ZhaoBo



Tianhua huang <huangtianhua...@gmail.com> wrote on Tue, Nov 12, 2019 at 2:47 PM:

> Hi all,
>
> The Spark ARM jobs have been building for some time, and there are now two
> jobs[1]: spark-master-test-maven-arm
> <https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-arm/>
> and spark-master-test-python-arm
> <https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-python-arm/>.
> There are some build failures, but they are caused by the poor performance
> of the ARM instance. We have now started building the Spark ARM jobs on
> other, higher-performance instances, where the builds and tests all pass,
> and we plan to donate the instance to AMPLab later. Based on the build
> history, we are very happy to say that Spark is supported on the aarch64
> platform, and I suggest adding this good news to the Spark 3.0.0 release
> notes. Perhaps the community could also provide an ARM-supported release
> of Spark in the meantime?
>
> [1]
> https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-arm/
> https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-python-arm/
>
> P.S. The JIRA https://issues.apache.org/jira/browse/SPARK-29106 tracks the
> whole effort. Thank you very much, Shane :)
>
> On Thu, Oct 17, 2019 at 2:52 PM bo zhaobo <bzhaojyathousa...@gmail.com>
> wrote:
>
>> Just a note: the JIRA issue link is
>> https://issues.apache.org/jira/browse/SPARK-29106
>>
>>
>>
>>
>> Tianhua huang <huangtianhua...@gmail.com> wrote on Thu, Oct 17, 2019 at 10:47 AM:
>>
>>> OK, let's update the info there. Thanks.
>>>
>>> On Thu, Oct 17, 2019 at 1:52 AM Shane Knapp <skn...@berkeley.edu> wrote:
>>>
>>>> i totally missed the spark jira from earlier...  let's move the
>>>> conversation there!
>>>>
>>>> On Tue, Oct 15, 2019 at 6:21 PM bo zhaobo <bzhaojyathousa...@gmail.com>
>>>> wrote:
>>>>
>>>>> Shane, awesome! We will do our best to finish the tests and the requested
>>>>> setup on the VM soon. Once we finish those things, we will send you an
>>>>> email, and then we can continue with the next steps. Thank you very much.
>>>>>
>>>>> Best Regards,
>>>>>
>>>>> ZhaoBo
>>>>>
>>>>> Shane Knapp <skn...@berkeley.edu> wrote on Wed, Oct 16, 2019 at 3:47 AM:
>>>>>
>>>>>> ok!  i'm able to successfully log in to the VM!
>>>>>>
>>>>>> i also have created a jenkins worker entry:
>>>>>> https://amplab.cs.berkeley.edu/jenkins/computer/spark-arm-vm/
>>>>>>
>>>>>> it's a pretty bare-bones VM, so i have some suggestions/requests
>>>>>> before we can actually proceed w/testing.  i will not be able to perform
>>>>>> any system configuration, as i don't have the cycles to reverse-engineer
>>>>>> the ansible setup and test it all out.
>>>>>>
>>>>>> * java is not installed, please install the following:
>>>>>>   - java8 min version 1.8.0_191
>>>>>>   - java11 min version 11.0.1
>>>>>>
>>>>>> * it appears from the ansible playbook that there are other deps that
>>>>>> need to be installed.
>>>>>>   - please install all deps
>>>>>>   - manually run the tests until they pass
>>>>>>
>>>>>> * the jenkins user should NEVER have sudo or any root-level access!
>>>>>>
>>>>>> * once the arm tests pass when manually run, take a snapshot of this
>>>>>> image so we can recreate it w/o needing to reinstall everything
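>>>>>>
>>>>>> fwiw, a rough sketch of the java/user setup i'm asking for, assuming an
>>>>>> ubuntu-based aarch64 VM (package names here are assumptions -- adjust for
>>>>>> your distro and double-check the minimum versions above):
>>>>>>
>>>>>> # install both JDKs plus basic build deps (assumed ubuntu package names)
>>>>>> sudo apt-get update
>>>>>> sudo apt-get install -y openjdk-8-jdk openjdk-11-jdk maven git
>>>>>> # confirm both JDKs are registered and meet the minimums above
>>>>>> update-java-alternatives --list
>>>>>> # create an unprivileged jenkins user -- no sudo, no root access
>>>>>> sudo adduser --disabled-password --gecos "" jenkins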
>>>>>>
>>>>>> after that's done i can finish configuring the jenkins worker and set
>>>>>> up a build...
>>>>>>
>>>>>> thanks!
>>>>>>
>>>>>> shane
>>>>>>
>>>>>>
>>>>>> On Mon, Oct 14, 2019 at 8:34 PM Shane Knapp <skn...@berkeley.edu>
>>>>>> wrote:
>>>>>>
>>>>>>> yes, i will get to that tomorrow.  today was spent cleaning up the
>>>>>>> mess from last week.
>>>>>>>
>>>>>>> On Mon, Oct 14, 2019 at 6:18 PM bo zhaobo <
>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi shane,
>>>>>>>>
>>>>>>>> That's great news that AMPLab is back. ;-) If possible, could you please
>>>>>>>> take a few minutes to check that the ARM VM is accessible from your side?
>>>>>>>> And do you have a plan for the whole ARM test integration (how about we
>>>>>>>> finish it this month)? Thanks.
>>>>>>>>
>>>>>>>> Best regards,
>>>>>>>>
>>>>>>>> ZhaoBo
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> bo zhaobo <bzhaojyathousa...@gmail.com> wrote on Thu, Oct 10, 2019 at 8:29 AM:
>>>>>>>>
>>>>>>>>> Oh, sorry that we missed that email. If possible, could you please take
>>>>>>>>> a few minutes to test that the ARM VM is accessible with your SSH private
>>>>>>>>> key as the jenkins user? We plan to have the whole integration process and
>>>>>>>>> testing done before the end of this month. We would be very happy to work
>>>>>>>>> together with you to move it forward, if you are free and agree. :)
>>>>>>>>> Thank you very much.
>>>>>>>>>
>>>>>>>>> Best Regards,
>>>>>>>>> Zhao Bo
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Shane Knapp <skn...@berkeley.edu> wrote on Wed, Oct 9, 2019 at 11:10 PM:
>>>>>>>>>
>>>>>>>>>> i spent yesterday dealing w/a power outage on campus.  please see
>>>>>>>>>> my email to the spark dev list.
>>>>>>>>>>
>>>>>>>>>> On Wed, Oct 9, 2019 at 3:29 AM Tianhua huang <
>>>>>>>>>> huangtianhua...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi Shane,
>>>>>>>>>>> Sorry to disturb you again; I wonder whether there has been any progress
>>>>>>>>>>> on the Jenkins ARM job?
>>>>>>>>>>> And please don't hesitate to contact us if you have any questions or
>>>>>>>>>>> need help, thank you very much :)
>>>>>>>>>>>
>>>>>>>>>>> On Tue, Oct 8, 2019 at 9:21 AM bo zhaobo <
>>>>>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi Shane,
>>>>>>>>>>>>
>>>>>>>>>>>> Sorry for the late reply. We are just back from a 7-day holiday.
>>>>>>>>>>>>
>>>>>>>>>>>> I have already added the public key to the VM. If you are free,
>>>>>>>>>>>> please go ahead and test. Thank you.
>>>>>>>>>>>>
>>>>>>>>>>>> Best regards
>>>>>>>>>>>>
>>>>>>>>>>>> ZhaoBo
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> Shane Knapp <skn...@berkeley.edu> wrote on Tue, Oct 1, 2019 at 2:33 AM:
>>>>>>>>>>>>
>>>>>>>>>>>>> here's the public key for the jenkins user:
>>>>>>>>>>>>> ssh-rsa
>>>>>>>>>>>>> AAAAB3NzaC1yc2EAAAABIwAAAgEApe+DQF0SusgpdSDLAeZ5ymbEbUODTMUT67yRCaVD7S4oAWgtHXWSLtgZAlD0D2N2qRm74DVXcCrN+LGIxExXP+h/xAPI+0tMHAFt0+u5zTy+6Fq3ADtG5q4dmNMohk4gZhlueeBN7JT6b3uRLwnrSr2F9DCd5F3gMd2fXAHVGWlOPY01IFwJcHcu4VVPV3pHq35N7TyZGup0Np/D1FtB4Hpw7tyrtiidYfQXE1MFVWLpHXFIoRjMEPGfZw5gfIuejImd22W3Qx9BHPC0e97wOxbQfygZHh8S0J5v6X5dvR/jZZs2queMiNwSDsVnjqDX3vgOIymfgy6xJNjiTTXPNuwEbmk54DCMkqibSY3NmmWPAzzWI1SwU4bSmVExY97TgoLr7hEBQQeMZuKScWVY2tD3yRJz18a3rJGnSboESPpItr5pLlCKcZlvJKM24goo4Uiqi9lLPvJbeXV3FbSiGt9pWDu18XzuZGkamxJzkCKSmhCoxB+fqNXWL7jEcvJw8smF9oTmnGG+in4awCBW11U2wkvPvCwUBWB3tRwHERaG6vTp0CNIKaBH5R968qsWuhbNjlARul5JG3XVUbljjxI4s8W+jRU++Ua2wlDC/6scqgEHlLGbQUO5uHCUTxn+wx7XpEZ6FSucOof0S0TR5mCrsFZ7+eV0nLX6l1cZhs8=
>>>>>>>>>>>>> jenkins@amp-jenkins-master
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> once you get that set up, please let me know and i will test
>>>>>>>>>>>>> ssh-ability.
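>>>>>>>>>>>>>
>>>>>>>>>>>>> something like this on the VM should do it (just a sketch -- i'm assuming
>>>>>>>>>>>>> the jenkins user's home is /home/jenkins):
>>>>>>>>>>>>>
>>>>>>>>>>>>> # run as the 'arm' user (or any sudoer), NOT as jenkins
>>>>>>>>>>>>> sudo mkdir -p /home/jenkins/.ssh
>>>>>>>>>>>>> # append the full public key above as a single line
>>>>>>>>>>>>> echo 'ssh-rsa AAAA...<full key above>... jenkins@amp-jenkins-master' | sudo tee -a /home/jenkins/.ssh/authorized_keys
>>>>>>>>>>>>> sudo chown -R jenkins:jenkins /home/jenkins/.ssh
>>>>>>>>>>>>> sudo chmod 700 /home/jenkins/.ssh
>>>>>>>>>>>>> sudo chmod 600 /home/jenkins/.ssh/authorized_keys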
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Sep 25, 2019 at 10:15 PM Shane Knapp <
>>>>>>>>>>>>> skn...@berkeley.edu> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks for understanding.  i'm a one-man show.  :)
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> shane
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Sep 25, 2019 at 6:52 PM Tianhua huang <
>>>>>>>>>>>>>> huangtianhua...@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I am very happy to hear that :)
>>>>>>>>>>>>>>> And please don't hesitate to contact us if you have any
>>>>>>>>>>>>>>> questions or need help, thanks again.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Sep 26, 2019 at 12:34 AM Shane Knapp <
>>>>>>>>>>>>>>> skn...@berkeley.edu> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> i'm hoping to get started by EOW or beginning of next.  i
>>>>>>>>>>>>>>>> have an incredibly small team here @ berkeley and support a 
>>>>>>>>>>>>>>>> lot of research
>>>>>>>>>>>>>>>> labs, and i've been down 25-50% staffing for the past 6 weeks. 
>>>>>>>>>>>>>>>>  thankfully
>>>>>>>>>>>>>>>> my whole team is back and i'm finally getting my head above 
>>>>>>>>>>>>>>>> water and will
>>>>>>>>>>>>>>>> have time to dedicate to this really soon.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> sorry for the delay!
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> shane
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Mon, Sep 23, 2019 at 7:21 PM bo zhaobo <
>>>>>>>>>>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Hi Shane,
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> How about starting the ARM work this week? Can we? ;-)
>>>>>>>>>>>>>>>>> Thank you
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Best Regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Shane Knapp <skn...@berkeley.edu> wrote on Fri, Sep 20, 2019 at 11:57 AM:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i'll have the cycles over the next week.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Thu, Sep 19, 2019 at 7:51 PM bo zhaobo <
>>>>>>>>>>>>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Hi Shane,
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Is there any update about the last email? ;-)
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Best Regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> bo zhaobo <bzhaojyathousa...@gmail.com> wrote on Wed, Sep 18, 2019 at 10:02 AM:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Thanks for the reply.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> I will answer your questions one by one.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> 0) Sure, I can create a user named 'jenkins' manually. I will keep the
>>>>>>>>>>>>>>>>>>>> 'arm' user in case you need higher privileges to do something on the VM.
>>>>>>>>>>>>>>>>>>>> 1) Sure, it would be great if you could provide a public key to us. ;-)
>>>>>>>>>>>>>>>>>>>> 2) Sure.
>>>>>>>>>>>>>>>>>>>> 3) Yes, it is a persistent VM for now. We plan to donate more resources
>>>>>>>>>>>>>>>>>>>> around the end of October, and it would be better to move all the ARM
>>>>>>>>>>>>>>>>>>>> Jenkins workers to the same place, so this VM might be recycled in the
>>>>>>>>>>>>>>>>>>>> future. For now, that doesn't stop us from moving the CI job forward.
>>>>>>>>>>>>>>>>>>>> 4) Correct. ;-)
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> If you need more information or help regarding the ARM VM, please feel
>>>>>>>>>>>>>>>>>>>> free to contact us.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Questions from us:
>>>>>>>>>>>>>>>>>>>> 1> Is there an online chat platform where we could discuss this? Email
>>>>>>>>>>>>>>>>>>>> seems a bit slow for us ;-). Or is there an online discussion platform
>>>>>>>>>>>>>>>>>>>> in the Spark community?
>>>>>>>>>>>>>>>>>>>> 2> Could you please share the test scripts once you finish the test
>>>>>>>>>>>>>>>>>>>> integration, as well as the existing Jenkins test scripts of the Spark
>>>>>>>>>>>>>>>>>>>> CI? We plan to do the same thing as on x86, and we think running the
>>>>>>>>>>>>>>>>>>>> same jobs as x86 on ARM would be a good start.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Thank you
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Best  Regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Shane Knapp <skn...@berkeley.edu> wrote on Wed, Sep 18, 2019 at 5:05 AM:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks for the info...  a couple of things:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> 0) could you rename/create a 'jenkins' user with the
>>>>>>>>>>>>>>>>>>>>> same rights?
>>>>>>>>>>>>>>>>>>>>> 1) i will provide an ssh key so that you can add it to
>>>>>>>>>>>>>>>>>>>>> the jenkins user's authorized_keys file
>>>>>>>>>>>>>>>>>>>>> 2) the jenkins user should NOT have root access.  this
>>>>>>>>>>>>>>>>>>>>> is a major security hole
>>>>>>>>>>>>>>>>>>>>> 3) will this be a persistent VM?  if so, i'd much
>>>>>>>>>>>>>>>>>>>>> prefer to have it set up initially so we can just log in, 
>>>>>>>>>>>>>>>>>>>>> build what we
>>>>>>>>>>>>>>>>>>>>> need and launch the job
>>>>>>>>>>>>>>>>>>>>> 4) yay ansible!  i can create the job locally from the
>>>>>>>>>>>>>>>>>>>>> template.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Sep 17, 2019 at 1:16 AM bo zhaobo <
>>>>>>>>>>>>>>>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Hi @shane knapp <skn...@berkeley.edu> , thanks.
>>>>>>>>>>>>>>>>>>>>>> Tianhua Huang and I have already created an ARM VM for this.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> The information about the ARM VM:
>>>>>>>>>>>>>>>>>>>>>> IP: 213.146.141.16
>>>>>>>>>>>>>>>>>>>>>> ssh_key: please see the attachment file "lab_ssh_key.txt"
>>>>>>>>>>>>>>>>>>>>>> You can then log in to it with "ssh -i lab_ssh_key.txt
>>>>>>>>>>>>>>>>>>>>>> arm@213.146.141.16"; the user "arm" has sudo privileges.
>>>>>>>>>>>>>>>>>>>>>> Note: this VM runs in a cloud, and currently only SSH (TCP port 22) and
>>>>>>>>>>>>>>>>>>>>>> ping (ICMP) are allowed for ingress network connections. If you need
>>>>>>>>>>>>>>>>>>>>>> more, please let us know.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> To start, we can add a periodic job for the Spark ARM tests; the details
>>>>>>>>>>>>>>>>>>>>>> of the OpenLab Zuul job we tested may be a useful reference for the
>>>>>>>>>>>>>>>>>>>>>> Jenkins job:
>>>>>>>>>>>>>>>>>>>>>> https://github.com/theopenlab/openlab-zuul-jobs/blob/master/playbooks/spark-unit-test-arm64/run.yaml
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Some explanation about that file: in Step 3, leveldbjni does not have an
>>>>>>>>>>>>>>>>>>>>>> ARM release, so we built leveldbjni on ARM and use our own leveldbjni
>>>>>>>>>>>>>>>>>>>>>> jar, which is located at [1]. You still need to use that jar for testing.
>>>>>>>>>>>>>>>>>>>>>> The Hadoop side also depends on leveldbjni (via "hadoop-client", as I
>>>>>>>>>>>>>>>>>>>>>> recall), so we cannot change the pom file directly; instead we use
>>>>>>>>>>>>>>>>>>>>>> "mvn install" to make leveldb available while testing Spark on ARM.
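>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> For example, one way to make that jar visible to the Maven build (just a
>>>>>>>>>>>>>>>>>>>>>> sketch -- the exact steps in our Zuul job may differ, and the
>>>>>>>>>>>>>>>>>>>>>> org.fusesource.leveldbjni coordinates below are what Spark's pom
>>>>>>>>>>>>>>>>>>>>>> references, so adjust if yours differs) is to install it into the local
>>>>>>>>>>>>>>>>>>>>>> repository before building Spark:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> # fetch the aarch64-enabled leveldbjni build from [1]
>>>>>>>>>>>>>>>>>>>>>> wget https://repo1.maven.org/maven2/org/openlabtesting/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar
>>>>>>>>>>>>>>>>>>>>>> # install it into ~/.m2 under the coordinates the Spark build expects
>>>>>>>>>>>>>>>>>>>>>> mvn install:install-file -Dfile=leveldbjni-all-1.8.jar \
>>>>>>>>>>>>>>>>>>>>>>     -DgroupId=org.fusesource.leveldbjni -DartifactId=leveldbjni-all \
>>>>>>>>>>>>>>>>>>>>>>     -Dversion=1.8 -Dpackaging=jar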
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Please note: all tests run without the root user, so it is fine to test
>>>>>>>>>>>>>>>>>>>>>> with the "arm" user.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Best Regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> [1]
>>>>>>>>>>>>>>>>>>>>>> https://repo1.maven.org/maven2/org/openlabtesting/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Tianhua huang <huangtianhua...@gmail.com> wrote on Tue, Sep 17, 2019 at 2:48 PM:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> @shane knapp <skn...@berkeley.edu> thank you very much, I opened an
>>>>>>>>>>>>>>>>>>>>>>> issue for this: https://issues.apache.org/jira/browse/SPARK-29106,
>>>>>>>>>>>>>>>>>>>>>>> we can discuss the details there :)
>>>>>>>>>>>>>>>>>>>>>>> We will prepare an ARM instance today and send the info to you by
>>>>>>>>>>>>>>>>>>>>>>> email later.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Sep 17, 2019 at 4:40 AM Shane Knapp <
>>>>>>>>>>>>>>>>>>>>>>> skn...@berkeley.edu> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> @Tianhua huang <huangtianhua...@gmail.com> sure, i
>>>>>>>>>>>>>>>>>>>>>>>> think we can get something sorted for the short-term.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> all we need is ssh access (i can provide an ssh
>>>>>>>>>>>>>>>>>>>>>>>> key), and i can then have our jenkins master launch a 
>>>>>>>>>>>>>>>>>>>>>>>> remote worker on that
>>>>>>>>>>>>>>>>>>>>>>>> instance.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> instance setup, etc, will be up to you.  my support
>>>>>>>>>>>>>>>>>>>>>>>> for the time being will be to create the job and 'best 
>>>>>>>>>>>>>>>>>>>>>>>> effort' for
>>>>>>>>>>>>>>>>>>>>>>>> everything else.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> this should get us up and running asap.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> is there an open JIRA for jenkins/arm test
>>>>>>>>>>>>>>>>>>>>>>>> support?  we can move the technical details about this 
>>>>>>>>>>>>>>>>>>>>>>>> idea there.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Sun, Sep 15, 2019 at 9:03 PM Tianhua huang <
>>>>>>>>>>>>>>>>>>>>>>>> huangtianhua...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> @Sean Owen <sro...@gmail.com> , so sorry for the late reply; we had
>>>>>>>>>>>>>>>>>>>>>>>>> a Mid-Autumn holiday :)
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> If you would like to integrate the ARM CI into the AMPLab Jenkins, we
>>>>>>>>>>>>>>>>>>>>>>>>> can offer the ARM instance, and the ARM job could then run alongside
>>>>>>>>>>>>>>>>>>>>>>>>> the other x86 jobs. Is there a guideline for doing this? @shane
>>>>>>>>>>>>>>>>>>>>>>>>> knapp <skn...@berkeley.edu> would you help us?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Sep 12, 2019 at 9:36 PM Sean Owen <
>>>>>>>>>>>>>>>>>>>>>>>>> sro...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> I don't know what's involved in actually
>>>>>>>>>>>>>>>>>>>>>>>>>> accepting or operating those machines, so can't 
>>>>>>>>>>>>>>>>>>>>>>>>>> comment there, but in the
>>>>>>>>>>>>>>>>>>>>>>>>>> meantime it's good that you are running these tests 
>>>>>>>>>>>>>>>>>>>>>>>>>> and can help report
>>>>>>>>>>>>>>>>>>>>>>>>>> changes needed to keep it working with ARM. I would 
>>>>>>>>>>>>>>>>>>>>>>>>>> continue with that for
>>>>>>>>>>>>>>>>>>>>>>>>>> now.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Sep 11, 2019 at 10:06 PM Tianhua huang <
>>>>>>>>>>>>>>>>>>>>>>>>>> huangtianhua...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi all,
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Regarding the overall process of the Spark ARM CI work, we want to
>>>>>>>>>>>>>>>>>>>>>>>>>>> make two things clear.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> The first thing:
>>>>>>>>>>>>>>>>>>>>>>>>>>> For the Spark ARM CI we now have two periodic jobs: one job[1] based
>>>>>>>>>>>>>>>>>>>>>>>>>>> on commit[2] (which already fixes the failed replay tests issue[3];
>>>>>>>>>>>>>>>>>>>>>>>>>>> we made a new test branch as of 2019-09-09), and the other job[4]
>>>>>>>>>>>>>>>>>>>>>>>>>>> based on Spark master.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> The first job tests the specified branch, to show that our ARM CI is
>>>>>>>>>>>>>>>>>>>>>>>>>>> good and stable.
>>>>>>>>>>>>>>>>>>>>>>>>>>> The second job checks Spark master every day, so we can see whether
>>>>>>>>>>>>>>>>>>>>>>>>>>> the latest commits affect the ARM CI. The build history and results
>>>>>>>>>>>>>>>>>>>>>>>>>>> show that some problems are easier to find on ARM, such as SPARK-28770
>>>>>>>>>>>>>>>>>>>>>>>>>>> <https://issues.apache.org/jira/browse/SPARK-28770>, and that we put
>>>>>>>>>>>>>>>>>>>>>>>>>>> real effort into tracing and figuring them out; so far we have found
>>>>>>>>>>>>>>>>>>>>>>>>>>> and fixed several problems[5][6][7], thanks to everyone in the
>>>>>>>>>>>>>>>>>>>>>>>>>>> community :). And we believe that ARM CI is very necessary, right?
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> The second thing:
>>>>>>>>>>>>>>>>>>>>>>>>>>> We plan to run the jobs for a period of time, and you can see the
>>>>>>>>>>>>>>>>>>>>>>>>>>> results and logs in the 'build history' of the job consoles. If
>>>>>>>>>>>>>>>>>>>>>>>>>>> everything goes well for one or two weeks, could the community accept
>>>>>>>>>>>>>>>>>>>>>>>>>>> the ARM CI? Or how long would the periodic jobs need to run before the
>>>>>>>>>>>>>>>>>>>>>>>>>>> community has enough confidence to accept it? As you suggested before,
>>>>>>>>>>>>>>>>>>>>>>>>>>> it would be good to integrate the ARM CI into the AMPLab Jenkins; we
>>>>>>>>>>>>>>>>>>>>>>>>>>> agree with that, and we can donate the ARM instances and then maintain
>>>>>>>>>>>>>>>>>>>>>>>>>>> the ARM-related test jobs together with the community. Any thoughts?
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Thank you all!
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> [1]
>>>>>>>>>>>>>>>>>>>>>>>>>>> http://status.openlabtesting.org/job/spark-unchanged-branch-unit-test-hadoop-2.7-arm64
>>>>>>>>>>>>>>>>>>>>>>>>>>> [2]
>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/apache/spark/commit/0ed9fae45769d4b06b8cf8128f462f09ff3d9a72
>>>>>>>>>>>>>>>>>>>>>>>>>>> [3]
>>>>>>>>>>>>>>>>>>>>>>>>>>> https://issues.apache.org/jira/browse/SPARK-28770
>>>>>>>>>>>>>>>>>>>>>>>>>>> [4]
>>>>>>>>>>>>>>>>>>>>>>>>>>> http://status.openlabtesting.org/builds?job_name=spark-master-unit-test-hadoop-2.7-arm64
>>>>>>>>>>>>>>>>>>>>>>>>>>> [5] https://github.com/apache/spark/pull/25186
>>>>>>>>>>>>>>>>>>>>>>>>>>> [6] https://github.com/apache/spark/pull/25279
>>>>>>>>>>>>>>>>>>>>>>>>>>> [7] https://github.com/apache/spark/pull/25673
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Fri, Aug 16, 2019 at 11:24 PM Sean Owen <
>>>>>>>>>>>>>>>>>>>>>>>>>>> sro...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Yes, I think it's just local caching. After you
>>>>>>>>>>>>>>>>>>>>>>>>>>>> run the build you should find lots of stuff cached 
>>>>>>>>>>>>>>>>>>>>>>>>>>>> at ~/.m2/repository and
>>>>>>>>>>>>>>>>>>>>>>>>>>>> it won't download every time.
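>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> For example (just an illustration), after one full build you can
>>>>>>>>>>>>>>>>>>>>>>>>>>>> sanity-check that the local cache is populated and being reused:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> # size of the local Maven cache after a build
>>>>>>>>>>>>>>>>>>>>>>>>>>>> du -sh ~/.m2/repository
>>>>>>>>>>>>>>>>>>>>>>>>>>>> # number of cached jars; a second build should not re-download them
>>>>>>>>>>>>>>>>>>>>>>>>>>>> find ~/.m2/repository -name '*.jar' | wc -l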
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Fri, Aug 16, 2019 at 3:01 AM bo zhaobo <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi Sean,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks for the reply, and apologies for the confusion.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I know the dependencies are downloaded by SBT or Maven. But the Spark
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> QA job also executes "mvn clean package", so why does its log[1] not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> show any jars being downloaded from Maven Central, and why does it
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> build so fast? Is the reason that the Spark Jenkins builds the Spark
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jars on physical machines and does not destroy the test environment
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> after a job finishes? Then a later job building Spark would get the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dependency jars from the local cache, because earlier jobs already ran
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "mvn package" and those dependencies were already downloaded on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local worker machine. Am I right? Is that why the job log[1] does not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> show any download activity from Maven Central?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thank you very much.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/job/spark-master-test-maven-hadoop-2.6-ubuntu-testing/lastBuild/consoleFull
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Best regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ZhaoBo
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sean Owen <sro...@gmail.com> wrote on Fri, Aug 16, 2019 at 10:38 AM:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I'm not sure what you mean. The dependencies
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> are downloaded by SBT and Maven like in any 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> other project, and nothing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> about it is specific to Spark.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> The worker machines cache artifacts that are
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> downloaded from these, but this is a function of 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Maven and SBT, not Spark.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> You may find that the initial download takes a 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> long time.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 15, 2019 at 9:02 PM bo zhaobo <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> bzhaojyathousa...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi Sean,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Thanks very much for pointing out the roadmap ;-). Then we will
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> continue to focus on our test environment.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Regarding the networking problems: I mean that we can access Maven
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Central, and the jobs can download the required jar packages at a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> good network speed. What we want to know is why the logs of the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Spark QA test jobs[1] suggest that the job script/Maven build does
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> not download any jar packages. Could you tell us the reason for
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that? Thank you. The reason we raised the "networking problems" is a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> phenomenon we noticed during testing: if we execute "mvn clean
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> package" in a new test environment (in our environment we destroy
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the test VMs after a job finishes), Maven downloads the dependency
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jars from Maven Central, but in the job
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "spark-master-test-maven-hadoop" [2] we did not see it download any
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jar packages in the log. What is the reason for that?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Also, building the Spark jars while downloading the dependencies
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> from Maven Central takes almost 1 hour, while [2] takes just 10
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> minutes. But if we run "mvn package" in a VM that has already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> executed "mvn package" before, it takes just 14 minutes, which is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> much closer to [2]. So we suspect that downloading the jar packages
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> is what costs so much time. For the goal of ARM CI, we hope the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> performance of the new ARM CI can be close to the existing x86 CI,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so users can accept it more easily.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [2]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/job/spark-master-test-maven-hadoop-2.6-ubuntu-testing/lastBuild/consoleFull
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Best regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ZhaoBo
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sean Owen <sro...@gmail.com> wrote on Thu, Aug 15, 2019 at 9:58 PM:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I think the right goal is to fix the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> remaining issues first. If we set up CI/CD it 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> will only tell us there are
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> still some test failures. If it's stable, and 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> not hard to add to the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> existing CI/CD, yes it could be done 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> automatically later. You can continue
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to test on ARM independently for now.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> It sounds indeed like there are some
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> networking problems in the test system if 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> you're not able to download from
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Maven Central. That rarely takes significant 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> time, and there aren't
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> project-specific mirrors here. You might be 
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> able to point at a closer
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> public mirror, depending on where you are.
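>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> For example, a mirror can be configured in ~/.m2/settings.xml,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> roughly like this (the URL below is only a placeholder; substitute
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> whatever public mirror is actually close to you, and back up any
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> existing settings.xml first):
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> # write a minimal settings.xml that redirects 'central' to a mirror
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cat > ~/.m2/settings.xml <<'EOF'
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <settings>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <mirrors>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     <mirror>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>       <id>nearby-central-mirror</id>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>       <mirrorOf>central</mirrorOf>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>       <!-- placeholder: replace with a mirror near your region -->
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>       <url>https://your-nearby-mirror.example.org/maven2</url>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     </mirror>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </mirrors>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> </settings>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> EOF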
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 15, 2019 at 5:43 AM Tianhua
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> huang <huangtianhua...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hi all,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I want to discuss the Spark ARM CI again. We ran some tests on an
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ARM instance based on master; the jobs include
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/theopenlab/spark/pull/13 and the k8s integration
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/theopenlab/spark/pull/17/ .
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> There are several things I want to talk about:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> First, about the failed tests:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     1. We have fixed some problems, e.g.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/apache/spark/pull/25186 and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/apache/spark/pull/25279; thanks to Sean Owen and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> others for helping us.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     2. We tried the k8s integration test on ARM and hit an error:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> apk fetch hangs. The tests passed after adding the '--network host'
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option to the `docker build` command (a sketch follows after this
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> list), see:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/theopenlab/spark/pull/17/files#diff-5b731b14068240d63a93c393f6f9b1e8R176
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ; the solution refers to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/gliderlabs/docker-alpine/issues/307 . I don't
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> know whether this has ever happened in the community CI, or maybe we
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> should submit a PR to pass '--network host' to `docker build`?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     3. We found two tests failing after the commit
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://github.com/apache/spark/pull/23767 :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>        ReplayListenerSuite:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>        - ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>        - End-to-end replay *** FAILED ***
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>          "[driver]" did not equal "[1]" (JsonProtocolSuite.scala:622)
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>        - End-to-end replay with compression *** FAILED ***
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>          "[driver]" did not equal "[1]" (JsonProtocolSuite.scala:622)
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>        We tried reverting the commit and the tests then passed. The
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> patch is large and, sorry, we have not found the root cause yet; if
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> you are interested please try it, and we would really appreciate it
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if someone could help us figure it out.
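>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Regarding the '--network host' workaround in point 2: it is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> essentially one extra flag on the image build. A rough sketch (the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> image name and Dockerfile path here are only illustrative; the real
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ones come from the k8s integration test scripts):
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> # use the host network during the build so `apk fetch` does not hang
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> docker build --network host -t spark-test-image -f Dockerfile .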
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Second, about the test time: we increased the flavor of the ARM
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> instance to 16U16G, but there seems to be no significant
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> improvement. The k8s integration test took about one and a half
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hours, and the QA test (like the spark-master-test-maven-hadoop-2.7
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> community Jenkins job) took about seventeen hours (too long :( ); we
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> suspect the reason is performance and network. After splitting the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jobs by project (sql, core and so on), the time can be decreased to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> about seven hours, see https://github.com/theopenlab/spark/pull/19 .
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> We looked at the Spark QA tests like
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/ , and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> it looks like those tests never download the jar packages from the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Maven Central repo (such as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> https://repo.maven.apache.org/maven2/org/opencypher/okapi-api/0.4.2/okapi-api-0.4.2.jar).
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> So we want to know how the Jenkins jobs do that: is there an
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> internal Maven repo? Maybe we can do the same thing to avoid the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> network cost of downloading the dependency jars.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Third, and most important, is the ARM CI for Spark itself; we
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> believe it is necessary, right? You can see that we have put a lot
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> of effort in, and the basic ARM build/test jobs are now OK, so we
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> suggest adding the ARM jobs to the community. We can set them to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> non-voting first and improve/enrich the jobs step by step.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Generally, there are two ways we have in mind to integrate the ARM
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> CI for Spark:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>      1) We introduce the OpenLab ARM CI into Spark as a custom CI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system. We provide the human resources and the test ARM VMs, focus
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on the ARM-related issues in Spark, and push the PRs into the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> community.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>      2) We donate ARM VM resources to the existing AMPLab Jenkins.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> We still provide the human resources, focus on the ARM-related
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> issues in Spark, and push the PRs into the community.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> With either option we will provide human resources for maintenance,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> and of course it would be great if we can work together. So please
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> tell us which option you would prefer, and let's move forward.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Waiting for your reply, thank you very much.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>> Shane Knapp
>>>>>>>>>>>>>>>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical
>>>>>>>>>>>>>>>>>>>>>>>> Lead
>>>>>>>>>>>>>>>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>> Shane Knapp
>>>>>>>>>>>>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical
>>>>>>>>>>>>>>>>>>>>> Lead
>>>>>>>>>>>>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>> Shane Knapp
>>>>>>>>>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>>>>>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>> Shane Knapp
>>>>>>>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>>>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> --
>>>>>>>>>>>>>> Shane Knapp
>>>>>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> --
>>>>>>>>>>>>> Shane Knapp
>>>>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Shane Knapp
>>>>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>>>>
>>>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Shane Knapp
>>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>>> https://rise.cs.berkeley.edu
>>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Shane Knapp
>>>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>>>> https://rise.cs.berkeley.edu
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Shane Knapp
>>>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>>>> https://rise.cs.berkeley.edu
>>>>
>>>
