Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-29 Thread Otavio Salvador
On Mon, Aug 27, 2018 at 7:08 PM Yang Wang  wrote:
>
> On 08/26/2018 10:04 PM, Otavio Salvador wrote:
> > On Fri, Aug 24, 2018 at 11:23 AM Wang, Yang (Young)
> >  wrote:
> >> I see some good feedback from people who mentioned they are working on
> >> similar solutions that should come out soon. We will continue to work on
> >> this ticket.
> > Did you consider using labgrid? It has a lot in common with the current
> > infra we use in OEQA (ptest and the like) and has nice capabilities to be
> > used locally and remotely.
> >
> I had a quick look at labgrid; it looks like it uses a similar method to
> control devices via configuration files: the target controller is common,
> and with different types of YAML config files it can control different
> devices. (Please correct me if I'm wrong.)
>
> I think the key point of this ticket is to run Ptest on hardware devices
> automatically and make it as common as possible for Yocto. So, whether we
> use LAVA or labgrid, the devices being controlled need to be publicly
> accessible and well maintained.
>
> Do you have any examples similar to this ticket that you can share with us?

I don't, but I think it would be worth checking with Pengutronix if
they could provide some.
-- 
Otavio Salvador, O.S. Systems
http://www.ossystems.com.br  http://code.ossystems.com.br
Mobile: +55 (53) 9 9981-7854  Mobile: +1 (347) 903-9750


Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-27 Thread Yang Wang

On 08/26/2018 10:04 PM, Otavio Salvador wrote:

On Fri, Aug 24, 2018 at 11:23 AM Wang, Yang (Young)
 wrote:

I see some good feedback from people who mentioned they are working on similar
solutions that should come out soon. We will continue to work on this ticket.

Did you consider using labgrid? It has a lot in common with the current
infra we use in OEQA (ptest and the like) and has nice capabilities to be
used locally and remotely.

I had a quick look at labgrid; it looks like it uses a similar method to
control devices via configuration files: the target controller is common,
and with different types of YAML config files it can control different
devices. (Please correct me if I'm wrong.)
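
For reference, here is a rough Python sketch of how I understand the labgrid
side; the environment file name and its contents are hypothetical, and the
calls follow labgrid's documented Environment usage as far as I can tell:

# Hedged sketch of labgrid's common-controller idea: one Environment API,
# with a per-device YAML config selecting what is controlled. 'board.yaml'
# is a hypothetical environment file listing the device's resources/drivers.
from labgrid import Environment

env = Environment("board.yaml")    # swap the YAML to control a different device
target = env.get_target("main")    # the common target abstraction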


I think the key point of this ticket is to run Ptest on hardware devices
automatically and make it as common as possible for Yocto. So, whether we
use LAVA or labgrid, the devices being controlled need to be publicly
accessible and well maintained.


Do you have any examples similar to this ticket that you can share with us?

Thanks,
Yang Wang


Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-26 Thread Otavio Salvador
On Fri, Aug 24, 2018 at 11:23 AM Wang, Yang (Young)
 wrote:
> I see some good feedback from people who mentioned they are working on similar
> solutions that should come out soon. We will continue to work on this ticket.

Did you consider using labgrid? It has a lot in common with the current
infra we use in OEQA (ptest and the like) and has nice capabilities to be
used locally and remotely.

-- 
Otavio Salvador, O.S. Systems
http://www.ossystems.com.br  http://code.ossystems.com.br
Mobile: +55 (53) 9 9981-7854  Mobile: +1 (347) 903-9750


Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-24 Thread Wang, Yang (Young)
Hi Jair,

Thanks for letting us know the background of the ticket.

I see some good feedback from people who mentioned they are working on similar
solutions that should come out soon. We will continue to work on this ticket.

Regards,
-Yang (Young)

-Original Message-
From: Gonzalez Plascencia, Jair De Jesus 
[mailto:jair.de.jesus.gonzalez.plascen...@intel.com] 
Sent: August-23-18 11:42
To: Wang, Yang (Young); Nicolas Dechesne; MacLeod, Randy
Cc: Patches and discussions about the oe-core layer; richard.pur...@intel.com; 
Anibal Limon
Subject: RE: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution 
of pTest with LAVA

Hi Yang,

I have not been working as part of the Yocto Project QA team since last year,
so I haven't been following the current status of its CI and infrastructure.
However, I would like to clarify that the main intention of the original ticket
was to track the task of automating the execution of pTest, using a public tool
that could handle the hardware configuration steps and allow integrating them
into the public Yocto Project CI loop, while also allowing YP users to
replicate the infrastructure for their internal platforms if they required it.
At the time I created the ticket, the YP team's consensus was to use LAVA as
the tool for this task due to its features, public availability, and usage
across the industry, but the implementation and integration details with OEQA
and the Autobuilder were still being defined. Also, as far as I know, at least
initially we would maintain our own LAVA infrastructure and device types
separately from Linaro's board farm.

However, after stating this, I'd suggest the current team decide the best
path forward, either using the LAVA system or a different framework/system
altogether that fulfills the previously stated objectives.

-- Jair

> -Original Message-
> From: yang.w...@windriver.com [mailto:yang.w...@windriver.com]
> Sent: Wednesday, August 22, 2018 10:44 AM
> To: Nicolas Dechesne; randy.macl...@windriver.com
> Cc: Patches and discussions about the oe-core layer
> <openembedded-c...@lists.openembedded.org>; richard.pur...@intel.com;
> Gonzalez Plascencia, Jair De Jesus; Anibal Limon
> Subject: Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution
> of pTest with LAVA
> 
> On 18-08-22 02:51 AM, Nicolas Dechesne wrote:
> 
> > hi,
> >
> > On Wed, Aug 22, 2018 at 4:25 AM Randy MacLeod
> >  wrote:
> >> On 08/21/2018 11:04 AM, Wang, Yang (Young) wrote:
> >>> Hi All,
> >>>
> >>> I'm working on this ticket:
> >>> https://bugzilla.yoctoproject.org/show_bug.cgi?id=12372
> >> Thanks for investigating the bug/enhancement and posting your thoughts.
> >> I'm jumping in without much expertise to try to get the ball rolling.
> >>
> >>> As far as I know, the following are all true nowadays:
> >>> - Ptest needs to be run on real hardware and it takes a few hours to
> >>> finish
> >>> - Ptest can be run within OEQA; it can also be run independently
> >>> - LAVA is a good open source test framework which:
> >>>   - can manage both real hardware and different kinds of simulators as
> >>> the test devices
> >>>   - provides a well-managed logging system and test reports
> >>>
> >>> How to automatically run Ptest? I think running it with LAVA is a good
> >>> solution, but ...
> >>>
> >>> LAVA runs as a server which manages the test jobs submitted to it;
> >>> here is a typical LAVA job:
> >>> https://staging.validation.linaro.org/scheduler/job/231942/definition
> >>> As you can see, it defines the device type, the test images which will
> >>> be used, the test cases, and a lot of other settings.
> >> That's a good clear format.
> >>
> >> I believe that what people are thinking is that we'd have:
> >>
> >> device_type: x86
> >>
> >> job_name: x86_64 oeqa
> >> ...
> >>
> >> actions:
> >> - deploy:
> >>   ...
> >>
> >> - boot:
> >> ...
> >>
> >> - test:
> >>     timeout:
> >>       minutes: 2
> >>     definitions:
> >>       << some thing that makes the target and lava server wait for
> >>          oeqa to run >>
> >>       name: oeqa-test
> >>
> >>> So the typical automatic way to run a test through LAVA is to write a
> >>> script which uses a LAVA job template, replaces the images with the
> >>> expected ones, and then submits the job to LAVA through a command, for
> >>> example:
> >>> $ lava-tool submit-job http://@ x86_64_ job_oeqa-ptest.yaml
> > This is more or less something that we are doing as part of our CI
> > loop. The process is the following:
> >
> > 1. fetch layer updates
> > 2. make a new build for one or more $MACHINE
> > 3. use LAVA job template to generate an actual LAVA job
> > 4. run this LAVA job on the Linaro LAVA Board farm
> >
> > There is no integration into oe-core / bitbake, it is run outside of
> > the OE builds.
> This is clear; the test automation through LAVA needs to be done outside of
> OE.
> > You can check our ptest LAVA job from our most recent build:
> > 

Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-22 Thread Yang Wang
On 18-08-22 02:51 AM, Nicolas Dechesne wrote:

> hi,
>
> On Wed, Aug 22, 2018 at 4:25 AM Randy MacLeod
>  wrote:
>> On 08/21/2018 11:04 AM, Wang, Yang (Young) wrote:
>>> Hi All,
>>>
>>> I'm working on this ticket:
>>> https://bugzilla.yoctoproject.org/show_bug.cgi?id=12372
>> Thanks for investigating the bug/enhancement and posting your thoughts.
>> I'm jumping in without much expertise to try to get the ball rolling.
>>
>>> As far as I know, the following are all true nowadays:
>>> - Ptest needs to be run on real hardware and it takes a few hours to finish
>>> - Ptest can be run within OEQA; it can also be run independently
>>> - LAVA is a good open source test framework which:
>>>   - can manage both real hardware and different kinds of simulators as
>>> the test devices
>>>   - provides a well-managed logging system and test reports
>>>
>>> How to automatically run Ptest? I think running it with LAVA is a good 
>>> solution, but ...
>>>
>>> LAVA runs as a server which manages the test jobs submitted to it;
>>> here is a typical LAVA job:
>>> https://staging.validation.linaro.org/scheduler/job/231942/definition
>>> As you can see, it defines the device type, the test images which will be
>>> used, the test cases, and a lot of other settings.
>> That's a good clear format.
>>
>> I believe that what people are thinking is that we'd have:
>>
>> device_type: x86
>>
>> job_name: x86_64 oeqa
>> ...
>>
>> actions:
>> - deploy:
>>   ...
>>
>> - boot:
>> ...
>>
>> - test:
>>     timeout:
>>       minutes: 2
>>     definitions:
>>       << some thing that makes the target and lava server wait for
>>          oeqa to run >>
>>       name: oeqa-test
>>
>>> So the typical automatic way to run a test through LAVA is to write a
>>> script which uses a LAVA job template, replaces the images with the
>>> expected ones, and then submits the job to LAVA through a command, for
>>> example:
>>> $ lava-tool submit-job http://@ x86_64_ job_oeqa-ptest.yaml
> This is more or less something that we are doing as part of our CI
> loop. The process is the following:
>
> 1. fetch layer updates
> 2. make a new build for one or more $MACHINE
> 3. use LAVA job template to generate an actual LAVA job
> 4. run this LAVA job on the Linaro LAVA Board farm
>
> There is no integration into oe-core / bitbake, it is run outside of
> the OE builds.
This is clear; the test automation through LAVA needs to be done outside
of OE.
> You can check our ptest LAVA job from our most recent build:
> https://validation.linaro.org/scheduler/job/1890442
>
> The generated LAVA job is:
> https://validation.linaro.org/scheduler/job/1890442/definition
>
> The job deals with all the flashing/management of the device to test
> (a dragonboard 820c in this specific example), so there is a bit of
> boilerplate, but the base template for running ptest can be found
> here:
>
> https://git.linaro.org/ci/job/configs.git/tree/lt-qcom/lava-job-definitions/boards/template-ptest.yaml
>
> which itself points to the LAVA job definition for ptest:
>
> https://git.linaro.org/qa/test-definitions.git/tree/automated/linux/ptest
>
> This is where LAVA communicates and manages how to run ptests and get
> status from each test.
>
> And finally... you can view the test results for this ptest run in LAVA:
>
> https://validation.linaro.org/results/1890442/0_linux-ptest
My private test automation framework does a similar job on a daily
basis.

So there are some preconditions for doing test automation through LAVA
publicly:
1. Select a public LAVA server with an accessible account
2. Know exactly which type of hardware we would like to use; that
has to be set in the LAVA job
    - It doesn't need to be a specific board, since LAVA supports
device-types, for example:
https://validation.linaro.org/scheduler/device_types
3. Devices under this specific device-type need to be maintained by
someone; if they don't work, our tests will fail all the time

So, going back to the original ticket, we can definitely create some
scripts to run Ptest automatically through LAVA, but the specific LAVA
server and device-type settings have to be there as well. I'm not sure
whether this is what the ticket reporter would like to get.
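
For example, a minimal submission script could look like the sketch below;
the server, credentials, and job file name are hypothetical, and the method
names assume the LAVA scheduler XML-RPC API that lava-tool wraps:

# Hedged sketch: submit a job pinned to a device-type and poll its status
# over LAVA's XML-RPC API. SERVER/USER/TOKEN and the job file name are
# hypothetical; scheduler.submit_job/job_status are assumed to be available
# on the server in use.
import time
import xmlrpc.client

SERVER = "validation.example.org"      # hypothetical LAVA instance
USER, TOKEN = "young", "secret-token"  # hypothetical credentials

api = xmlrpc.client.ServerProxy(
    "https://%s:%s@%s/RPC2" % (USER, TOKEN, SERVER))

# The job YAML must name a device-type maintained on that server,
# e.g. "device_type: dragonboard-820c".
with open("job_oeqa-ptest.yaml") as f:
    job_id = api.scheduler.submit_job(f.read())

status = api.scheduler.job_status(job_id)["job_status"]
while status in ("Submitted", "Running"):
    time.sleep(60)
    status = api.scheduler.job_status(job_id)["job_status"]
print("Job %s finished with status: %s" % (job_id, status))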

Thanks,
-Yang Wang
>> That would still work given the above oeqa job.
>>
>> No doubt there's additional glue code that would
>> be nice to write that would allow automatically creating
>> the lava yaml that boots the system into a state where oeqa
>> code takes over.
> I think most of what needs to be created is there in all the links I
> shared above. This is what we came up with, and it is not integrated
> with oeqa. But this can be used as a baseline at least.
>
>> I've never used it and only just found the code but
>> I bet that adding another controller to:
>>
>> git://git.yoctoproject.org/meta-yocto
>>
>> $ ls  meta-yocto-bsp/lib/oeqa/controllers/
>> beaglebonetarget.py  edgeroutertarget.py  grubtarget.py  __init__.py
>>
>> is what would make sense.
>>
>>> This command will return a job id 

Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-22 Thread Nicolas Dechesne
hi,

On Wed, Aug 22, 2018 at 4:25 AM Randy MacLeod
 wrote:
>
> On 08/21/2018 11:04 AM, Wang, Yang (Young) wrote:
> > Hi All,
> >
> > I'm working on this ticket:
> > https://bugzilla.yoctoproject.org/show_bug.cgi?id=12372
>
> Thanks for investigating the bug/enhancement and posting your thoughts.
> I'm jumping in without much expertise to try to get the ball rolling.
>
> >
> > As far as I know, the following are all true nowadays:
> > - Ptest needs to be run on real hardware and it takes a few hours to finish
> > - Ptest can be run within OEQA; it can also be run independently
> > - LAVA is a good open source test framework which:
> >   - can manage both real hardware and different kinds of simulators as
> > the test devices
> >   - provides a well-managed logging system and test reports
> >
> > How to automatically run Ptest? I think running it with LAVA is a good 
> > solution, but ...
> >
> > LAVA runs as a server which manages the test jobs submitted to it;
> > here is a typical LAVA job:
> > https://staging.validation.linaro.org/scheduler/job/231942/definition
> > As you can see, it defines the device type, the test images which will be
> > used, the test cases, and a lot of other settings.
>
> That's a good clear format.
>
> I believe that what people are thinking is that we'd have:
>
> device_type: x86
>
> job_name: x86_64 oeqa
> ...
>
> actions:
> - deploy:
>   ...
>
> - boot:
> ...
>
> - test:
>     timeout:
>       minutes: 2
>     definitions:
>       << some thing that makes the target and lava server wait for
>          oeqa to run >>
>       name: oeqa-test
>
> >
> > So the typical automatic way to run a test through LAVA is to write a
> > script which uses a LAVA job template, replaces the images with the
> > expected ones, and then submits the job to LAVA through a command, for
> > example:
> > $ lava-tool submit-job http://@ x86_64_ job_oeqa-ptest.yaml

This is more or less something that we are doing as part of our CI
loop. The process is the following:

1. fetch layer updates
2. make a new build for one or more $MACHINE
3. use LAVA job template to generate an actual LAVA job
4. run this LAVA job on the Linaro LAVA Board farm

There is no integration into oe-core / bitbake, it is run outside of
the OE builds.
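
As an illustration of step 3, the template filling can be as simple as the
sketch below; the placeholder names and build URLs are made up for the
example, not our actual template keys:

# Hedged sketch of step 3: turn a LAVA job template into a concrete job by
# substituting image URLs from the new build. The placeholder names
# (kernel_url, rootfs_url) and paths are illustrative only.
from string import Template

with open("template-ptest.yaml") as f:
    template = Template(f.read())

job = template.substitute(
    kernel_url="https://builds.example.org/123/Image",        # hypothetical
    rootfs_url="https://builds.example.org/123/rootfs.ext4",  # hypothetical
)

with open("job-ptest.yaml", "w") as f:
    f.write(job)   # ready to submit via lava-tool or XML-RPC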

You can check our ptest LAVA job from our most recent build:
https://validation.linaro.org/scheduler/job/1890442

The generated LAVA job is:
https://validation.linaro.org/scheduler/job/1890442/definition

The job deals with all the flashing/management of the device to test
(a dragonboard 820c in this specific example), so there is a bit of
boilerplate, but the base template for running ptest can be found
here:

https://git.linaro.org/ci/job/configs.git/tree/lt-qcom/lava-job-definitions/boards/template-ptest.yaml

which itself points to the LAVA job definition for ptest:

https://git.linaro.org/qa/test-definitions.git/tree/automated/linux/ptest

This is where LAVA communicates and manages how to run ptests and get
status from each test.
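
For a rough idea of what happens on the target, here is a hedged sketch (not
our actual script) of translating ptest-runner output into per-test LAVA
results with the lava-test-case helper; the parsing regex is illustrative:

# Hedged sketch: report each ptest result to LAVA. The ptest output stream
# contains "PASS:/FAIL:/SKIP: <name>" lines; 'lava-test-case' is LAVA's
# on-target helper. The real test-definitions parsing differs; this regex
# is illustrative only.
import re
import subprocess

out = subprocess.run(["ptest-runner"], capture_output=True, text=True).stdout
for line in out.splitlines():
    m = re.match(r"(PASS|FAIL|SKIP): (\S+)", line)
    if m:
        status = {"PASS": "pass", "FAIL": "fail", "SKIP": "skip"}[m.group(1)]
        subprocess.run(["lava-test-case", m.group(2), "--result", status])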

And finally... you can view the test results for this ptest run in LAVA:

https://validation.linaro.org/results/1890442/0_linux-ptest

>
> That would still work given the above oeqa job.
>
> No doubt there's additional glue code that would
> be nice to write that would allow automatically creating
> the lava yaml that boots the system into a state where oeqa
> code takes over.

I think most of what needs to be created is there in all the links I
shared above. This is what we came up with, and it is not integrated
with oeqa. But this can be used as a baseline at least.

>
> I've never used it and only just found the code but
> I bet that adding another controller to:
>
> git://git.yoctoproject.org/meta-yocto
>
> $ ls  meta-yocto-bsp/lib/oeqa/controllers/
> beaglebonetarget.py  edgeroutertarget.py  grubtarget.py  __init__.py
>
> is what would make sense.
>
> > This command will return a job ID (take #231942 as an example), and then
> > the script can get all logs and reports based on the LAVA server address
> > and this job ID, for example:
> > - execution log: https://staging.validation.linaro.org/scheduler/job/231942
> > - test report: 
> > https://staging.validation.linaro.org/results/231942/0_smoke-tests
>
> I suspect that this is where the design intent diverges.
>
> Usually lava runs the whole system, and I think we just
> want it to manage the hardware and then step out of the way.
> There'd likely be an API to allow oeqa and lava to communicate
> so that, for example, oeqa could tell lava that the tests were done.

Yes, LAVA runs the whole system, including management of the devices
under test, rebooting, and flashing. It also has a LAVA test definition
format that must be used. So to benefit from LAVA, a LAVA instance must
be set up, and then we need to have labs where boards are attached. A
LAVA instance can have several labs, and labs can be physically spread
out. LAVA must know how to deal with each hardware/machine (e.g. how to
power it on, how to get a serial console). The
Re: [OE-core] [RFC] Yocto Project Bug 12372 - Automate the execution of pTest with LAVA

2018-08-21 Thread Randy MacLeod

On 08/21/2018 11:04 AM, Wang, Yang (Young) wrote:

Hi All,

I'm working on this ticket:
https://bugzilla.yoctoproject.org/show_bug.cgi?id=12372


Thanks for investigating the bug/enhancement and posting your thoughts.
I'm jumping in without much expertise to try to get the ball rolling.



As far as I know, the following are all true nowadays:
- Ptest needs to be run on real hardware and it takes a few hours to finish
- Ptest can be run within OEQA; it can also be run independently
- LAVA is a good open source test framework which:
  - can manage both real hardware and different kinds of simulators as the
test devices
  - provides a well-managed logging system and test reports
  
How to automatically run Ptest? I think running it with LAVA is a good solution, but ...

LAVA runs as a server which manages the test jobs submitted to it; here is a typical LAVA job:

https://staging.validation.linaro.org/scheduler/job/231942/definition
As you can see, it defines the device type, the test images which will be used,
the test cases, and a lot of other settings.


That's a good clear format.

I believe that what people are thinking is that we'd have:

device_type: x86

job_name: x86_64 oeqa
...

actions:
- deploy:
    ...

- boot:
    ...

- test:
    timeout:
      minutes: 2
    definitions:
      << some thing that makes the target and lava server wait for
         oeqa to run >>
      name: oeqa-test

  
So the typical automatic way to run a test through LAVA is to write a script
which uses a LAVA job template, replaces the images with the expected ones,
and then submits the job to LAVA through a command, for example:

$ lava-tool submit-job http://@ x86_64_ job_oeqa-ptest.yaml


That would still work given the above oeqa job.

No doubt there's additional glue code that would
be nice to write that would allow automatically creating
the lava yaml that boots the system into a state where oeqa
code takes over.

I've never used it and only just found the code but
I bet that adding another controller to:

git://git.yoctoproject.org/meta-yocto

$ ls  meta-yocto-bsp/lib/oeqa/controllers/
beaglebonetarget.py  edgeroutertarget.py  grubtarget.py  __init__.py

is what would make sense.
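
To make that concrete, here is a hedged sketch of what such a controller
might look like, assuming the MasterImageHardwareTarget interface that the
existing controllers build on; every name below is hypothetical, and nothing
like this exists yet:

# Hedged sketch of a hypothetical LAVA-backed OEQA target controller,
# modeled on beaglebonetarget.py and friends, which subclass
# MasterImageHardwareTarget and fill in _deploy()/_start(). All names
# (LavaTarget, LAVA_SERVER, LAVA_DEVICE_TYPE) are illustrative only.
from oeqa.controllers.masterimage import MasterImageHardwareTarget


class LavaTarget(MasterImageHardwareTarget):

    def __init__(self, d):
        super(LavaTarget, self).__init__(d)
        # Hypothetical variables the user would set in local.conf
        self.server = d.getVar('LAVA_SERVER')
        self.device_type = d.getVar('LAVA_DEVICE_TYPE')

    def _deploy(self):
        # Would submit a deploy/boot job to the LAVA server and wait
        # until a device of self.device_type holds the test image.
        pass

    def _start(self, params=None):
        # Would hand control back to oeqa once the board is up, so the
        # normal ssh-based test run can proceed.
        pass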


This command will return a job ID (take #231942 as an example), and then the
script can get all logs and reports based on the LAVA server address and this
job ID, for example:
- execution log: https://staging.validation.linaro.org/scheduler/job/231942
- test report: 
https://staging.validation.linaro.org/results/231942/0_smoke-tests
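
A hedged sketch of that retrieval step, building only on the URL patterns
above; the /csv export suffix is an assumption about the server's export
support, and the job ID is the example one:

# Hedged sketch: fetch artifacts for a finished job. JOB_ID is the example
# job above; the CSV export URL is assumed (if unsupported, the XML-RPC API
# or the web UI export would be used instead).
import urllib.request

SERVER = "https://staging.validation.linaro.org"
JOB_ID = 231942

report_url = "%s/results/%d/0_smoke-tests/csv" % (SERVER, JOB_ID)  # assumed
with urllib.request.urlopen(report_url) as resp:
    print(resp.read().decode())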


I suspect that this is where the design intent diverges.

Usually lava runs the whole system, and I think we just
want it to manage the hardware and then step out of the way.
There'd likely be an API to allow oeqa and lava to communicate
so that, for example, oeqa could tell lava that the tests were done.

All lava would know is that an oeqa test ran and its completion
status.


So, as far as I can tell, it may not be appropriate to integrate the LAVA
tests into a bitbake command the way we run our simple test harness; LAVA is
an advanced test framework and it manages all jobs submitted to it well.

Please comment if you have a better idea about this ticket.


I'm really going on a few conversations that I've had and chats
on IRC, so hopefully someone else can step up and comment on both Young's
initial email and my interpretation of where we're trying to
get to.

Thanks,

--
# Randy MacLeod
# Wind River Linux