On 14. 9. 2016 at 18:46, Marcos E. Matsunaga wrote:
> Hi Lucas,
>
> Thanks for your answers. I have some comments below. And forgive me if I am repeating something.
>
> On 14/09/16 16:59, Lucas Meneghel Rodrigues wrote:
>
>> On Wed, Sep 14, 2016 at 8:32 AM Marcos E. Matsunaga <marcos.matsun...@oracle.com <mailto:marcos.matsun...@oracle.com>> wrote:
>>
>>> Hi Folks,
>>>
>>> I have some questions about how avocado works.
>>>
>>> 1. If I run avocado and give it a directory that has all the tests, is there a way to specify the order of execution? I mean, if I name the files 001-xxx.py and 010-aa.py, will it execute 001-xxx.py before 010-aa.py, or does it not follow alphabetical order?
>>
>> There is - you can specify their order of execution on the command line:
>>
>> avocado run failtest.py raise.py doublefree.py
>> JOB ID     : 6047dedc2996815659a75841f00518fa0f83b1ee
>> JOB LOG    : /home/lmr/avocado/job-results/job-2016-09-14T12.53-6047ded/job.log
>> TESTS      : 3
>>  (1/3) failtest.py:FailTest.test: FAIL (0.00 s)
>>  (2/3) raise.py:Raise.test: PASS (0.11 s)
>>  (3/3) doublefree.py:DoubleFreeTest.test: PASS (1.02 s)
>> RESULTS    : PASS 2 | ERROR 0 | FAIL 1 | SKIP 0 | WARN 0 | INTERRUPT 0
>> TESTS TIME : 1.13 s
>> JOB HTML   : /home/lmr/avocado/job-results/job-2016-09-14T12.53-6047ded/html/results.html
>
> Yeah... That would work if you want to run just a set of the tests, but think about a specific test that has hundreds of individual tests. If you could just give it the directory and it sorted the files alphabetically before executing, that would be great. And with multiplex auto-discovery, that would be even better.

As I mentioned in my response, it's currently not ordered, but we can discuss making it alphabetical. It's just a matter of adding one `sorted` in the proper place (since the discovery is iterative it would be a multiple-line fix, but still a relatively simple one). So we don't forget, let's track it here:

https://trello.com/c/whLgcvtO/827-consider-sorting-the-fileloader-s-test-discovery
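
To give an idea of the size of the change: it is basically about walking the directories in a deterministic order. A simplified sketch of what I mean (made-up names, not the actual FileLoader code):

    import os


    def discover_tests(root):
        """Yield test files under `root` in alphabetical order.

        Illustrative only; avocado's real loader does a lot more
        (test-type detection, filtering, etc.).
        """
        for dirpath, dirnames, filenames in os.walk(root):
            dirnames.sort()                  # visit sub-directories alphabetically
            for name in sorted(filenames):   # ...and files within each directory
                if name.endswith('.py'):
                    yield os.path.join(dirpath, name)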
>>> 2. Let's take into consideration that same directory. Some of the scripts will have multiplex configuration files. Does avocado automatically look in some specific directory for those multiplex configuration files? I've tried to add them to the data, cfg and even the <script>.data directories, but it seems that it doesn't look for them automatically, only when I specify the option --multiplex; but then the file will be used by all scripts, and I was only able to specify a single multiplex file.
>>
>> The original design assumption was that you'd execute only one test that has a multiplex file, and provide the multiplex file with it, so indeed what you want to do can't be done right now. I suppose multiple tests with multiplex files and multiplex-file auto-detection would be a nice feature to add moving forward, though.
>
> With a small standard definition, I think multiplex auto-discovery could be simple to implement. Let's say you have a <some dir>/test.py; when you execute it, check under <some dir>/test.py.data for a test.yaml. If it is there, just open it and use it. And, of course, you could add a default directory for multiplex files in avocado.conf and follow some hierarchy.
>
> Speaking of multiplex, when I was thinking about starting parallel jobs on different hosts, I came up with something like this in a multiplex file:
>
>     perf:
>         SetUp:
>             packages = 'perf netperf'
>         server:
>             hostname = 'perf1'
>             ipaddress = '10.196.50.101'
>             userid = 'root'
>         client:
>             hostname = 'perf2'
>             ipaddress = '10.196.50.102'
>             userid = 'root'
>
> I tried every combination I could to try to retrieve the two hostnames, but I was never able to do it. Wouldn't it be a good idea to be able to retrieve the information from the multiplex file as "self.params.get('/perf/server/hostname')"? That way, you could list all your remote clients (in this case) and be able to do whatever is necessary to run jobs in parallel.
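
As a side note, the params API does support path-based lookups, so reaching those values should be close to what you describe. A minimal, untested sketch of what it could look like (the exact tree paths depend on where the YAML ends up being injected, so treat the path strings and defaults below as assumptions):

    from avocado import Test


    class PerfConfig(Test):

        """
        Illustrative only: read the server/client entries from the
        multiplexed parameter tree.
        """

        def test(self):
            # Path strings are assumptions; adjust them to your actual tree.
            server = self.params.get('hostname', path='/perf/server/*',
                                     default='localhost')
            client = self.params.get('hostname', path='/perf/client/*',
                                     default='localhost')
            self.log.info("server=%s client=%s", server, client)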
Running jobs in parallel is tricky. It always depends on what you want to do and where, and creating one universal method is nearly impossible (it would be very complex, with lots of options). To demonstrate the complexity, you might want to:

1. run the same test on multiple machines
2. use queues and distribute the tests to the machines in any order
3. create a server and run several clients
...

Let's wait for the Job API and custom runner scripts to allow such behavior.
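
Just to illustrate how much machinery even option 2 alone needs, here is a toy sketch using plain multiprocessing; nothing avocado-specific, and the "run" step is only an echo placeholder:

    import multiprocessing
    import subprocess


    def worker(host, queue):
        """Pull test names off the shared queue and 'run' them on `host`."""
        while True:
            test = queue.get()
            if test is None:      # sentinel: no more work for this worker
                break
            # Placeholder; real remote execution is the hard part.
            subprocess.call(['echo', 'would run', test, 'on', host])


    def distribute(tests, hosts):
        queue = multiprocessing.Queue()
        workers = [multiprocessing.Process(target=worker, args=(host, queue))
                   for host in hosts]
        for proc in workers:
            proc.start()
        for test in tests:
            queue.put(test)
        for _ in workers:
            queue.put(None)       # one sentinel per worker
        for proc in workers:
            proc.join()


    if __name__ == '__main__':
        distribute(['failtest.py', 'raise.py', 'doublefree.py'],
                   ['perf1', 'perf2'])

Even this toy version ignores results collection, failures, machine setup and so on.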
>>> 3. I tried to find out whether it is possible to start multiple test processes in parallel, but it seems that avocado doesn't have anything like that. Let's say I have 4 guests, I want to execute performance tests while loading the 4 guests, and I want to start the tests on all 4 guests at the same time. It doesn't have a feature that will do that, right?
>>
>> Executing tests on multiple remote machines was something that we didn't think about either, and one that would be very interesting to add indeed.
>
> Yes... That would be really interesting.
>
>> What you can do for now is to start 4 parallel separate instances of avocado, one for each machine. Each execution will create its own job directory, though.
>
> I don't know if you are familiar with OpenMPI, but it does exactly that. It manages to start tests simultaneously on many hosts (they don't have to be guests). I think similar functionality would be great. Avocado already copies the job to the remote host and retrieves the results.
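
Regarding the "4 parallel separate instances" workaround, they can at least be driven from a single script. A rough sketch (the remote-runner option names are from memory, so double-check them against `avocado run -h`; `performance_test.py` is just a placeholder):

    import subprocess

    HOSTS = ['10.196.50.101', '10.196.50.102']

    # One independent avocado job per host; each creates its own
    # job-results directory.
    procs = [subprocess.Popen(['avocado', 'run', 'performance_test.py',
                               '--remote-hostname', host,
                               '--remote-username', 'root'])
             for host in HOSTS]

    # Wait for all jobs and collect their exit codes.
    print([proc.wait() for proc in procs])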
The current limitation is that the remote runner runs full jobs and there is currently no way to merge the results of those jobs. The Job API should address this, so let's wait for that. Also note that the Job API is a very complex task and will probably be implemented incrementally; I would not expect it to be ready within a month...
> Thanks for your time and help. Cheers!
>
> --
> Regards,
>
> Marcos Eduardo Matsunaga
>
> Oracle USA
> Linux Engineering
>
> “The statements and opinions expressed here are my own and do not necessarily represent those of Oracle Corporation.”

Regards,
Lukáš