On Tue, Mar 12, 2013 at 12:09:20AM +0530, Edison Su wrote:
> Thanks!!! It's a big step forward to get CI working.

Thanks! It is indeed. Hope we can keep it up and running without downtime.

> I have a few questions regarding the marvin tests though:
> Where do we put the marvin unit test cases? I can't find them on
> the master branch.

All marvin tests can be found under test/integration, which contains two
directories: smoke and component. Smoke is the set of tests we use as a BVT
to guarantee the build is testable; although this suite is heavily outdated,
it is usable. Component holds tests written for specific features, and these
are used for regression. There are some suites waiting on IP clearance which
include fixes and new tests for both smoke and component.
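
For reference, the smoke suite can be run with nose the same way as the
component-directory command shown further below, e.g. against devcloud:

$ nosetests --with-marvin --marvin-config=tools/devcloud/devcloud.cfg -w test/integration/smoke --load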

> I am trying to bring up the marvin tests against the
> simulator and devcloud (I think the smoke tests already did that). We
> may need just one maven command to set up the mgmt server and simulator
> in a developer's environment, and one maven command to run the marvin
> test cases against the simulator.

1) Simulator:

As Alex had proposed I got the simulator working this weekend. The
steps from that email for running the simulator are as follows:

Build:
$ mvn -Pdeveloper clean install
$ mvn -Pdeveloper -pl developer -Ddeploydb
$ mvn -Pdeveloper -pl developer -Ddeploydb-simulator

Start mgmt server:
$ mvn -pl client jetty:run

Deploy an advanced zone and test systemVM health:
$ mvn -Pdeveloper,marvin -Dmarvin.config=`find . -name simulator.cfg` -pl :cloud-marvin test

At this point there is an NPE from the storage refactor code that I would like
help with resolving.

We should then add a property to the test lifecycle of the above mvn profile to
allow including additional test suites from the test/integration/smoke 
directory.

2) Devcloud:
For devcloud we can run a smaller version of the smoke test. This needs to be
included in the devcloud profile in maven, similar to the simulator commands
above. Would you be working on the regular devcloud or Marcus'
devcloud-kvm? devcloud-kvm is missing right now, so we need to set that up.

Both the simulator and the devcloud tests will then have to be set up on
jenkins to run per check-in. For this we need a nexus proxy, as the current
performance of devcloud-ci is poor: it takes 7 minutes just to do a mvn clean
install (after skipping tests). The proxy should also improve our build
speeds by a good margin. Once it is in place, I'll alter the CI image to use
the proxy. After that we have to set this up to run against every feature
branch to prevent broken code from coming into master. Sonatype has some
terse documentation for me to read. If someone knows how to set up the nexus
stuff I'll buy them a (cask of?) beer :)

> The benefits of using the simulator:
> 1. Easy to set up
> 2. Easy to scale; tens of thousands of hypervisor hosts and VMs are not a
> problem at all.
> 3. Easy to inject errors in the simulator (there is an API to inject
> errors in the simulator resource code)

Agree - it's a good tool to test business logic: planners, APIs, db access,
deadlocks, performance, orchestration. Is configureSimulator the API you're
talking about? Should we also have a resetSimulator API? An API that can
perhaps replay the last action?
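
For instance, a marvin test could drive error injection along these lines
(a minimal sketch; the parameter names and the 'result:fail' convention are
assumptions to be checked against the simulator resource code):

from marvin.cloudstackAPI import configureSimulator

def inject_stop_failure(apiclient, zoneid):
    # Mock the agent's StopCommand to fail in the given zone (assumed semantics)
    cmd = configureSimulator.configureSimulatorCmd()
    cmd.zoneid = zoneid
    cmd.name = 'StopCommand'     # command to mock (assumed parameter)
    cmd.value = 'result:fail'    # simulated outcome (assumed convention)
    return apiclient.configureSimulator(cmd)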

> Developers should test against the simulator before testing their features
> and bug fixes on real hardware, if the features and bug fixes are
> not tied to specific hardware.
> The challenges we have (maybe not specific to testing against the simulator):
> 1. How to write a test case regardless of the test environment? The
> same test case should work with different test setups, no matter
> whether it's the simulator, xenserver, vmware, etc.

To test the integrated setup on the simulator and/or other hardware, one
should write tests involving only API steps:

1. No remote ssh access
2. No assumed templates beyond the default
3. No bash/telnet/etc.

I've published these and other guidelines on our wiki some time ago:
https://cwiki.apache.org/confluence/display/CLOUDSTACK/Testing+with+Python#TestingwithPython-Guidelinestochoosescenariosforintegration
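
As an illustration, an environment-agnostic check verifies VM state through
the API rather than over ssh (a minimal sketch; the helper name is mine):

from marvin.cloudstackAPI import listVirtualMachines

def vm_is_running(apiclient, vm_id):
    # Only API steps: query the VM's state via listVirtualMachines
    cmd = listVirtualMachines.listVirtualMachinesCmd()
    cmd.id = vm_id
    vms = apiclient.listVirtualMachines(cmd)
    return bool(vms) and vms[0].state == 'Running'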

Almost all of our integration tests are marked with an attribute saying which
environments they can run in. You can discover those that can run on the
simulator like so:

$ nosetests --with-marvin --marvin-config=tools/devcloud/devcloud.cfg -w test/integration/component --load -a tags='simulator' --collect-only

More info about the tags used on tests is here:
https://cwiki.apache.org/confluence/display/CLOUDSTACK/Testing+with+Python#TestingwithPython-MarvinNosePlugin
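
The tagging itself is just nose's attrib plugin; roughly (the tags here are
only an example):

from nose.plugins.attrib import attr
from marvin.cloudstackTestCase import cloudstackTestCase

class TestTagged(cloudstackTestCase):

    @attr(tags=["simulator", "advanced"])
    def test_runs_on_simulator(self):
        # picked up by: nosetests -a tags='simulator' ...
        pass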

But running the current smoke tests across environments, esp. on the
simulator, will require a bit of code movement (aka refactoring :)) - like
removing backend verification or rewriting the same test for the simulator.

> 2. How to write a test case specific to one test environment and
> let the test framework know the requirement? We have so many test
> environment combinations, e.g. advanced zone, advanced zone with all
> the complicated network setups, basic zone, multiple clusters, multiple
> zones, etc. We may need to standardize all the test environment
> combinations and categorize them, so each test case can be assigned to
> one specific test environment.

Some of this pain is alleviated through the use of tags. I made tags part of
our test plan template, but I'm not sure if that's still there. The intent
was implied; I think we have to be more explicit about these.

In the CI run I currently map the environment against these tags and pick
only the tests that can be run against it.
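
Roughly, that selection amounts to something like this (a sketch only; the
config layout is simplified for the example):

import json
import subprocess

def run_matching_tests(marvin_cfg, suite_dir):
    with open(marvin_cfg) as f:
        cfg = json.load(f)
    # assumption: derive the tag from the first zone's network type
    tag = cfg['zones'][0]['networktype'].lower()   # e.g. 'advanced'
    subprocess.call(['nosetests', '--with-marvin',
                     '--marvin-config=' + marvin_cfg,
                     '-w', suite_dir, '--load',
                     '-a', 'tags=' + tag])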


> 3. How to make writing marvin tests easier for developers and QA? I know
> you are working on it on the marvin refactor branch; do you have a
> target for what you are trying to achieve in the short term or long term?
> Or what's the feedback/complaints from people who actually wrote
> marvin unit tests before?

I gathered the following feedback from the qa team:
1) Still too much typing and coding effort to write down a scenario.
2) The library for integrating a test is confusing and needs improvement.
3) Debugging while writing a test scenario should be simpler, along with
better integration into eclipse.

The new form of a marvin test after the marvin-refactor should look like this:

class MyFeatureSuite(cloudstackTestCase):

    @attr(tags=["simulator", "xenserver"])  # etc.
    def TestMyFeatureScenario(self):
        af = UserAccountFactory()                # create a user account
        vm = VirtualMachineFactory(af)           # deploy a VM in that account
        nw = StaticNatFactory(af.account, vm.id) # enable static NAT on the VM
        result = vm.list(arg1=val1)
        self.assertEqual(result, 'expectedResult', msg='Test Failed')

Factory objects for this and the refactored base library can be found in
integration.lib.factory and integration.lib.base. I'm using factory_boy [1]
for this. I was stuck on an issue last week [2] that is now resolved.
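
To give an idea, a factory might look roughly like this (a sketch; the import
path and field names are assumptions, using factory_boy's FACTORY_FOR binding
from that era):

import factory

from marvin.integration.lib.base import Account   # assumed path

class UserAccountFactory(factory.Factory):
    FACTORY_FOR = Account
    firstname = 'test'
    lastname = 'user'
    username = factory.Sequence(lambda n: 'user_%s' % n)
    email = factory.LazyAttribute(lambda a: '%s@example.com' % a.username)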

In the short term:
I will write up basic scenarios to illustrate the use of the factories and
explain how a factory can be written and its inheritance defined.

In the medium term:
I'll write factories (and related inheritance) for all the cloudstack objects.
I'm expecting a hand with this.

In the long term:
Based on the factory work we can extend the framework to allow DSL test cases.

In the more distant future:
All of this gets connected up with infrastructure so tests can run
continuously across feature branches as and when tests are checked in,
branches are merged, etc.

In the far future:
Merge test-case writing and test execution into just writing either the DSL
or python code for tests. No more lugging around excel sheets! Let's get that
info from python docstrings.
 
[1] http://github.com/rbarrois/factory_boy/
[2] https://github.com/rbarrois/factory_boy/issues/46

-- 
Prasanna.,
