On Mon, Dec 24, 2012 at 08:22:17PM -0500, Alex Huang wrote:
> Rohit & Prasanna,
> 
> I thought about this over the weekend a little bit.  I think testing
> the entire range of commands really is an automated system test.  I
> think adding to the current set of automated tests is fine, but it
> really serves no purpose in testing the changes in this API
> refactoring.
> 
> What you really changed is the processing loop, moving code out to
> adapters.  The right thing to do is actually to write a test case
> that tests that main loop, plus test drivers for the security
> adapters.  Test drivers here are different from unit tests.  Unit
> tests understand how your code works, and you can write those too,
> but a test driver loads the specified code and exercises the
> interface to make sure the contract is upheld in all situations.

I sort of agree. The simulator might be too heavyweight for something
this simple, and the refactoring doesn't really change the APIs at
all. It only applies custom annotations so we can do efficient
lookups while keeping the adapters pluggable.
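If I'm reading the test-driver suggestion right, a minimal sketch
could look like the following. (The SecurityChecker interface, method
names, and the particular contract clause checked are all made up for
illustration; the real security adapter SPI will differ.)

    // Hypothetical adapter contract, a stand-in for the real security
    // adapter SPI; names are invented for this sketch.
    interface SecurityChecker {
        boolean checkAccess(String user, String apiCommand);
    }

    // Contract-style test driver: load each configured adapter by name
    // and verify the interface contract without knowing its internals.
    public class SecurityAdapterContractTest {
        public static void main(String[] args) throws Exception {
            for (String className : args) {
                SecurityChecker checker = (SecurityChecker)
                    Class.forName(className)
                         .getDeclaredConstructor()
                         .newInstance();
                // One contract clause as an example: a null command
                // must be rejected, not allowed and not a crash.
                if (checker.checkAccess("admin", null)) {
                    throw new AssertionError(
                        className + " accepted a null API command");
                }
            }
            System.out.println("All adapters upheld the contract.");
        }
    }

The point being: the driver knows only the interface and the contract,
so every adapter anyone plugs in gets checked the same way.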

Right now with the simulator the flow looks like this:

over-the-wire request -> API server -> mgmt layers -> ...simulator... ->
mgmt layers -> response builder -> client

FWIU, the test driver will basically bypass the entire mgmt and agent
layers. So we have:

request -> API server -> test driver w/ data -> response builder

Am I right in understanding that this will basically involve seeding
our test driver's database with enough valid JSON request/response
pairs? That could save us a ton of time over the simulator+Marvin
tests.
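If so, each seed entry could just pair a canned request with the
response we expect the API server to build, something like this
(made-up ids and a trimmed field set, purely to illustrate the shape):

    {
      "request": {
        "command": "listZones",
        "response": "json"
      },
      "expectedResponse": {
        "listzonesresponse": {
          "count": 1,
          "zone": [ { "id": "fake-zone-id", "name": "zone1" } ]
        }
      }
    }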

-- 
Prasanna.,
