I was planning on waiting a little bit longer, just a few things
to clean-up, but I have a test tool that I was hoping to make
available on Apache.  The concept of it is actually really simple
and I've used an old C version of it on quite a few different projects
so I think it's versatile enough that it might be of some use.
The basic idea behind its testing scheme is that in order to
test something you want to run a command, check its return code,
and then diff some output.  Obviously there are lots of times
when more is needed, but that's the basic premise.  Building on top
of this you can do things like:
  - specify a series of commands and diffs to run within a testcase
  - specify the return code (or the comparison operator and value)
  - specify that a command was supposed to trap
  - specify that output files should be sent through a masking
    program before diffing (to mask things like timestamps)
  - group testcases in any manner to help manage them
  - specify the input (stdin) for the program
  - specify a list of commands to run before and after the testcase
    to do set-up or clean-up
  - specify a list of commands to run on success or failure of the
    testcase
  - set/unset environment variables locally to a testcase
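The core scheme above (run a command, check its return code, diff
the output) can be sketched in a few lines.  This is just an
illustration of the idea, not the tool's actual interface; the
function name and parameters here are invented:

```python
import subprocess

def run_testcase(cmd, expected_rc=0, expected_output=None):
    """Run a command, check its return code, and optionally compare
    its stdout against an expected string.  Returns (passed, message)."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != expected_rc:
        return False, f"return code {result.returncode}, expected {expected_rc}"
    if expected_output is not None and result.stdout != expected_output:
        return False, "output differs from expected"
    return True, "ok"
```

A real tool would add the masking, set-up/clean-up, and grouping
features listed above on top of this basic loop.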

Right now the version I was hoping to put into Apache is written
in Java and uses an XML file to specify how to run the testcases.
I'm hoping to have a C version soon.  Also, the program will be
extendable so that if you want to test something that isn't a
command-line program but rather something that requires specialized
logic (i.e. doing an HTTP GET or POST) you'll be able to define new
types of tasks.  Right now it supports two different types of
tasks: command-line programs and HTTP requests (GET/POST).
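Since testcases are specified in an XML file, a single testcase might
look roughly like the sketch below.  The element and attribute names
here are purely illustrative; the tool's actual schema isn't shown in
this note:

```xml
<testcase name="copy-roundtrip">
  <!-- set-up commands run before the test -->
  <setup><cmd>rm -f out.txt</cmd></setup>
  <!-- run the program under test and check its return code -->
  <task type="cmdline" rc="0">myprog --input in.txt --output out.txt</task>
  <!-- mask timestamps, then diff against the expected output -->
  <diff mask="strip-timestamps" expected="expected.txt">out.txt</diff>
  <!-- clean-up commands run afterwards -->
  <cleanup><cmd>rm -f out.txt</cmd></cleanup>
</testcase>
```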

What I think is nice about this is that it's written in Java,
so it's portable (I hope 8-)  and it doesn't require you to write
code (or hooks into your project) to test your product.

Anyone think this might be of value?

-Dug



Please respond to [EMAIL PROTECTED]
Subject: Test Infrastructure Project Proposal




Ross Burton <[EMAIL PROTECTED]> wrote:
> Having spent all day up to my armpits in bits of cocoon, xerces and
> xalan, I've come to the conclusion that tracking down bugs in bits of
> cocoon can be nearly impossible at the moment. How would you guys feel
> about us starting to use JUnit to run unit tests over cocoon?

First, sorry for the large cross-posting.  But I wanted to make sure this
note is received by a large audience.  Replies should be sent only to
[EMAIL PROTECTED] (I'm not subscribed to [EMAIL PROTECTED]).

Xalan being a project that 1) requires a *lot* of testing, and 2) is
sandwiched between Cocoon and Xerces, 3) is dependent on other
technologies like BSF, and 4) is used in several other pipeline scenarios,
we (i.e. the folks at Lotus who are involved in the Xalan project) have
been doing a lot of thinking about this subject.

What is happening in the XML/Web world is the integration of a lot of
smaller components, plugged together via (hopefully) standard interfaces.
This increases the need for unit testing, and integration testing in a big
way.  In systems like the ones we are building, robustness is everything, and
fragility is becoming an increasing problem.  I feel this is probably the
most critical issue xml.apache.org and the Jakarta projects are facing,
even above performance issues.  The days have ended when Xerces can release
without testing with Xalan, and Xalan can release without testing with
Cocoon, etc.

Also, I feel what we are all practising is pretty close to "Extreme
Programming" (http://www.extremeprogramming.org/), by our very motto of
"release early and often".  Extreme programming is very reliant on having a
*lot* of tests that are constantly run (see
http://www.extremeprogramming.org/rules/unittests.html and
http://www.extremeprogramming.org/rules/functionaltests.html).  This means
that tests must be very fast to create, easy to plug in, and easy to make
part of the permanent acceptance tests; running the tests must be
extremely convenient, and diagnosing problems from the reports must also be
easy.  And when a bug is found by a user, a test should almost always be
added to the acceptance tests
(http://www.extremeprogramming.org/rules/bugs.html).

We have analyzed JUnit and feel it doesn't address our needs, nor the
integration testing needs, though we like its Ant base.

I propose a project for testing infrastructure that covers the needs of
unit testing, stress testing, performance testing, negative testing (i.e.
testing of error conditions), integration testing, and error logging and
reporting.  I don't know or care if this is a Jakarta project or an
xml.apache.org project.  I believe the Jakarta project has been thinking
somewhat along these lines?

The Xalan project already has a fair amount of infrastructure that we would
be happy to contribute.  But basically, I think we should start first with
requirements, then a schema for reporting (i.e. design the data first), go
next to interfaces, and then decide what existing code can be used.
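As a strawman for the "design the data first" step, the per-test
record in a reporting schema might carry fields along these lines.
All the names below are invented here, just to seed discussion, and
are not an existing schema:

```xml
<test-result suite="integration" name="xalan-under-cocoon" status="fail">
  <elapsed unit="ms">482</elapsed>
  <message>output differs from gold file at line 12</message>
  <environment jdk="1.3" os="linux"/>
</test-result>
```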

Thoughts?  Does anyone want to -1 this?  If not, where should it live? (I
suspect the answer is in Jakarta, next to Ant).  What should it be named?
What are the next steps?  Who should be the founders?  And what about
C-language integration testing, as well as other languages (which might
argue against a home in Jakarta?)?

Again, sorry for the large cross-posting, but I think it's time for this
issue to get full attention from all the projects.

-scott





From:    Ross Burton <ross.burton@mail.com> (sent by ross@itzinteractive.com)
To:      [EMAIL PROTECTED]  (bcc: Scott Boag/CAM/Lotus)
Date:    02/10/2001 09:37 AM
Subject: Re: [C2] Unit testing.
Please respond to cocoon-dev

Paul Russell wrote:
>
> Guys,
>
> Having spent all day up to my armpits in bits of cocoon, xerces and
> xalan, I've come to the conclusion that tracking down bugs in bits of
> cocoon can be nearly impossible at the moment. How would you guys feel
> about us starting to use JUnit to run unit tests over cocoon? That way
> we can be a lot more confident that we didn't just break something when
> we make a change? The tests can range from checking that a matcher
> matches, to checking that a classloader class loads, to checking that
> the output of a particular pipeline is as expected. I'm very happy to
> help in this respect, and I have minions (eh, boss?) that will put some
> work into developing tests, too. How do you guys feel about this?
>
> Does anyone know whether the IBM Public Licence is APL compatible, or
> would we have to keep JUnit out of the repository?

A small note - didn't the Avalon list separate their testing code from
the core and put it on Sourceforge?  arrowhead.sourceforge.net IIRC.

Ah - just went there.  Yes, that is the site of the code (Kevin Burton
is the developer) but there are no downloads, no web page, no nothing.
Anyone know what happened?  Did it get dropped?

Apart from that, +1 for the tests.

Ross Burton

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, email: [EMAIL PROTECTED]






