Even if we include the unit tests, there's still no main method that
will trigger the tests, the configuration loads from within the jar
rather than from a user-definable location, and running JUnit tests
from within your own app is a bit tricky (unless we know we're never
going to add another test ever again, hence the reflection).
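
Something roughly like the sketch below is what I have in mind for the
runner. It's only an illustration, assuming JUnit 4 and that our test
class names end in "Test" (which may not hold for every suite):

import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class TckRunner {
    public static void main(String[] args) throws Exception {
        // scan the tck jar for candidate test classes
        List<Class<?>> testClasses = new ArrayList<Class<?>>();
        JarFile jar = new JarFile(args[0]);
        Enumeration<JarEntry> entries = jar.entries();
        while (entries.hasMoreElements()) {
            String name = entries.nextElement().getName();
            if (name.endsWith("Test.class")) {
                String className = name.replace('/', '.').replace(".class", "");
                testClasses.add(Class.forName(className));
            }
        }
        jar.close();
        // run everything we found and report failures on stdout
        Result result = JUnitCore.runClasses(testClasses.toArray(new Class<?>[0]));
        for (Failure f : result.getFailures()) {
            System.out.println(f.getDescription() + ": " + f.getMessage());
        }
        System.exit(result.wasSuccessful() ? 0 : 1);
    }
}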

On Mon, Jun 3, 2013 at 9:47 AM, Kurt T Stam <[email protected]> wrote:
> Maybe I'm missing the point, but why can't they run the way they are now?
> All we have to do is add the uddi-tck-test.jar, which was omitted by
> mistake.
> No?
>
> Cheers,
>
> --Kurt
>
>
> On 6/2/13 12:57 PM, Alex O'Ree wrote:
>>
>> Relevant Tickets:
>> JUDDI-314 Create a juddi-client-bundle-3.0.0 with jar, source and
>> javadocs for juddi-client and uddi-ws
>> JUDDI-583 Productize the TCK test suite
>>
>> I'm attempting to formulate a plan to turn the UDDI TCK into both a
>> testing platform for jUDDI (as it is now) and a standalone program for
>> running the test suites (without requiring a full checkout).
>>
>> Currently, all unit test cases (/src/test) are within uddi-tck, and
>> all setup and configuration code is in uddi-tck-base (/src/main).
>>
>>
>> In order to facilitate this change, I've come up with an idea and was
>> wondering if anyone else had a better one before I devote time and
>> effort into it.
>> 1) Use reflection to identify all classes with test cases from
>> uddi-tck, then use JUnitCore to execute them. In addition, rework the
>> configuration loading bits to load files from disk instead of from
>> within the jar file (a rough sketch of that follows below option 2).
>> This requires the test classes (src/test) to be included in the
>> uddi-tck jar file.
>>
>> 2) Refactor all existing test cases into uddi-tck/src/main and rework
>> the actual uddi-tck/src/test classes to call the code from src/main.
>> I think this should only be required if for some reason the test
>> classes can't be included with the TCK jar file (see JUDDI-314). Then
>> use some kind of reflection to find all test cases and execute them.
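>>
>> As a rough illustration of the config-loading rework in option 1 (the
>> system property and file name below are placeholders I made up, not
>> what the TCK uses today):
>>
>> import java.io.FileInputStream;
>> import java.io.InputStream;
>> import java.util.Properties;
>>
>> public class TckConfig {
>>     public static Properties load() throws Exception {
>>         Properties props = new Properties();
>>         // hypothetical system property naming a config file on disk
>>         String path = System.getProperty("uddi.tck.config");
>>         InputStream in;
>>         if (path != null) {
>>             in = new FileInputStream(path);
>>         } else {
>>             // fall back to the copy bundled inside the jar, as today
>>             in = TckConfig.class.getResourceAsStream("/tck.properties");
>>         }
>>         props.load(in);
>>         in.close();
>>         return props;
>>     }
>> }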
>>
>>
>> In either case, it would be nice to have formatted XML output which
>> identifies all the test cases that failed and the relevant output,
>> similar to the Surefire test reports but more user friendly.
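>>
>> For the report, a JUnit RunListener could collect the failures. Very
>> roughly (the element names here are made up, not an existing format):
>>
>> import org.junit.runner.notification.Failure;
>> import org.junit.runner.notification.RunListener;
>>
>> public class XmlReportListener extends RunListener {
>>     private final StringBuilder xml = new StringBuilder("<tck-report>\n");
>>
>>     @Override
>>     public void testFailure(Failure failure) {
>>         // record each failed test with its message
>>         xml.append("  <failure test=\"")
>>            .append(failure.getDescription().getDisplayName())
>>            .append("\">").append(failure.getMessage())
>>            .append("</failure>\n");
>>     }
>>
>>     public String toXml() {
>>         return xml.toString() + "</tck-report>\n";
>>     }
>> }
>>
>> // registered on the runner with core.addListener(new XmlReportListener())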
>
>
