Re: [testing] metadata approach
Oliver Deakin wrote:
Paulex Yang wrote:
Just a wild thought: since TestNG supports both jre142 and jdk5, there must be some way to make it run with annotations but without concurrent. Looking at the layout of the TestNG source code [1] from its v4.1 release, it seems that if we replace src/jdk15/org/testng/internal/thread/*.java with src/jdk14/org/testng/internal/thread/*.java and rebuild, there is a chance to create a customized version based on edu.emory.mathcs.util.concurrent as a workaround. If no one objects (over, say, legal considerations), I can try this out.

That's interesting - I don't know about the legal considerations, but I'd like to hear how your experimentation goes!

Hello Oliver, I have tried Paulex's solution, and I can launch a simple TestNG test using the Harmony VM. But I doubt whether this is necessary - maybe we will have concurrent soon :-) Nathan, could you share some information about concurrent?

Best regards, Richard

Regards, Oliver

[1] http://testng.org/testng-4.1.zip

Alexei Zakharov wrote:
Hi Oliver,
But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c).

I'm afraid we cannot do that. At least I was not successful the last time I tried to run tests using the jvm=harmony option. As far as I understand, TestNG requires j.u.c for running every single test method, because parallel running can be specified at the method level. I mean, in TestNG you can write something like this: @Test(threadPoolSize = 7, invocationCount = 29). This means that the method should be invoked from different threads, and it seems that TestNG needs j.u.c to implement the multithreading.
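Alexei's point can be sketched in plain java.util.concurrent terms. The class and method names below are hypothetical - this is not TestNG's actual implementation, just a minimal illustration of why @Test(threadPoolSize = 7, invocationCount = 29) pulls in j.u.c: the harness needs a thread pool to invoke the test method the requested number of times.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of what a harness does for
// @Test(threadPoolSize = 7, invocationCount = 29):
// submit the test method 29 times to a fixed pool of 7 threads.
public class ParallelInvocationSketch {

    static final AtomicInteger invocations = new AtomicInteger();

    // Stand-in for the annotated test method.
    static void testMethod() {
        invocations.incrementAndGet();
    }

    // Runs testMethod() invocationCount times on threadPoolSize threads
    // and returns how many invocations actually completed.
    public static int run(int threadPoolSize, int invocationCount)
            throws InterruptedException {
        invocations.set(0);
        ExecutorService pool = Executors.newFixedThreadPool(threadPoolSize);
        for (int i = 0; i < invocationCount; i++) {
            pool.submit(ParallelInvocationSketch::testMethod);
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return invocations.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(7, 29)); // prints 29
    }
}
```

Because Executors.newFixedThreadPool lives in java.util.concurrent, even a single-method run of a harness built this way fails with NoClassDefFoundError on a runtime that lacks j.u.c.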
Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

So you suggest adding @Test (groups={os.any, type.api}) to every API test (that runs on every platform), without any defaults at all?

I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).

Just checked - there is no such mail in my gmail box, at least not in the [classlib] Testing conventions - a proposal thread.

So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar.

Well, I like it in general. The only thing I still feel uncomfortable with is os.shared. When some code is shared among different platforms, the name makes sense; but here we have a test shared by several OSes. Does that sound natural? I don't feel strongly about it, though, and will not object if everybody likes it.

With Best Regards,
--
Richard Liang
China Software Development Lab, IBM

-
Terms of use : http://incubator.apache.org/harmony/mailing.html
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
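Explicit group membership like the above would then be selected from a suite definition file. The fragment below is a hypothetical sketch - the group names come from this thread, but the suite, test, and package names are invented, not taken from the Harmony build:

```xml
<!-- Hypothetical testng.xml: run only API tests that apply to any platform -->
<suite name="classlib-api">
  <test name="api-any-platform">
    <groups>
      <run>
        <include name="os.any"/>
        <include name="type.api"/>
      </run>
    </groups>
    <packages>
      <package name="org.apache.harmony.tests.*"/>
    </packages>
  </test>
</suite>
```

The same include/exclude mechanism would cover the exclusion groups discussed elsewhere in the thread, replacing JUnit's exclude lists.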
RE: [testing] metadata approach
-Original Message-
From: Richard Liang [mailto:[EMAIL PROTECTED]

Oliver Deakin wrote:
Paulex Yang wrote:
Just a wild thought: since TestNG supports both jre142 and jdk5, there must be some way to make it run with annotations but without concurrent. Looking at the layout of the TestNG source code [1] from its v4.1 release, it seems that if we replace src/jdk15/org/testng/internal/thread/*.java with src/jdk14/org/testng/internal/thread/*.java and rebuild, there is a chance to create a customized version based on edu.emory.mathcs.util.concurrent as a workaround. If no one objects (over, say, legal considerations), I can try this out.

That's interesting - I don't know about the legal considerations, but I'd like to hear how your experimentation goes!

Hello Oliver, I have tried Paulex's solution, and I can launch a simple TestNG test using the Harmony VM. But I doubt whether this is necessary - maybe we will have concurrent soon :-) Nathan, could you share some information about concurrent?

We're getting there, at least for DRLVM. Once I get the code to be part of the build, DRLVM should support all of the atomics-dependent code. The LockSupport is still a bit in flux.

-Nathan

Best regards, Richard
Re: [testing] metadata approach
Paulex Yang wrote:
Just a wild thought: since TestNG supports both jre142 and jdk5, there must be some way to make it run with annotations but without concurrent. Looking at the layout of the TestNG source code [1] from its v4.1 release, it seems that if we replace src/jdk15/org/testng/internal/thread/*.java with src/jdk14/org/testng/internal/thread/*.java and rebuild, there is a chance to create a customized version based on edu.emory.mathcs.util.concurrent as a workaround. If no one objects (over, say, legal considerations), I can try this out.

That's interesting - I don't know about the legal considerations, but I'd like to hear how your experimentation goes!

Regards, Oliver

[1] http://testng.org/testng-4.1.zip

Alexei Zakharov wrote:
Hi Oliver,
But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c).

I'm afraid we cannot do that. At least I was not successful the last time I tried to run tests using the jvm=harmony option. As far as I understand, TestNG requires j.u.c for running every single test method, because parallel running can be specified at the method level. I mean, in TestNG you can write something like this: @Test(threadPoolSize = 7, invocationCount = 29). This means that the method should be invoked from different threads, and it seems that TestNG needs j.u.c to implement the multithreading.

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

So you suggest adding @Test (groups={os.any, type.api}) to every API test (that runs on every platform), without any defaults at all?
I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).

Just checked - there is no such mail in my gmail box, at least not in the [classlib] Testing conventions - a proposal thread.

So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar.

Well, I like it in general. The only thing I still feel uncomfortable with is os.shared. When some code is shared among different platforms, the name makes sense; but here we have a test shared by several OSes. Does that sound natural? I don't feel strongly about it, though, and will not object if everybody likes it.

With Best Regards,
--
Oliver Deakin
IBM United Kingdom Limited
Re: [testing] metadata approach
Just a wild thought: since TestNG supports both jre142 and jdk5, there must be some way to make it run with annotations but without concurrent. Looking at the layout of the TestNG source code [1] from its v4.1 release, it seems that if we replace src/jdk15/org/testng/internal/thread/*.java with src/jdk14/org/testng/internal/thread/*.java and rebuild, there is a chance to create a customized version based on edu.emory.mathcs.util.concurrent as a workaround. If no one objects (over, say, legal considerations), I can try this out.

[1] http://testng.org/testng-4.1.zip

Alexei Zakharov wrote:
Hi Oliver,
But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c).

I'm afraid we cannot do that. At least I was not successful the last time I tried to run tests using the jvm=harmony option. As far as I understand, TestNG requires j.u.c for running every single test method, because parallel running can be specified at the method level. I mean, in TestNG you can write something like this: @Test(threadPoolSize = 7, invocationCount = 29). This means that the method should be invoked from different threads, and it seems that TestNG needs j.u.c to implement the multithreading.

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

So you suggest adding @Test (groups={os.any, type.api}) to every API test (that runs on every platform), without any defaults at all?

I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).
Just checked - there is no such mail in my gmail box, at least not in the [classlib] Testing conventions - a proposal thread.

So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar.

Well, I like it in general. The only thing I still feel uncomfortable with is os.shared. When some code is shared among different platforms, the name makes sense; but here we have a test shared by several OSes. Does that sound natural? I don't feel strongly about it, though, and will not object if everybody likes it.

With Best Regards,
--
Paulex Yang
China Software Development Lab
IBM
Re: [testing] metadata approach
Alexei Zakharov wrote:
Hi Oliver,
So perhaps the build system should be changed temporarily so that we don't self-host the test harness? i.e. until we get java.util.concurrent, run Ant and the subsequent TestNG process with the RI or another non-Harmony VM, and launch the tests with the Harmony VM using the jvm option.

The bad news is that TestNG requires the j.u.c stuff even for single test execution (i.e. in any case, if jvm=path to harmony). :( So if we want to run annotated tests with TestNG (even from the command line) we need j.u.c.

But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c). Please correct me if I'm wrong - I'm just trying to explore our options in the hope that we can start to move to TestNG soon.

That's odd - Thread.class in luni-kernel (VME v4) definitely contains a getId() method.

Maybe I did something wrong - I will check tomorrow.

I do like them - in fact I was going to link his mail after that point but couldn't find it.

Here is the link: http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200607.mbox/[EMAIL PROTECTED]

Thanks - it was good to reread.

As far as I remember, there were additions to George's list, like using @Test (groups={os.any}) rather than a simple @Test for API tests that run on any platform.

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).
We have had discussions on the list about platform-specific native code organisation, and I think organisation of platform-specific tests can be handled in a similar way. So we would have an os.shared (or os.any - I'm just going with shared as that is what we use for the native code) group that would run on all platforms. If there are platform-specific tests, they would be grouped by OS, then architecture (and then possibly word size).

So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar. I think this tallies with what George was suggesting, and makes sense to me. Are there any objections to this approach?

Regards, Oliver

I really mean that we should make sure that everyone is happy with a certain set of group names before going ahead and applying them. Once they are decided upon, they should be added to the testing conventions webpage.

Yes, agree.

With Best Regards,

2006/8/10, Oliver Deakin [EMAIL PROTECTED]:
Alexei Zakharov wrote:
We now have this, so let the TestNG debate continue :)

Unfortunately, we still need java.util.concurrent :-(

Yeah, TestNG 5.0 still throws java.lang.NoClassDefFoundError : java.util.concurrent.LinkedBlockingQueue on Harmony+j9v4.

So perhaps the build system should be changed temporarily so that we don't self-host the test harness? i.e. until we get java.util.concurrent, run Ant and the subsequent TestNG process with the RI or another non-Harmony VM, and launch the tests with the Harmony VM using the jvm option. At least it will allow us to move forward with TestNG (if that's what we decide) without having to wait for java.util.concurrent. Then, when we have j.u.c, start self-hosting again. Comments?
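The platform-to-group mapping and the jvm option discussed above could be combined in an Ant build file along these lines. This is a hedged sketch: the <testng> task with its groups and jvm attributes comes from TestNG's Ant task, but the property names, paths, and group selection here are invented for illustration:

```xml
<!-- Hypothetical: derive the group list from the current platform, then run
     those groups on the Harmony VM (jvm attribute) while Ant itself runs
     on the RI, which still has java.util.concurrent for the harness. -->
<condition property="test.groups" value="os.shared,os.windows,os.windows.x86">
  <os family="windows" arch="x86"/>
</condition>
<condition property="test.groups" value="os.shared,os.linux,os.linux.x86">
  <os family="unix" arch="x86"/>
</condition>

<taskdef resource="testngtasks" classpath="testng.jar"/>

<testng groups="${test.groups}" jvm="${harmony.home}/bin/java">
  <classfileset dir="build/tests" includes="**/*Test.class"/>
</testng>
```

Note that, as Alexei observes in this thread, the jvm attribute alone did not prove sufficient in practice, since TestNG needed j.u.c on the target VM as well.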
I've also got an error while trying to compile the TestNG 5.0 tests with Harmony+j9v4+ecj: The method getId() is undefined for the type Thread

That's odd - Thread.class in luni-kernel (VME v4) definitely contains a getId() method. I don't know anything about the TestNG tests - how are they run? Is luni-kernel.jar definitely at the front of the bootclasspath?

- If we go ahead with TestNG, we need to select a set of group names to use to indicate exclusion, platform specificity etc.

Don't you like the names suggested by George?

I do like them - in fact I was going to link his mail after that point but couldn't find it. I really mean that we should make sure that everyone is happy with a certain set of group names before going ahead and applying them. Once they are decided upon, they should be added to the testing conventions webpage.

- Decide whether some physical separation of tests on disk is necessary, for instance to separate classpath and bootclasspath tests

IMHO it is ok to separate classpath and bootclasspath tests, because it will be easier to implement such a solution technically.

I agree, although I don't feel strongly about it.

Regards,
Re: [testing] metadata approach
Hi Oliver,

But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c).

I'm afraid we cannot do that. At least I was not successful the last time I tried to run tests using the jvm=harmony option. As far as I understand, TestNG requires j.u.c for running every single test method, because parallel running can be specified at the method level. I mean, in TestNG you can write something like this: @Test(threadPoolSize = 7, invocationCount = 29). This means that the method should be invoked from different threads, and it seems that TestNG needs j.u.c to implement the multithreading.

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

So you suggest adding @Test (groups={os.any, type.api}) to every API test (that runs on every platform), without any defaults at all?

I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).

Just checked - there is no such mail in my gmail box, at least not in the [classlib] Testing conventions - a proposal thread.

So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar.

Well, I like it in general. The only thing I still feel uncomfortable with is os.shared. When some code is shared among different platforms, the name makes sense; but here we have a test shared by several OSes. Does that sound natural?
I don't feel strongly about it, though, and will not object if everybody likes it.

With Best Regards,

2006/8/14, Oliver Deakin [EMAIL PROTECTED]:
Alexei Zakharov wrote:
Hi Oliver,
So perhaps the build system should be changed temporarily so that we don't self-host the test harness? i.e. until we get java.util.concurrent, run Ant and the subsequent TestNG process with the RI or another non-Harmony VM, and launch the tests with the Harmony VM using the jvm option.

The bad news is that TestNG requires the j.u.c stuff even for single test execution (i.e. in any case, if jvm=path to harmony). :( So if we want to run annotated tests with TestNG (even from the command line) we need j.u.c.

But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c). Please correct me if I'm wrong - I'm just trying to explore our options in the hope that we can start to move to TestNG soon.

That's odd - Thread.class in luni-kernel (VME v4) definitely contains a getId() method.

Maybe I did something wrong - I will check tomorrow.

I do like them - in fact I was going to link his mail after that point but couldn't find it.

Here is the link: http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200607.mbox/[EMAIL PROTECTED]

Thanks - it was good to reread.

As far as I remember, there were additions to George's list, like using @Test (groups={os.any}) rather than a simple @Test for API tests that run on any platform.

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?
I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).

We have had discussions on the list about platform-specific native code organisation, and I think organisation of platform-specific tests can be handled in a similar way. So we would have an os.shared (or os.any - I'm just going with shared as that is what we use for the native code) group that would run on all platforms. If there are platform-specific tests, they would be grouped by OS, then architecture (and then possibly word size).

So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar. I think this tallies with what George was suggesting, and makes sense to me. Are there any objections to this approach?

Regards, Oliver

I really mean that we should make sure that everyone is happy with a certain
Re: [testing] metadata approach
Alexei Zakharov wrote:
Hi Oliver,
But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c).

I'm afraid we cannot do that. At least I was not successful the last time I tried to run tests using the jvm=harmony option. As far as I understand, TestNG requires j.u.c for running every single test method, because parallel running can be specified at the method level. I mean, in TestNG you can write something like this: @Test(threadPoolSize = 7, invocationCount = 29). This means that the method should be invoked from different threads, and it seems that TestNG needs j.u.c to implement the multithreading.

That's a real shame - let's hope we get j.u.c in the not too distant future then!

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

So you suggest adding @Test (groups={os.any, type.api}) to every API test (that runs on every platform), without any defaults at all?

Yes, although I don't feel strongly about it. I just think making it completely explicit avoids any confusion about which group the test is in.

I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).

Just checked - there is no such mail in my gmail box, at least not in the [classlib] Testing conventions - a proposal thread.

Glad I'm not going mad (well, not for that reason anyway...)
So, for example, if we were on a Windows x86 32-bit machine, the Ant scripts would run the test groups os.shared, os.windows, os.windows.x86 and (if we ever get any 32/64-bit specific tests) os.windows.x86.32bit, or something similar.

Well, I like it in general. The only thing I still feel uncomfortable with is os.shared. When some code is shared among different platforms, the name makes sense; but here we have a test shared by several OSes. Does that sound natural? I don't feel strongly about it, though, and will not object if everybody likes it.

I don't feel strongly either. I think having the group is good; I don't mind what it is called.

Regards, Oliver

With Best Regards,

2006/8/14, Oliver Deakin [EMAIL PROTECTED]:
Alexei Zakharov wrote:
Hi Oliver,
So perhaps the build system should be changed temporarily so that we don't self-host the test harness? i.e. until we get java.util.concurrent, run Ant and the subsequent TestNG process with the RI or another non-Harmony VM, and launch the tests with the Harmony VM using the jvm option.

The bad news is that TestNG requires the j.u.c stuff even for single test execution (i.e. in any case, if jvm=path to harmony). :( So if we want to run annotated tests with TestNG (even from the command line) we need j.u.c.

But is j.u.c actually required to be in the runtime under test? I was thinking that j.u.c was only required for the VM actually running the harness, and all that gets run on the VM under test is the actual test method. If this were true, then we could run TestNG with the RI (which has j.u.c) and use the jvm option to specify the Harmony VM (which would not need j.u.c). Please correct me if I'm wrong - I'm just trying to explore our options in the hope that we can start to move to TestNG soon.

That's odd - Thread.class in luni-kernel (VME v4) definitely contains a getId() method.

Maybe I did something wrong - I will check tomorrow.

I do like them - in fact I was going to link his mail after that point but couldn't find it.
Here is the link: http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200607.mbox/[EMAIL PROTECTED]

Thanks - it was good to reread.

As far as I remember, there were additions to George's list, like using @Test (groups={os.any}) rather than a simple @Test for API tests that run on any platform.

Yes, agreed - it is good to make group membership explicit, as it facilitates inclusion/exclusion of groups and makes it obvious which group tests belong to. Perhaps the same should be done for API tests, so we have a type.api group?

I thought I had sent a mail out on this in the original thread, but I guess I never did (unless Thunderbird is hiding mail from me again!).

We have had discussions on the list about platform-specific native code organisation, and I think organisation of platform-specific tests can be handled in a similar way. So we would have an os.shared (or os.any - I'm just going with shared as that is what we use for the native code) group that would run on all platforms. If there are platform-specific tests, they would be grouped by OS, then architecture (and then possibly
Re: [testing] metadata approach
We now have this, so let the TestNG debate continue :)

Unfortunately, we still need java.util.concurrent :-(

Yeah, TestNG 5.0 still throws java.lang.NoClassDefFoundError : java.util.concurrent.LinkedBlockingQueue on Harmony+j9v4. I've also got an error while trying to compile the TestNG 5.0 tests with Harmony+j9v4+ecj: The method getId() is undefined for the type Thread

- If we go ahead with TestNG, we need to select a set of group names to use to indicate exclusion, platform specificity etc.

Don't you like the names suggested by George?

- Decide whether some physical separation of tests on disk is necessary, for instance to separate classpath and bootclasspath tests

IMHO it is ok to separate classpath and bootclasspath tests, because it will be easier to implement such a solution technically.

Regards,

2006/8/10, Richard Liang [EMAIL PROTECTED]:
Oliver Deakin wrote:
Richard Liang wrote:
Alexei Zakharov wrote:
Hi Richard,
Not sure if we really want to involve another migration: TestNG javadoc - TestNG annotation. Any comments?

Well, IMHO this depends on the time constraints - when do we plan to have support for annotations? If the answer is a couple of weeks - no problem, we can wait. But if it is several months... About the migration - I don't think this will be a really painful migration; all the infrastructure will remain the same. We will only need to convert the javadocs to annotations (a one-to-one correspondence), and this task can be easily automated.

Sounds reasonable. :-) Maybe the drlvm guys or Oliver could tell us when we will have a VM with annotation support?

We now have this, so let the TestNG debate continue :)

Unfortunately, we still need java.util.concurrent :-(

I guess we need to decide a few things before we go ahead with this:
- Whether TestNG is generally accepted by the Harmony community as our test harness of choice for unit testing. I think this will probably require a vote of some kind before we could make the move.
- If we go ahead with TestNG, we need to select a set of group names to use to indicate exclusion, platform specificity etc.
- Decide whether some physical separation of tests on disk is necessary, for instance to separate classpath and bootclasspath tests.

Comments/additions?

Agree. And we could provide proposals for these questions case by case, and let the community make the decision.

Best regards, Richard

Regards, Oliver

Thanks,

2006/8/1, Richard Liang [EMAIL PROTECTED]:
Alexei Zakharov wrote:
Hi,
I have created this new thread as a single place for the discussions started in the Re: [testing] Peace and [classlib] Testing conventions – a proposal threads. What did we have in the previous threads?

* Test classification proposed by Vladimir
* Test classification and group names proposed by George
* A solution for the Ant and TestNG scripting issues (still being discussed)

Since a lot of people are asking about TestNG and wanting TestNG, I decided to put in some effort and take a closer look at this piece of software. During the last few days I was playing with TestNG - I tried to run different kinds of tests with it, perform various workloads, generate reports in different ways, etc. The purpose of all this activity was to try TestNG in real work, and to understand whether TestNG is really worth our credits and how expensive moving to TestNG from our currently implemented testing infrastructure would be. Now I have some thoughts and facts I'd like to share with the community. I've put them in the form of a list for convenience.

* TestNG works ok in normal conditions, no visible bugs
* It is possible to define and use various (possibly intersecting) test groups with TestNG
* TestNG-style metadata is more convenient than JUnit test suites (now I agree with this statement). IMHO this is the main TestNG benefit.
* It is possible to run TestNG from the command line
* There is also a special Ant task for running TestNG
* Not everything can be configured with the Ant task or command-line params; sometimes an extra test suite definition file, testng.xml, is needed
* It is possible to run JUnit tests with TestNG (testng.xml is needed in this case)
* It is possible to run the JUnit tests we currently have in Harmony with TestNG without any problems or modifications of the source code. However, we will probably need to write some number of TestNG test suite definition files (testng.xml) to be able to run all our JUnit tests (I have tried the tests for the beans module and some tests for luni)
* We can mix JUnit tests and TestNG tests in a single test suite configuration – i.e. a single testng.xml file. We can add TestNG metadata to some test classes and leave other test classes untouched
* TestNG generates HTML reports in its own style, not a very convenient one IMHO
* It is also possible to generate JUnitReports from the output generated by TestNG
* Such reports will have a little
Re: [testing] metadata approach
Richard Liang wrote:
Oliver Deakin wrote:
Richard Liang wrote:
Alexei Zakharov wrote:
Hi Richard,
Not sure if we really want to involve another migration: TestNG javadoc - TestNG annotation. Any comments?

Well, IMHO this depends on the time constraints - when do we plan to have support for annotations? If the answer is a couple of weeks - no problem, we can wait. But if it is several months... About the migration - I don't think this will be a really painful migration; all the infrastructure will remain the same. We will only need to convert the javadocs to annotations (a one-to-one correspondence), and this task can be easily automated.

Sounds reasonable. :-) Maybe the drlvm guys or Oliver could tell us when we will have a VM with annotation support?

We now have this, so let the TestNG debate continue :)

Unfortunately, we still need java.util.concurrent :-(

Ah! I hadn't realised that that was a requirement of TestNG.

I guess we need to decide a few things before we go ahead with this:
- Whether TestNG is generally accepted by the Harmony community as our test harness of choice for unit testing. I think this will probably require a vote of some kind before we could make the move.
- If we go ahead with TestNG, we need to select a set of group names to use to indicate exclusion, platform specificity etc.
- Decide whether some physical separation of tests on disk is necessary, for instance to separate classpath and bootclasspath tests.

Comments/additions?

Agree. And we could provide proposals for these questions case by case, and let the community make the decision.

Sounds good.

Regards, Oliver

Best regards, Richard

Regards, Oliver

Thanks,

2006/8/1, Richard Liang [EMAIL PROTECTED]:
Alexei Zakharov wrote:
Hi,
I have created this new thread as a single place for the discussions started in the Re: [testing] Peace and [classlib] Testing conventions – a proposal threads. What did we have in the previous threads?
* Test classification proposed by Vladimir * Test classification and group names proposed by George * Solution for Ant and TestNG scripting issues (still being discussed) Since a lot of people are asking about TestNG and wanting TestNG I decide to put some effort and take a closer look at this piece of software. Thus during the last few days I was playing with TestNG - I tried to run different kind of tests with it, to perform various workloads, generate reports in different ways and etc. The purpose of all this activity was to try TestNG in a real work, understand is TestNG really worth our credits and how expensive can be moving to TestNG from our currently implemented testing infrastructure. Now I have some thoughts and facts I'd like to share with the community. I've put it in the form of list for convenience. * TestNG works ok in normal conditions, no visible bugs * It is possible to define and use various (possibly intersecting) test groups with TestNG * TestNG-style metadata is more convenient than JUnit test suites (now I agree with this statement). IMHO this is the main TestNG benefit. * It is possible to run TestNG from command line * There is also the special ant task for running TestNG * Not everything can be configured with the ant task or command-line params, sometimes extra test suite definition file testng.xml is needed * It is possible to run jUnit tests with TestNG (testng.xml is needed in this case) * It is possible to run junit tests we currently have in Harmony with TestNG without any problems and modifications of the source code. However, we probably should write some number of TestNG test suite definition files testng.xml to be able to run all our junit tests (I have tried tests for bean module and some tests for luni) * We can mix jUnit tests and TestNG tests in the single test suite configuration – i.e. single testng.xml file. 
We can add TestNG metadata to some test classes and leave other test classes untouched * TestNG generates HTML reports in its own style, not a very convenient one IMHO * It is also possible to generate JUnitReports from the output generated by TestNG * Such reports will have a little bit different structure since TestNG doesn't provider any information about enclosing type for test methods. Names for tests (replacement for JUnit test classes) and test suites should be externally configured in testng.xml * TestNG for Java 5 doesn't work on Harmony because some necessary classes from java.util.concurrent package are missing and it seems that jsr14 target (we are currently using) doesn't support annotations * TestNG for Java 1.4 (javadoc version) currently works on Harmony * I have half-way done script that converts TestNG 1.4 metadata (javadoc) tests to TestNG 1.5 (5.0 annotations) tests. Excellent summary! Thanks a lot The question I'd like to raise now is – aren't we ready for TestNG right now? I suppose we will use Java 5.0 annotations of TestNG,
Re: [testing] metadata approach
Alexei Zakharov wrote: We now have this, so let the TestNG debate continue :) Unfortunately, we still need java.util.concurrent :-( Yeah, TestNG 5.0 still throws java.lang.NoClassDefFoundError: java.util.concurrent.LinkedBlockingQueue on Harmony+j9v4. So perhaps the build system should be changed temporarily so that we don't self-host the test harness? i.e. until we get java.util.concurrent, run Ant and the subsequent TestNG process with the RI or another non-Harmony VM, and launch the tests with the Harmony VM using the jvm option. At least it will allow us to move forward with TestNG (if that's what we decide) without having to wait for java.util.concurrent. Then when we have j.u.c, we can start self-hosting again. Comments? I've also got an error while trying to compile the TestNG 5.0 tests with Harmony+j9v4+ecj: The method getId() is undefined for the type Thread. That's odd - Thread.class in luni-kernel (VME v4) definitely contains a getId() method. I don't know anything about the TestNG tests - how are they run? Is luni-kernel.jar definitely at the front of the bootclasspath? - If we go ahead with TestNG, we need to select a set of group names to use to indicate exclusion, platform specificity etc. Don't you like the names suggested by George? I do like them - in fact I was going to link his mail after that point but couldn't find it. I really mean that we should make sure that everyone is happy with a certain set of group names before going ahead and applying them. Once they are decided upon, they should be added to the testing conventions webpage. - Decide whether some physical separation of tests on disk is necessary, for instance to separate classpath and bootclasspath tests. IMHO it is OK to separate classpath and bootclasspath tests because it will be easier to implement such a solution technically. I agree, although I don't feel strongly about it.
Regards, Oliver
Re: [testing] metadata approach
Hi Oliver, So perhaps the build system should be changed temporarily so that we don't self-host the test harness? i.e. until we get java.util.concurrent, run Ant and the subsequent TestNG process with the RI or another non-Harmony VM, and launch the tests with the Harmony VM using the jvm option. The bad news is that TestNG requires the j.u.c stuff even for single test execution (i.e. in any case, even if jvm=path to harmony). :( So if we want to run annotated tests with TestNG (even from the command line) we need j.u.c. That's odd - Thread.class in luni-kernel (VME v4) definitely contains a getId() method. Maybe I did something wrong - I will check tomorrow. I do like them - in fact I was going to link his mail after that point but couldn't find it. Here is the link: http://mail-archives.apache.org/mod_mbox/incubator-harmony-dev/200607.mbox/[EMAIL PROTECTED] As far as I remember there were additions to George's list, like using @Test(groups={os.any}) rather than a plain @Test for API tests that run on any platform. I really mean that we should make sure that everyone is happy with a certain set of group names before going ahead and applying them. Once they are decided upon, they should be added to the testing conventions webpage. Yes, agree. With Best Regards,
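Oliver's interim proposal (drive Ant and the TestNG harness on the RI, but execute the forked test processes on the Harmony VM via the jvm option) could look roughly like the following build fragment. This is only a sketch: the property names (testng.jar, harmony.home, build.tests) are invented for illustration, and it assumes the TestNG Ant task's jvm attribute behaves as discussed in this thread.

```xml
<!-- Hypothetical build.xml fragment: Ant and the TestNG harness run
     on the RI, while the forked test JVM is the Harmony launcher. -->
<taskdef name="testng" classname="org.testng.TestNGAntTask"
         classpath="${testng.jar}"/>

<target name="test">
    <!-- jvm points at the Harmony java executable; the harness
         process (and its java.util.concurrent needs) stays on the RI -->
    <testng jvm="${harmony.home}/bin/java"
            outputdir="${basedir}/report">
        <classfileset dir="${build.tests}" includes="**/*Test.class"/>
    </testng>
</target>
```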
Re: [testing] metadata approach
Hi Alex, As for the metadata decisions (e.g. platforms) there still seem to be some ways we can achieve this. Do we want to finish deciding that before the migration, or are we confident that we will get to a point where a decision is made and we can start transitioning beforehand? Well, IMHO adding some extra javadoc tag will not break anything. There are situations when you need some sticker just to put on a test to mark it, for example, as implementation-specific, since you will forget everything in a week. I currently use javadoc text like "This test is highly implementation specific" or an @impl tag to mark such tests. But I would prefer to do it in a more standardized way. About finishing deciding: we probably need to engage more people in the discussion... Lastly, do we have entries (e.g. on the wiki) about how to write new tests that are either (a) compatible with JUnit+TestNG, or (b) use TestNG alone? It seems like this would be a good way to ensure new tests are TestNG-compatible and thus increase the coverage of TestNG tests. I don't think we have anything concerning TestNG on the wiki right now. IMHO we still need everybody to accept the intention of moving to the new harness before starting these activities. We probably also need to have pointers at least for how to run the tests from My Favourite IDE (tm) and/or the build itself. AFAIK there are TestNG plugins for Eclipse and IDEA, but I haven't tried them yet. Are there any other systems, e.g. JUnitReport, that we need to consider for this? Does TestNG's reporting suit what we want to do and/or can we leverage any of the reporting that it does on the web? Basically TestNG's built-in reporting system produces all the necessary information. But the resulting report doesn't look very nice and I don't know how to customize it. Funny thing: the report looks nicer in Mozilla than in IE. On the other hand, the use of JUnitReport is officially approved by the TestNG team - i.e. TestNG produces all the necessary input for JUnitReport.
JUnitReport is a more powerful report-generation system; we may customize its output with XSL stylesheets if we like (AFAIK). The default style of reports generated by JUnitReport looks nicer. Regards, 2006/7/28, Alex Blewitt [EMAIL PROTECTED]: The question I'd like to raise now is – aren't we ready for TestNG right now? For example, we could replace our harness from JUnit to TestNG and lazily start converting some of the failing and platform-dependent tests to the javadoc version of TestNG. Thoughts? Suggestions? Opposite opinions? I think that if the decision is made to go down the TestNG route (and my hope is that we will) then this sounds like a good approach. Of course, everyone would have to be happy with the migration (sounds like we're heading towards a vote on it) and, like you say, we can always use the TestNG harness to run the existing set of JUnit tests, so we should still be in the same position. As for the metadata decisions (e.g. platforms) there still seem to be some ways we can achieve this. Do we want to finish deciding that before the migration, or are we confident that we will get to a point where a decision is made and we can start transitioning beforehand? Lastly, do we have entries (e.g. on the wiki) about how to write new tests that are either (a) compatible with JUnit+TestNG, or (b) use TestNG alone? It seems like this would be a good way to ensure new tests are TestNG-compatible and thus increase the coverage of TestNG tests. We probably also need to have pointers at least for how to run the tests from My Favourite IDE (tm) and/or the build itself. Are there any other systems, e.g. JUnitReport, that we need to consider for this? Does TestNG's reporting suit what we want to do and/or can we leverage any of the reporting that it does on the web? Alex. -- Alexei Zakharov, Intel Middleware Product Division - Terms of use : http://incubator.apache.org/harmony/mailing.html To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
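The JUnitReport route described above can be wired up in Ant along these lines. A hedged sketch: the directory names are invented, and it assumes TestNG has been configured to emit its JUnit-style XML results (the junitreports output mentioned in the TestNG docs), which the standard junitreport task then aggregates and styles.

```xml
<!-- Hypothetical fragment: aggregate TestNG's JUnit-style XML output
     and render it with the standard JUnitReport XSL stylesheets. -->
<target name="report">
    <junitreport todir="${basedir}/report">
        <!-- TestNG can write JUnit-compatible result files under its
             output directory, e.g. report/junitreports/TEST-*.xml -->
        <fileset dir="${basedir}/report/junitreports"
                 includes="TEST-*.xml"/>
        <report format="frames" todir="${basedir}/report/html"/>
    </junitreport>
</target>
```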
Re: [testing] metadata approach
Alexei Zakharov wrote: Hi, I have created this new thread as a single place for the discussions started in the Re: [testing] Peace and [classlib] Testing conventions – a proposal threads. What did we have in the previous threads?
* Test classification proposed by Vladimir
* Test classification and group names proposed by George
* A solution for Ant and TestNG scripting issues (still being discussed)
Since a lot of people are asking about TestNG and wanting TestNG, I decided to put in some effort and take a closer look at this piece of software. Thus during the last few days I was playing with TestNG - I tried to run different kinds of tests with it, performed various workloads, generated reports in different ways, etc. The purpose of all this activity was to try TestNG in real work, to understand whether TestNG is really worth our credit, and how expensive moving to TestNG from our currently implemented testing infrastructure might be. Now I have some thoughts and facts I'd like to share with the community. I've put them in the form of a list for convenience.
* TestNG works OK in normal conditions, no visible bugs
* It is possible to define and use various (possibly intersecting) test groups with TestNG
* TestNG-style metadata is more convenient than JUnit test suites (now I agree with this statement). IMHO this is the main TestNG benefit.
* It is possible to run TestNG from the command line
* There is also a special Ant task for running TestNG
* Not everything can be configured with the Ant task or command-line params; sometimes an extra test suite definition file, testng.xml, is needed
* It is possible to run JUnit tests with TestNG (testng.xml is needed in this case)
* It is possible to run the JUnit tests we currently have in Harmony with TestNG without any problems or modifications of the source code. However, we probably should write a number of TestNG test suite definition files (testng.xml) to be able to run all our JUnit tests (I have tried tests for the bean module and some tests for luni)
* We can mix JUnit tests and TestNG tests in a single test suite configuration – i.e. a single testng.xml file. We can add TestNG metadata to some test classes and leave other test classes untouched
* TestNG generates HTML reports in its own style, not a very convenient one IMHO
* It is also possible to generate JUnitReports from the output generated by TestNG
* Such reports will have a slightly different structure, since TestNG doesn't provide any information about the enclosing type of test methods. Names for tests (the replacement for JUnit test classes) and test suites should be externally configured in testng.xml
* TestNG for Java 5 doesn't work on Harmony because some necessary classes from the java.util.concurrent package are missing, and it seems that the jsr14 target (which we are currently using) doesn't support annotations
* TestNG for Java 1.4 (the javadoc version) currently works on Harmony
* I have a half-finished script that converts TestNG 1.4 metadata (javadoc) tests to TestNG 1.5 (Java 5.0 annotation) tests.
Excellent summary! Thanks a lot. The question I'd like to raise now is – aren't we ready for TestNG right now? I suppose we will use the Java 5.0 annotations of TestNG, so it seems we are not ready for TestNG yet. But we can continue our feasibility study, just like what you have done, to learn whether TestNG really meets our requirements or whether there are any potential problems. Maybe we could make a prerequisite list, e.g.:
1) Harmony can fully self-host TestNG with Java 5 annotations
2) Test groups are well-defined and agreed in the community
3) Guidelines to write TestNG test cases
4) Take one module to run a pilot case
Please correct me if I'm wrong. For example, we could replace our harness from JUnit to TestNG and lazily start converting some of the failing and platform-dependent tests to the javadoc version of TestNG. The rest of the tests will in fact remain JUnit. And when our VM is ready to handle annotations we can convert all our TestNG 1.4 tests to TestNG 1.5. I understand that this idea may seem too early. But we will need to change things some day anyway, since many people are unhappy with the current testing infrastructure (me, for example). Not sure if we really want to involve another migration: TestNG javadoc -> TestNG annotation. Any comments? Thoughts? Suggestions? Opposite opinions? With Best Regards, -- Richard Liang China Software Development Lab, IBM
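For reference, a mixed suite of the kind described above (existing JUnit classes running side by side with native TestNG classes in one testng.xml) looks roughly like this. The suite, test and class names are invented for illustration; the junit="true" flag on a <test> element is what tells TestNG to treat those classes as JUnit tests.

```xml
<!-- Hypothetical testng.xml: one suite running both legacy JUnit
     tests (unmodified) and new TestNG-style tests. -->
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name="luni-tests">
    <!-- existing JUnit test classes, run as-is by TestNG -->
    <test name="luni-junit" junit="true">
        <classes>
            <class name="org.apache.harmony.tests.java.lang.StringTest"/>
        </classes>
    </test>
    <!-- TestNG-style tests, filtered by group membership -->
    <test name="luni-testng">
        <groups>
            <run>
                <include name="os.any"/>
            </run>
        </groups>
        <classes>
            <class name="org.apache.harmony.tests.java.lang.ThreadTest"/>
        </classes>
    </test>
</suite>
```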
Re: [testing] metadata approach
Hi Richard, Not sure if we really want to involve another migration: TestNG javadoc -> TestNG annotation. Any comments? Well, IMHO this depends on time constraints - when do we plan to have support for annotations? If the answer is a couple of weeks - no problem, we can wait. But if it is several months... About the migration - I don't think this will be a really painful migration; all the infrastructure will remain the same. We will only need to convert javadocs to annotations (a one-to-one correspondence), and this task can easily be automated. Thanks,
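To make the javadoc-to-annotation correspondence concrete, here is a sketch of a test written in the TestNG 1.4 (javadoc metadata) style, with the TestNG 5.0 annotated equivalent shown in a comment. The class, method and group names are illustrative only; the point is that the javadoc form compiles without any TestNG types on the classpath, which is what makes it usable on Harmony today.

```java
/**
 * Hypothetical TestNG 1.4-style test: the metadata lives in javadoc
 * tags, so no TestNG classes are needed at compile time.
 */
public class StringApiTest {

    /**
     * @testng.test groups = "os.any type.api"
     */
    public void testTrim() {
        // plain failure signalling, no harness types required
        if (!"  abc  ".trim().equals("abc")) {
            throw new AssertionError("String.trim() misbehaved");
        }
    }

    // The TestNG 5.0 (annotation) equivalent that a conversion script
    // would produce, one-to-one:
    //
    //   @Test(groups = { "os.any", "type.api" })
    //   public void testTrim() { ... }
}
```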
Re: [testing] metadata approach
Alexei Zakharov wrote: Hi Richard, Not sure if we really want to involve another migration: TestNG javadoc -> TestNG annotation. Any comments? Well, IMHO this depends on time constraints - when do we plan to have support for annotations? If the answer is a couple of weeks - no problem, we can wait. But if it is several months... About the migration - I don't think this will be a really painful migration; all the infrastructure will remain the same. We will only need to convert javadocs to annotations (a one-to-one correspondence), and this task can easily be automated. Sounds reasonable. :-) Maybe the drlvm guys or Oliver could tell us when we will have a VM with annotation support? Thanks,
[testing] metadata approach
Hi, I have created this new thread as a single place for discussions started in Re: [testing] Peace and [classlib] Testing conventions – a proposal threads. What did we have in the previous threads? * Test classification proposed by Vladimir * Test classification and group names proposed by George * Solution for Ant and TestNG scripting issues (still being discussed) Since a lot of people are asking about TestNG and wanting TestNG I decide to put some effort and take a closer look at this piece of software. Thus during the last few days I was playing with TestNG - I tried to run different kind of tests with it, to perform various workloads, generate reports in different ways and etc. The purpose of all this activity was to try TestNG in a real work, understand is TestNG really worth our credits and how expensive can be moving to TestNG from our currently implemented testing infrastructure. Now I have some thoughts and facts I'd like to share with the community. I've put it in the form of list for convenience. * TestNG works ok in normal conditions, no visible bugs * It is possible to define and use various (possibly intersecting) test groups with TestNG * TestNG-style metadata is more convenient than JUnit test suites (now I agree with this statement). IMHO this is the main TestNG benefit. * It is possible to run TestNG from command line * There is also the special ant task for running TestNG * Not everything can be configured with the ant task or command-line params, sometimes extra test suite definition file testng.xml is needed * It is possible to run jUnit tests with TestNG (testng.xml is needed in this case) * It is possible to run junit tests we currently have in Harmony with TestNG without any problems and modifications of the source code. 
  However, we would probably have to write a number of testng.xml test suite definition files to be able to run all our JUnit tests (I have tried the tests for the beans module and some tests for luni).
* We can mix JUnit tests and TestNG tests in a single test suite configuration, i.e. a single testng.xml file. We can add TestNG metadata to some test classes and leave other test classes untouched.
* TestNG generates HTML reports in its own style, not a very convenient one IMHO.
* It is also possible to generate JUnitReports from the output generated by TestNG. Such reports will have a slightly different structure, since TestNG doesn't provide any information about the enclosing type of test methods. Names for tests (the replacement for JUnit test classes) and test suites have to be configured externally in testng.xml.
* TestNG for Java 5 doesn't work on Harmony because some necessary classes from the java.util.concurrent package are missing, and it seems that the jsr14 target (which we are currently using) doesn't support annotations.
* TestNG for Java 1.4 (the javadoc version) currently works on Harmony.
* I have a half-way-done script that converts TestNG 1.4 metadata (javadoc) tests to TestNG 1.5 (5.0 annotation) tests.

The question I'd like to raise now is: aren't we ready for TestNG right now? For example, we could replace our harness from JUnit to TestNG and lazily start converting some failing and platform-dependent tests to the javadoc version of TestNG. The rest of the tests would in fact remain JUnit. And when our VM is ready to handle annotations we can convert all our TestNG 1.4 tests to TestNG 1.5. I understand that this idea may seem premature. But we will need to change things some day anyway, since many people are unhappy with the current testing infrastructure (me, for example). Thoughts? Suggestions? Opposite opinions?
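The mixed JUnit/TestNG suite described above can be expressed in a single testng.xml. A minimal sketch, assuming invented suite, test, and class names (only the `junit="true"` attribute and the `<groups>` filtering are standard TestNG configuration; the group names follow the scheme being discussed in this thread):

```xml
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<!-- Sketch only: class names and suite/test names are illustrative,
     not taken from the actual Harmony source tree. -->
<suite name="beans-tests">
  <!-- Existing JUnit tests run unmodified under the TestNG harness -->
  <test name="beans-junit" junit="true">
    <classes>
      <class name="org.apache.harmony.beans.tests.SomeJUnitTest"/>
    </classes>
  </test>
  <!-- Tests carrying TestNG metadata, filtered by group membership -->
  <test name="beans-testng">
    <groups>
      <run>
        <include name="os.any"/>
      </run>
    </groups>
    <classes>
      <class name="org.apache.harmony.beans.tests.SomeTestNGTest"/>
    </classes>
  </test>
</suite>
```

The Ant task or command line can then be pointed at this one file, so both kinds of tests appear in a single run and a single report.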
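In the simplest case, the javadoc-to-annotation conversion mentioned above could be a textual rewrite. A minimal sketch in Java, assuming only the plain `@testng.test` tag with an optional `groups` attribute (a real converter would need to handle far more of the TestNG 1.4 tag set; the class and method names here are invented):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MetadataConverter {
    // Matches a javadoc line such as: * @testng.test groups = "os.any type.api"
    private static final Pattern TAG = Pattern.compile(
        "\\*\\s*@testng\\.test(?:\\s+groups\\s*=\\s*\"([^\"]*)\")?");

    /** Rewrites one TestNG 1.4 javadoc tag line into the 5.0-annotation form. */
    public static String convertLine(String line) {
        Matcher m = TAG.matcher(line);
        if (!m.find()) {
            return line; // not a TestNG 1.4 tag, leave the line untouched
        }
        if (m.group(1) == null) {
            return "@Test"; // tag without a groups attribute
        }
        // Turn the space-separated group list into an annotation attribute
        String[] groups = m.group(1).split("\\s+");
        StringBuilder sb = new StringBuilder("@Test(groups = {");
        for (int i = 0; i < groups.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append('"').append(groups[i]).append('"');
        }
        return sb.append("})").toString();
    }
}
```

Run over a source file line by line, this would emit `@Test(groups = {"os.any", "type.api"})` for the tag above while passing ordinary code lines through unchanged.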
With Best Regards, -- Alexei Zakharov, Intel Middleware Product Division - Terms of use : http://incubator.apache.org/harmony/mailing.html To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]
Re: [testing] metadata approach
The question I'd like to raise now is: aren't we ready for TestNG right now? For example, we could replace our harness from JUnit to TestNG and lazily start converting some failing and platform-dependent tests to the javadoc version of TestNG. Thoughts? Suggestions? Opposite opinions? I think that if the decision is made to go down the TestNG route (and my hope is that we will) then this sounds like a good approach. Of course, everyone would have to be happy with the migration (it sounds like we're heading towards a vote on it) and, as you say, we can always use the TestNG harness to run the existing set of JUnit tests, so we would still be in the same position. As for the metadata decisions (e.g. platforms), there still seem to be several ways we could achieve this. Do we want to finish deciding that before the migration, or are we confident that we will get to a point where a decision is made, so that we can start transitioning beforehand? Lastly, do we have entries (e.g. on the wiki) about how to write new tests that are either (a) compatible with both JUnit and TestNG, or (b) use TestNG alone? That seems like a good way to ensure new tests are TestNG-compatible and thus increase the coverage of TestNG tests. We probably also need pointers at least on how to run the tests from My Favourite IDE (tm) and/or from the build itself. Are there any other systems, e.g. JUnitReport, that we need to consider for this? Does TestNG's reporting suit what we want to do, and/or can we leverage any of the reporting it does on the web? Alex.
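As a concrete illustration of point (a), a TestNG 1.4 (javadoc-style) test class needs no org.testng imports at compile time, since the metadata lives in a comment. A minimal sketch (class name, method, and group names are invented for illustration; making it additionally runnable under plain JUnit 3.8 would mean extending junit.framework.TestCase, omitted here to keep the sketch self-contained):

```java
// Sketch of a TestNG 1.4 javadoc-style test. The @testng.test tag in the
// comment is how the javadoc version of TestNG discovers test methods, so
// the class compiles without any TestNG classes on the classpath.
public class StringReverseTest {

    /**
     * Verifies that reversing a string twice yields the original.
     * @testng.test groups = "os.any type.api"
     */
    public void testDoubleReverse() {
        String s = "harmony";
        String reversed = new StringBuffer(s).reverse().toString();
        String roundTrip = new StringBuffer(reversed).reverse().toString();
        if (!s.equals(roundTrip)) {
            throw new AssertionError("double reverse should be the identity");
        }
    }
}
```

Converting such a class to the TestNG 1.5 form would then only mean replacing the javadoc tag with `@Test(groups = {"os.any", "type.api"})`, which is exactly the kind of rewrite the conversion script discussed earlier could do.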