Hello Chris and others,

I am attaching my reply to the BSP thread, because it fits better there.
On Thursday 28 August 2014 07:21:02 Chris Johns wrote:
> The number of BSPs building for ARM has exploded and for just the ARM
> architecture there are now 27,417 tests built. If I could run each test
> in 20 seconds it would take over 6 days to do this. If I could run 6
> tests in parallel it would still take 24 hours.
>
> I wonder how many of these variants have had all the tests run on them?

Some have. Based on my runs, about 60 tests have been run on the TMS570
internal RAM variant (with a very high success rate) and a smaller number
on the SDRAM variant, where problems are still quite common (the loader,
chip errata for unaligned multi-register loads, a mistake in our BSP
setup?).

I have written a script which loads one test after another and then lets
the user reply whether the test was successful or not:

  https://github.com/AoLaD/rtems-tms570-utils/blob/master/openocd/openocd-flat-load-test

The script ensures that the executable is loaded as one continuous
binary, because JTAG access to external SDRAM does not work for bytes and
16-bit shorts on the TMS570, but that is a minor technical detail.

I expect to return from a prolonged weekend on Wednesday; then I will try
to rewrite the test script in Python. But there is a significant problem
for automation: the result printed on the serial port has to be inspected
by a human. I can catch the TEST START and END markers (no problem), but
some tests (the exception ones) even continue with a significant part of
their output after the END TEST marker.

Another, more important problem is that there is no common format to
check for success or failure, or at least I am not aware of one. There
are (or have been) *.scn files, but even significant differences in
timing, and even in the printed sequences, are expected for many tests.
It would be worth discussing some change to the format. If a test starts
and then ends, it is quite probable that the target is OK, and one can
then check whether all subsequent tests went OK as well. So it would be
great if some status code were printed after the exit marker which can be
processed automatically.
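As a sketch of the Python rewrite mentioned above, the runner could watch the serial console for the begin/end markers. The marker patterns and the helper below are my assumptions, not existing tooling; the tail handling is there because of the tests that keep printing after the END marker:

```python
import re

# Marker patterns are my assumption of what the tests print on the
# serial console -- adjust them to the real begin/end lines.
BEGIN_RE = re.compile(r"\*\*\* (?:BEGIN OF )?TEST (?P<name>.+?) \*\*\*")
END_RE = re.compile(r"\*\*\* END OF TEST (?P<name>.+?) \*\*\*")

def watch_test(lines, tail_after_end=5):
    """Scan an iterable of serial-console lines for one test run.

    Returns (test_name, captured_lines, end_seen).  A few lines after
    the END marker are kept too, because some tests (e.g. the
    exception ones) keep printing after it.
    """
    name = None
    captured = []
    tail_left = None
    for line in lines:
        captured.append(line)
        if name is None:
            m = BEGIN_RE.search(line)
            if m:
                name = m.group("name")
        elif tail_left is None:
            if END_RE.search(line):
                tail_left = tail_after_end
        else:
            tail_left -= 1
            if tail_left <= 0:
                break
    return name, captured, tail_left is not None
```

The `lines` iterable could be fed from pyserial's `readline()` with a timeout; the captured lines would then be archived per test for later human inspection of the failed ones.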
The end-of-test line could look like:

  *** END MARKER TEST NAME *** RESULT: OK ***
  *** END MARKER TEST NAME *** RESULT: 345 ***

This could be captured and a report built automatically. The full test
output should be archived, so that human inspection of the failed ones is
possible.

We have some experience with continuous code base testing: some work
related to the SESAMO project

  http://sesamo-project.eu/

some huge matrix testing for Volkswagen

  http://rtime.felk.cvut.cz/can/benchmark/3.0/

and our previous CAN related work

  http://rtime.felk.cvut.cz/can/benchmark/1/

mainly developed by Michal Sojka. He has even prepared continuous code
base testing of the Linux kernel CAN subsystem, but that waits for some
polishing before it can be publicly announced.

If there is some way to check test results automatically, I or Premek
will try to prepare a setup for TMS570 RTEMS BSP testing when I have some
time.

By the way, testing will even be possible from flash (but only seldom,
because of wear: 1000 erase cycles declared), because TMS570 flash
support is maturing and targeting the OpenOCD mainline:

  http://thread.gmane.org/gmane.comp.debugging.openocd.devel/25458

So we do not need any TI tools for board setup now, and once the
setup/loader code is implemented, we will not need CCS and HalCoGen even
to build the initial MCU setup code. That binary can already be flashed
by OpenOCD, if you do not fear bugs and possible target smashing too
much.

Best wishes,

Pavel

_______________________________________________
devel mailing list
devel@rtems.org
http://lists.rtems.org/mailman/listinfo/devel
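The RESULT line proposed above could be checked automatically with a short Python helper. The line format is only the proposal, nothing any current test prints, and the helper is a sketch, not existing RTEMS tooling:

```python
import re

# Matches only the proposed line format, e.g.:
#   *** END MARKER TEST NAME *** RESULT: OK ***
#   *** END MARKER TEST NAME *** RESULT: 345 ***
# "OK" means success; any numeric status is treated as a failure code.
RESULT_RE = re.compile(
    r"\*\*\* END (?P<name>.+?) \*\*\* RESULT: (?P<status>OK|\d+) \*\*\*")

def parse_result(line):
    """Return (test_name, passed, code), or None for a non-RESULT line."""
    m = RESULT_RE.search(line)
    if m is None:
        return None
    status = m.group("status")
    return m.group("name"), status == "OK", 0 if status == "OK" else int(status)
```

A report generator could then fold these tuples into a pass/fail table while archiving the raw output of each test.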