William A. Hoffman | 15 Dec 18:27 2003

RE: Future release strategies (was Re: latest director code changes)

I don't think parallel processing is what is needed in this case.
Our experience with testing is that you want to increase code coverage
as much as possible and make the test suite run as fast as possible.
If mere mortals are not able to run the test suite before checking in
code, then the system will always be broken.   

The dashboards help for checking problems on platforms that the
developer does not have, but the tests should always be run, and
should pass, on the machine where the checkin is being done.  For this
to happen, it has to be easy and fast for developers to run the tests.  The current
testing setup in SWIG builds many shared libraries.  Consolidating
the tests into fewer shared modules will go a long way toward
speeding up the testing of SWIG.  When decreasing the number of tests,
it is always a good idea to make sure code coverage is not lost.
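To make the idea concrete, here is a minimal sketch of what such a
consolidation might look like in a CMakeLists.txt.  The file names and
the test script are hypothetical, not SWIG's actual test layout; the
PERL_EXECUTABLE and PERL_INCLUDE_PATH variables come from CMake's
FindPerl and FindPerlLibs modules:

```cmake
# Hypothetical sketch: build one shared module from several generated
# test wrappers instead of one shared library per test case.
find_package(Perl REQUIRED)
find_package(PerlLibs REQUIRED)

# Illustrative wrapper sources; SWIG's real test suite names differ.
set(TEST_WRAPPERS
    test_overload_wrap.cxx
    test_template_wrap.cxx
    test_typemap_wrap.cxx)

# One add_library call replaces N per-test libraries, so the
# compile/link machinery runs once instead of N times.
add_library(perl_tests_consolidated MODULE ${TEST_WRAPPERS})
target_include_directories(perl_tests_consolidated
                           PRIVATE ${PERL_INCLUDE_PATH})

# A single CTest entry that drives every wrapper from one script
# (run_all_tests.pl is a hypothetical driver).
add_test(NAME perl_consolidated
         COMMAND ${PERL_EXECUTABLE} run_all_tests.pl)
```

The point of the single add_library call is that the fixed per-library
overhead (linker startup, module load at test time) is paid once
rather than once per test case.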

I have done a coverage run for SWIG; the results can be seen here:

http://public.kitware.com/Public/Sites/ringworld.kitwarein.com/Linux-c++/20031215-1700-Experimental/CoverageByName.html

This was SWIG run with the Perl tests only.  However, you can see that the Perl
tests do not cover the Perl module completely:

http://public.kitware.com/Public/Sites/ringworld.kitwarein.com/Linux-c++/Coverage/__Source_Modules_perl5_cxx.html

I have been able to get the CMake test runs down to about the same time as, or
even a little less than, the current testing system.

-Bill

At 09:25 AM 12/10/2003, David Fletcher wrote:

>          It sounds like the folks from kitware (CMake's authors) are
>          willing to help with this.  Check out their web page - they
>          seem to know a thing or two about parallel processing, and
>          it sounds like they have machine and people resources to
>          devote to this problem.

