Ethan,
Thanks for the tip; that's a nice way to get multiple
scenarios out of a single mpi_details section.


Mike.

On Mon, Nov 3, 2008 at 5:45 PM, Ethan Mallove <ethan.mall...@sun.com> wrote:

> On Mon, Nov/03/2008 09:34:07AM, Mike Dubman wrote:
> >    Hello Guys,
> >
> >    Please suggest the proper way to handle the following:
> >
> >    Is there any way to run a "Test run" section with a
> >    list of "mpi_details" sections?
>
> Mike,
>
> There is currently no way to iterate over multiple
> mpi_details sections, but there might be an acceptable
> workaround. You can create a simple wrapper script to
> iterate over variations of your MPI details section using
> command line INI file overrides (see
> https://svn.open-mpi.org/trac/mtt/wiki/INIOverrides). E.g.,
> say you have the following MPI details section:
>
>  [MPI details: Open MPI]
>  foo = some default value
>  bar = some default value
>  exec = mpirun @foo@ @bar@ ...
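>
> (The @foo@ and @bar@ tokens in the exec line are replaced
> with the values of the "foo" and "bar" params, so with the
> defaults above, exec would expand to something like
> "mpirun some default value some default value ...".)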
>
> Using command-line INI overrides, you can iterate over a
> series of values for "foo" and/or "bar":
>
>  $ client/mtt --scratch /some/dir ...
>  $ client/mtt --scratch /some/dir --test-run foo=abc ...
>  $ client/mtt --scratch /some/dir --test-run foo=def ...
>  $ client/mtt --scratch /some/dir --test-run bar=uvw ...
>  $ client/mtt --scratch /some/dir --test-run bar=xyz ...
>  ...
>
> Note that in the above example we use the same scratch
> directory for all the runs, and we run only the Test run
> phase (via the --test-run option), since nothing needs to
> be reinstalled or rebuilt as we iterate over the different
> command lines.
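>
> As a rough sketch, the wrapper script could be as simple
> as the following (Bourne shell, reusing the placeholder
> "foo" values from above):
>
>  #!/bin/sh
>  # Re-run only the test run phase, reusing one scratch
>  # directory, with a different "foo" override each time.
>  for val in abc def; do
>      client/mtt --scratch /some/dir --test-run foo=$val
>  done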
>
> Could the above be of use for what you're trying to do?
>
> -Ethan
>
>
> >
> >    Or, how can we execute a specific "Test run" section
> >    against a specific "mpi_details" section, where the
> >    "mpi_details" section can have many different scenarios
> >    of command-line parameters (i.e., a single "mpi_details"
> >    section should be executed once for each of its
> >    available scenarios)? Is that possible? (This would be
> >    similar to the @np param treatment available inside the
> >    "mpi_details" section.)
> >
> >    Thanks
> >
> >    Mike.
>
