Hi Michael, hi all!
I would like to support your statement; it answers the question
I asked Wolfgang B. some weeks ago.
I was running a deal.II program without any explicit thread calls,
but using UMFPACK to solve the linear systems.
Nevertheless, I observed that my program spawned threads.
It now seems clear that UMFPACK is responsible for those threads,
which is somewhat annoying.
Cheers,
Thomas
On Fri, 28 May 2010, Michael Rapson wrote:
Hi all,
I wanted to follow up on my previous message about setting the number
of threads used by deal.II with some new information.
In my previous message I said that you can control the number of
threads TBB uses by constructing a task_scheduler_init object before
anything else in the code and specifying the desired number of
threads. The required include is:
#include <tbb/task_scheduler_init.h>
using namespace tbb;
Then construct the scheduler object early in your main() function:
task_scheduler_init init(n_desired_threads + 1);
This was more successful than I had realized: stepping through my
program with gdb, I found that the correct number of threads was used
until the program reached a call to SparseDirectUMFPACK (the
initialize routine, I believe), at which point a large number of
threads was spawned. When I switched to deal.II's CG solver, the
program executed using the desired number of threads.
I find this a little surprising because, as far as I know, UMFPACK
itself is not multithreaded. My rough understanding is that the
UMFPACK call happens somehow outside the scope of my original program,
and hence outside the scope of my task_scheduler_init call. If
something not directly related to UMFPACK called task_scheduler_init
outside the scope of my original call, TBB would pick its own default
number of threads, which would explain the observed behavior.
So for the moment I am not using UMFPACK, which is a pity because it
was a very efficient solver for my system, but the method outlined
above seems like a usable way to control the number of threads in most
cases.
Cheers,
Michael
_______________________________________________
dealii mailing list http://poisson.dealii.org/mailman/listinfo/dealii