On Tue, Apr 29, 2008 at 8:54 AM,  <Amit.Itagi at seagate.com> wrote:
> Hi,
>
> I spent some more time understanding DA's, and how DA's should serve my
> purpose. Since in the time domain calculation, I will have to scatter from
> the global vector to the local vector and vice-versa at every iteration
> step, I have some follow-up questions.
>
> 1) Does the scattering involve copying the part stored on the local node as
> well (i.e. part of the local vector other than the ghost values), or is the
> local part just accessed by reference ? In the first scenario, this would
> involve allocating twice the storage for the local part.

No, you get a separate local vector, since we reorder to give contiguous
access. So yes, the local part is stored twice; however, unless you run an
explicit code at the very limit of memory, this really does not matter.

> Also, does the scattering of the local part give a big hit in terms of
> CPU time ?

Not for these cartesian topologies with small overlap. This is easy to prove.
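
For concreteness, a minimal sketch of the per-step scatter being discussed,
written against the DA interface of this PETSc generation (later releases
rename DA to DMDA and DAGlobalToLocalBegin/End to DMGlobalToLocalBegin/End);
the grid size and stencil choice are placeholders, and error handling beyond
CHKERRQ is omitted:

  /* Sketch only: one global/local vector pair on a DA.  The local vector
     is separate storage (owned points plus ghosts, reordered), so the
     owned part is copied by the scatter, as noted above. */
  #include "petscda.h"

  int main(int argc, char **argv)
  {
    DA             da;
    Vec            gvec, lvec;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL); CHKERRQ(ierr);

    /* 64x64 2d grid, dof = 1, stencil width 1 (placeholder sizes) */
    ierr = DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_STAR,
                      64, 64, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
                      PETSC_NULL, PETSC_NULL, &da); CHKERRQ(ierr);
    ierr = DACreateGlobalVector(da, &gvec); CHKERRQ(ierr);
    ierr = DACreateLocalVector(da, &lvec); CHKERRQ(ierr);

    /* Once per time step: global -> local fills the ghost values */
    ierr = DAGlobalToLocalBegin(da, gvec, INSERT_VALUES, lvec); CHKERRQ(ierr);
    ierr = DAGlobalToLocalEnd(da, gvec, INSERT_VALUES, lvec); CHKERRQ(ierr);
    /* ... stencil update using lvec ... */
    ierr = DALocalToGlobal(da, lvec, INSERT_VALUES, gvec); CHKERRQ(ierr);

    ierr = VecDestroy(lvec); CHKERRQ(ierr);
    ierr = VecDestroy(gvec); CHKERRQ(ierr);
    ierr = DADestroy(da); CHKERRQ(ierr);
    ierr = PetscFinalize(); CHKERRQ(ierr);
    return 0;
  }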
> 2) In the manual, it says "In most cases, several different vectors can
> share the same communication information (or, in other words, can share a
> given DA)" and "PETSc currently provides no container for multiple arrays
> sharing the same distributed array communication; note, however, that the
> dof parameter handles many cases of interest". I am a bit confused. Suppose
> I have two arrays having the same layout on the regular grid, can I store
> the first array data on one vector, and the second array data on the second
> vector (and have a DA with dof=1, instead of a DA with dof=2), and be able
> to scatter and update the first vector without scattering/updating the
> second vector ?

Yes. You call DAGetGlobalVector() twice, and then when you want one vector
updated, call DALocalToGlobal() or DAGlobalToLocal() with that vector.

  Matt
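
A minimal sketch of that two-vector pattern, with the same caveats about the
DA-era names; the field names E and H and the helper name AdvanceEOnly are
illustrative only, not part of the PETSc API:

  #include "petscda.h"

  /* Illustrative helper: two fields share one dof=1 DA, but only the
     first field is scattered and updated here; the second field is
     never touched by the communication. */
  PetscErrorCode AdvanceEOnly(DA da)
  {
    Vec            E, H, Elocal;
    PetscErrorCode ierr;

    ierr = DAGetGlobalVector(da, &E); CHKERRQ(ierr);   /* first field  */
    ierr = DAGetGlobalVector(da, &H); CHKERRQ(ierr);   /* second field */
    ierr = DAGetLocalVector(da, &Elocal); CHKERRQ(ierr);

    /* Refresh ghosts for E alone; H shares the DA but is not scattered */
    ierr = DAGlobalToLocalBegin(da, E, INSERT_VALUES, Elocal); CHKERRQ(ierr);
    ierr = DAGlobalToLocalEnd(da, E, INSERT_VALUES, Elocal); CHKERRQ(ierr);
    /* ... stencil work on Elocal, then push the owned part back ... */
    ierr = DALocalToGlobal(da, Elocal, INSERT_VALUES, E); CHKERRQ(ierr);

    ierr = DARestoreLocalVector(da, &Elocal); CHKERRQ(ierr);
    ierr = DARestoreGlobalVector(da, &E); CHKERRQ(ierr);
    ierr = DARestoreGlobalVector(da, &H); CHKERRQ(ierr);
    return 0;
  }

Getting and restoring the vectors inside one routine is only for brevity; in
a real time-stepping loop the two global vectors would be held across steps.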

> Thanks
>
> Rgds,
> Amit

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener