happen in the stationary body
simulation though.
-- next part --
An HTML attachment was scrubbed...
URL:
<http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20061207/138dbc7f/attachment.htm>
nfigure
> > > > > > framework =
> > > > > > config.framework.Framework(sys.argv[1:]+['--configModules=PETSc.Configure','--optionsModule=PETSc.compilerOptions'], loadArgDB = 0)
> > > > > >   File "/nas/hpctmp/g0306332/petsc-2.3.2-p7/python/BuildSystem/config/framework.py", line 81, in __init__
> > > > > >     self.argDB['debugSections'] = ['screen']
> > > > > >   File "/nas/hpctmp/g0306332/petsc-2.3.2-p7/python/BuildSystem/RDict.py", line 219, in __setitem__
> > > > > >     self.save()
> > > > > >   File "/nas/hpctmp/g0306332/petsc-2.3.2-p7/python/BuildSystem/RDict.py", line 639, in save
> > > > > >     self.saveTimer.start()
> > > > > >   File "/usr/lib/python2.2/threading.py", line 396, in start
> > > > > >     _start_new_thread(self.__bootstrap, ())
> > > > > >
> > > > > >
> > > > > > May I know what's wrong? I also can't find any configure.log file.
> > > > > >
> > > > > > Thank you.
> > > > > >
> > > > > > Regards
> > > > >
> > > > >
> > > > > --
> > > > > "Failure has a thousand explanations. Success doesn't need one" -- Sir
> > > > > Alec Guinness
> > > > >
Please send all the output from running make to petsc-maint at mcs.anl.gov
Barry
On Thu, 7 Dec 2006, Saikrishna V. Marella wrote:
> Barry,
>
>
>
> I tried using -with-precision=matsingle but it gives the following error.
>
>
>
Cannot convert PetscScalar* to MatScalar* in assignment

CFD Research Corporation
Huntsville AL 35824
Tel: 256-726-4954  Fax: (4806)
David,
Depending on what you wish to do it may be very easy. The logging
is all done on a per process basis (each process just logs its stuff).
PetscLogPrintSummary() takes as an argument a communicator and summarizes
all the data over THAT communicator. So it may be as simple as calling
PetscLogPrintSummary() once with each of your sub-communicators.
I would like to create two communicator subgroups of PETSC_COMM_WORLD.
Is it possible to use the petsc profiling utilities to profile the two
communicator sub-groups individually?
thank you,
David Fuentes
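A minimal sketch of what this could look like, assuming the PETSc 2.3.x signature PetscLogPrintSummary(comm, filename); the half-and-half split and the output file names are hypothetical, not from the original thread:

```c
/* Hypothetical sketch: split PETSC_COMM_WORLD into two sub-groups and
   summarize the logging over each sub-communicator separately. */
#include "petsc.h"

int main(int argc, char **argv)
{
  MPI_Comm    subcomm;
  PetscMPIInt rank, size;

  PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);
  PetscLogBegin();                       /* turn logging on */
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);

  /* assumed split: color 0 = first half of the ranks, color 1 = second half */
  MPI_Comm_split(PETSC_COMM_WORLD, rank < size/2 ? 0 : 1, rank, &subcomm);

  /* ... each sub-group does its own work here ... */

  /* summarize only over THIS process's sub-communicator */
  PetscLogPrintSummary(subcomm, rank < size/2 ? "group0.log" : "group1.log");

  MPI_Comm_free(&subcomm);
  PetscFinalize();
  return 0;
}
```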
Hi,
I'm trying to plot using X a sparse matrix of order 100,000, with
about 7,000,000 nonzero elements, using
ierr = PetscViewerDrawOpen(PETSC_COMM_WORLD, NULL, "matrix ATA",
PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, &atav
On 12/6/06, Barry Smith wrote:
>
> This is because the default when you do NOT provide iy (you pass PETSC_NULL)
> is not what you expect. You need instead to provide the iy explicitly; in
> this case, since you want to put them in the first slots of w, just pass
> the wv2w again as the iy.
>
>
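In code, Barry's fix is just the fourth argument to VecScatterCreate(); a fragment-level sketch, where v, w, and the index set wv2w are the poster's own (elided) variables and are assumed here:

```c
/* Before, iy was PETSC_NULL, so the destination indices were not what the
   poster expected.  As Barry suggests, pass wv2w again as iy.  The call
   order below matches the PETSc 2.3.x VecScatter interface. */
VecScatter scat;
ierr = VecScatterCreate(v, wv2w, w, wv2w, &scat);CHKERRQ(ierr);
ierr = VecScatterBegin(v, w, INSERT_VALUES, SCATTER_FORWARD, scat);CHKERRQ(ierr);
ierr = VecScatterEnd(v, w, INSERT_VALUES, SCATTER_FORWARD, scat);CHKERRQ(ierr);
ierr = VecScatterDestroy(scat);CHKERRQ(ierr);
```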
The code is not scalable for large matrices; I would argue that
it doesn't make sense to visualize such large matrices in this way.
Each matrix entry is tiny compared to a single pixel on the screen.
For PDE problems and most others, the non-zero structure of the
matrix is well represented wit
Yes, the implementation is very bad. It assembles the whole matrix on process 0
before viewing. It is intended for very small matrices only. Fixing it would
entail rewriting the X viewer to take pieces of the matrix at a time. This is
a low priority for us now, but we would gladly accept contributed code.
On Thu, 7 Dec 2006, Satish Balay wrote:
> I'll suggest using a different machine for the install.
Can you try the following change to python/BuildSystem/RDict.py - and
then run configure with the option '--useThreads=0' - and see if it
works?
Satish
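The rerun Satish asks for would look roughly like the following; the patch to RDict.py itself was not preserved in the archive, so only the configure invocation is shown, using the build directory from the poster's earlier traceback:

```shell
cd /nas/hpctmp/g0306332/petsc-2.3.2-p7
./config/configure.py --useThreads=0
```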
> >   File "/nas/home/enduser/g0306332/python-2.4/lib/python2.5/threading.py", line 434, in start
> >     _start_new_thread(self.__bootstrap, ())
No - it's using the python installed at
/nas/home/enduser/g0306332/python-2.4
What version of redhat? What do you have for:
cat /etc/redhat-release
Ben,
First make sure that the residual has actually gotten
to the tolerance you want. PETSc KSP does NOT stop if the
linear system has not converged; you should call KSPGetConvergedReason()
after each solve to make sure it has converged (as a quick check
you can run with -ksp_converged_reason).
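That check might look like the following fragment; ksp, b, and x are assumed from the poster's (elided) code, and the calls match PETSc's KSP interface:

```c
/* Sketch: verify convergence after every solve instead of assuming it. */
KSPConvergedReason reason;
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
if (reason < 0) {
  /* reason is negative on divergence, positive on convergence */
  ierr = PetscPrintf(PETSC_COMM_WORLD,
                     "Linear solve did not converge, reason %d\n",
                     (int)reason);CHKERRQ(ierr);
}
```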
No, this sounds like a bug in the code. You probably are overwriting
memory somewhere.
Matt
On 12/7/06, Ben Tay wrote:
> Hi,
>
> I have been using a few different Krylov linear solver packages such as
> nspcg, sparsekit and now petsc to solve the linear eqns for my NS solver
> momentum and Poisson
Jianing,
There is no "by hand" in PETSc! All you do is access the next
set of grid points via the indices in the usual way. So
in two dimensions, to access the values to the "left" of x[j][0]
use x[j][-1]; to the "right" of x[j][nx-1] use x[j][nx]. (Recall the
i,j indices are reversed in the C arrays.)
Thanks for your reply.
> > 2) If I use the periodic type for DA, together with SNES to solve a
> > system, when I implement the FormFunctionLocal, do I need to
> > explicitly implement the boundary conditions? Or does the DA periodic
> > type have some built-in functionality?
>
> I am not sure what b