The redirection is the problem with sudo - the '>>' is performed by your
own (non-root) shell, not by the command run under sudo - so the following
is the correction:
echo 127.0.0.1 `hostname` | sudo tee -a /etc/hosts
Satish
On Thu, 15 Jun 2017, Satish Balay wrote:
> The command below should work on macOS as well.
>
> Satish
>
> On Thu, 15 Jun 2017, Mohamadreza Soltanian wrote:
>
>
On 14/06/17 07:45, Jed Brown wrote:
Barry Smith writes:
On Jun 13, 2017, at 10:06 AM, Jed Brown wrote:
Adrian Croucher writes:
One way might be to form the whole Jacobian but somehow use a modified
KSP solve which would implement the reduction process, do a KSP solve on
the reduced system
The command below should work on macOS as well.
Satish
On Thu, 15 Jun 2017, Mohamadreza Soltanian wrote:
> Hello Satish
>
> Thank you for your reply. I am using macOS Sierra 10.12.5.
>
> Thanks
>
>
>
> On Thu, Jun 15, 2017 at 12:07 AM, Satish Balay wrote:
>
> > Likely its hanging in gethost
Hello Satish
Thank you for your reply. I am using macOS Sierra 10.12.5.
Thanks
On Thu, Jun 15, 2017 at 12:07 AM, Satish Balay wrote:
> Likely it's hanging in the gethostbyname() call - which is caused
> by a mismatch in hostname.
>
> What OS are you using? If Linux - can you do the following?
>
>
Likely it's hanging in the gethostbyname() call - which is caused
by a mismatch in hostname.
What OS are you using? If Linux - can you do the following?
sudo echo 127.0.0.1 `hostname` >> /etc/hosts
And retry?
Satish
On Thu, 15 Jun 2017, Mohamadreza Soltanian wrote:
> Hello All,
>
> I am trying to
Hello All,
I am trying to install and test PETSc. When I get to the test part,
everything seems to get stuck at the following line. I was wondering if anyone
can help. Thank you.
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI
process
> On Jun 8, 2017, at 2:56 PM, Xiangdong wrote:
>
>
> On Thu, Jun 8, 2017 at 3:17 PM, Hong wrote:
> Xiangdong:
> MatCreateMPIBAIJWithArrays() is obviously buggy, and has not been tested.
>
>
> 1) In the remark of the function MatCreateMPIBAIJWithArrays, it says " bs -
> the block size, only a bl
You can't do this:
ierr = MatSetSizes(A,PETSC_DECIDE,N,N,N);CHKERRQ(ierr);
Use PETSC_DECIDE for the third argument (the local column size).
Also, this is wrong:
for (i = Istart; i < Iend; ++i)
{
ierr = MatSetValue(A,i,i,2,INSERT_VALUES);CHKERRQ(ierr);
ierr = MatSetValue(A,i+1,i,-1,INSERT_VALUES);C
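For illustration, a minimal sketch of the corrected calls, assuming a square
matrix of global size N and the same simple two-entry stencil as the loop
above; the MatGetOwnershipRange() call and the i+1 < N bound check are
illustrative assumptions, not taken from the original code:
/* assumes: Mat A; PetscInt i,Istart,Iend,N; PetscErrorCode ierr; */
ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr); /* let PETSc pick both local sizes */
ierr = MatSetUp(A);CHKERRQ(ierr);
ierr = MatGetOwnershipRange(A,&Istart,&Iend);CHKERRQ(ierr);
for (i = Istart; i < Iend; ++i) {
  ierr = MatSetValue(A,i,i,2.0,INSERT_VALUES);CHKERRQ(ierr);
  if (i+1 < N) { /* skip the entry that would fall outside the global size on the last row */
    ierr = MatSetValue(A,i+1,i,-1.0,INSERT_VALUES);CHKERRQ(ierr);
  }
}
ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);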
Here is the line that generates an error:
ierr = MPI_Allreduce(bv->work,y,len,MPIU_SCALAR,MPIU_SUM,PetscObjectComm((PetscObject)bv));CHKERRQ(ierr);
let's see what the MPI error is by running again with the additional command
line option -on_error_abort
hopefully MPI will say something u
--
Regards,
Ramki
On 6/14/17, 5:21 PM, "Barry Smith" wrote:
Send the file
autofs/nccs-svm1_home1/ramki/libraries/slepc-3.7.3/src/sys/classes/bv/interface/bvblas.c
as an attachment.
Barry
> On Jun 14, 2017, at 4:17 PM, Kannan, Ramakrishnan
wrote:
>
Send the file
autofs/nccs-svm1_home1/ramki/libraries/slepc-3.7.3/src/sys/classes/bv/interface/bvblas.c
as an attachment.
Barry
> On Jun 14, 2017, at 4:17 PM, Kannan, Ramakrishnan wrote:
>
> Barry,
>
> Appreciate your kind help. It compiles fine. I am still getting the following
> err
Barry,
Appreciate your kind help. It compiles fine. I am still getting the following
error.
[0]PETSC ERROR: #1 BVDotVec_BLAS_Private() line 272 in /autofs/nccs-svm1_home1/ramki/libraries/slepc-3.7.3/src/sys/classes/bv/interface/bvblas.c
[0]PETSC ERROR: #2 BVDotVec_Svec() line 150 in /autofs/nc
> On Jun 14, 2017, at 3:45 PM, Kannan, Ramakrishnan wrote:
>
> Barry,
>
> All the functions here are standard SLEPC functions and there are no
> user-defined or custom code here. As you can see, when I uncomment the
> CHKERRQ macros in my code, I am getting the compilation error.
Yes tha
Barry,
All the functions here are standard SLEPC functions and there are no
user-defined or custom code here. As you can see, when I uncomment the CHKERRQ
macros in my code, I am getting the compilation error.
--
Regards,
Ramki
On 6/14/17, 4:40 PM, "Barry Smith" wrote:
> On Jun
> On Jun 14, 2017, at 3:33 PM, Kannan, Ramakrishnan wrote:
>
> Can I use CHKERRV instead of CHKERRQ? Will that help?
You can do that. But I question having functions in your code that
return void instead of an error code. Without error codes you are just hurting
your own productivity.
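For illustration, a minimal sketch contrasting the two patterns; the function
names and the assembly calls are placeholders, not from the code under
discussion:
#include <petscmat.h>

/* Preferred: the function returns PetscErrorCode, so CHKERRQ can propagate failures. */
PetscErrorCode BuildOperator(Mat A)
{
  PetscErrorCode ierr;
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}

/* If the enclosing function must stay void, CHKERRV reports the error and returns,
   but the caller has no way to detect that something went wrong. */
void BuildOperatorVoid(Mat A)
{
  PetscErrorCode ierr;
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRV(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRV(ierr);
}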
Can I use CHKERRV instead of CHKERRQ? Will that help?
--
Regards,
Ramki
On 6/14/17, 4:25 PM, "Kannan, Ramakrishnan" wrote:
I get the following compilation error when I have CHKERRQ.
/opt/cray/petsc/3.7.4.0/real/GNU64/5.1/sandybridge/include/petscerror.h:433:154:
error: return-
I get the following compilation error when I have CHKERRQ.
/opt/cray/petsc/3.7.4.0/real/GNU64/5.1/sandybridge/include/petscerror.h:433:154:
error: return-statement with a value, in function returning 'void'
[-fpermissive]
#define CHKERRQ(n) do {if (PetscUnlikely(n)) return PetscErr
http://www.mcs.anl.gov/petsc/documentation/faq.html#efficient-assembly
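The main advice there is to preallocate before assembling. For illustration, a
minimal sketch for an MPIAIJ matrix with roughly 16 nonzeros per row (as
described in the quoted message); the counts are illustrative upper bounds,
not measured values:
/* assumes: Mat A; PetscInt N; PetscErrorCode ierr; */
ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
ierr = MatSetFromOptions(A);CHKERRQ(ierr);
ierr = MatMPIAIJSetPreallocation(A,16,NULL,16,NULL);CHKERRQ(ierr); /* d_nz, d_nnz, o_nz, o_nnz */
ierr = MatSeqAIJSetPreallocation(A,16,NULL);CHKERRQ(ierr);         /* covers a 1-process run */
/* ... MatSetValues() loop ... */
ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);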
> On Jun 14, 2017, at 3:18 PM, Jed Brown wrote:
>
> "Kannan, Ramakrishnan" writes:
>
>> I am running NHEP across 16 MPI processors over 16 nodes in a matrix of
>> global size of 1,000,000x1,000,000 with approximately gl
"Kannan, Ramakrishnan" writes:
> I am running NHEP across 16 MPI processors over 16 nodes in a matrix of
> global size of 1,000,000x1,000,000 with approximately global 16,000,000
> non-zeros. Each node has the 1D row distribution of the matrix with exactly
> 62500 rows and 1 million columns w
Why do you have the CHKERRQ(ierr); commented out in your code?
Because of this you are getting mangled, confusing error messages.
Put an ierr = in front of all calls and a CHKERRQ(ierr); after each call.
Then resend the new error message which will be much clearer.
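For illustration, a minimal sketch of the requested pattern using generic
SLEPc calls; these particular calls are placeholders, since the actual code is
not shown in this thread:
/* assumes: EPS eps; Mat A; PetscErrorCode ierr; */
ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);
ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
ierr = EPSSolve(eps);CHKERRQ(ierr);   /* any failure now reports its file and line and propagates up */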
> On Jun 14, 2
I am running NHEP across 16 MPI processes over 16 nodes on a matrix of global
size 1,000,000 x 1,000,000 with approximately 16,000,000 global non-zeros.
Each node has the 1D row distribution of the matrix, with exactly 62,500 rows
and 1 million columns, and 1 million non-zeros as a CSR/COO matrix.
On Wed, 14 Jun 2017 at 19:42, David Nolte wrote:
> Dave, thanks a lot for your great answer and for sharing your experience.
> I have a much clearer picture now. :)
>
> The experiments 3/ give the desired results for examples of cavity flow.
> The (1/mu scaled) mass matrix seems OK.
>
> I followe
BTW: you might consider using the 'master' branch from the petsc git repo. The
Fortran module support is revamped in it.
Satish
On Wed, 14 Jun 2017, Satish Balay wrote:
> attaching fixed files.
>
> $ make test
> mpif90 -c -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
> -I/home/bala
attaching fixed files.
$ make test
mpif90 -c -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
-I/home/balay/tmp/petsc/include
-I/home/balay/tmp/petsc/arch-linux2-c-debug/include -o my_module.o my_module.F90
mpif90 -c -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argumen
Hello,
I am a beginner with PETSc and I'm trying
to compile a very simple Fortran program "test", with
a calling program in "test.F90" and a module "my_module.F90".
Unfortunately, I do not know how to write the makefile properly
to be able to compile the module with "#include petsc" statements