Re: [petsc-users] Problem compiling with 64bit PETSc

2018-09-06 Thread TAY wee-beng

Hi,

Thank you very much. I got it working now.

Yours sincerely,


TAY Wee-Beng (Zheng Weiming) 郑伟明
Personal research webpage: http://tayweebeng.wixsite.com/website
Youtube research showcase: 
https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
linkedin: www.linkedin.com/in/tay-weebeng


On 7/9/2018 8:17 AM, Smith, Barry F. wrote:

   Ignore this message; MPIU_INTEGER already exists. You can just use it to 
replace MPI_INTEGER in places where you are passing PetscInt through MPI calls.

Barry



On Sep 6, 2018, at 5:46 PM, Smith, Barry F.  wrote:


   I have added MPIU_INTEGER to the branch barry/add-mpiu_integer  so you can 
make your MPI calls portable between 32 and 64 bit indices in Fortran.

   Please let me know if you have any difficulties.

   Barry



On Sep 5, 2018, at 11:58 PM, TAY wee-beng  wrote:


On 6/9/2018 12:09 PM, Smith, Barry F. wrote:

On Sep 5, 2018, at 11:01 PM, Randall Mackie  wrote:

You can use PetscMPIInt for integers in MPI calls.

Check petscsys.h for definitions of all of these.

This is true, but it can be cumbersome because one may need to convert arrays 
of PetscInt to PetscMPIInt for MPI and then convert back afterwards (with possible 
loss of precision).

 In C we have the macro MPIU_INT, which indicates that the integer 
argument to the MPI call is 64-bit when 64-bit indices are used and 32-bit 
otherwise, allowing users to write portable code that can simply be reconfigured 
for 32- or 64-bit integers. I see we do not provide such a thing for Fortran; we 
should provide it. Unless Karl has coding time today, we won't be able to get to 
it until tomorrow US time, since it is already late in the US.

Barry

Hi,

That would be great! I've no problem waiting a few days. Does it mean that I 
can then use:

call 
MPI_ALLGATHER(counter,1,MPIU_INT,counter_global,1,MPIU_INT,MPI_COMM_WORLD,ierr)

Thanks!

Randy



On Sep 5, 2018, at 8:56 PM, TAY wee-beng  wrote:

Hi,

My code has some problems after converting to 64-bit indices.

After debugging, I realised that I'm using:

call 
MPI_ALLGATHER(counter,1,MPI_INTEGER,counter_global,1,MPI_INTEGER,MPI_COMM_WORLD,ierr)

but now counter and counter_global are both 64-bit integers. So should I change 
all MPI routines from MPI_INTEGER to MPI_INTEGER8?

But if I then switch back to the 32-bit PETSc, do I have to change them back again? 
In that case, does it mean I need two copies of my code: one to compile 
with 32-bit PETSc and another with 64-bit PETSc?

Is there an easier way?
Thank you very much.

Yours sincerely,


TAY Wee-Beng (Zheng Weiming) 郑伟明



On 5/9/2018 6:25 PM, Matthew Knepley wrote:

On Wed, Sep 5, 2018 at 3:27 AM TAY wee-beng  wrote:

On 31/8/2018 10:43 AM, Smith, Barry F. wrote:

On Aug 30, 2018, at 9:40 PM, TAY wee-beng  wrote:


On 31/8/2018 10:38 AM, Smith, Barry F. wrote:

   PetscReal is by default real(8); you can leave those alone.

Any integer you pass to a PETSc routine needs to be declared as PetscInt 
(not integer); otherwise the 64-bit indices won't work.

Barry


Hi,

ok, I got it. By the way, is it advisable to change all integers in my code to PetscInt?

Will it cause any conflict or waste a lot of memory?

Or should I only change those related to PETSc?

 That is up to you. Since you probably pass values between the PETSc and 
non-PETSc parts of the code, it is probably easier just to make all the integers 
PetscInt. There is no measurable performance difference from keeping a 
few plain integers around.

 Barry

Hi,

For some small parts of the code, it is preferable to use integer
instead. By the way, to force a variable to integer, I can use int(aa). However,
when I tried to force a variable to PetscInt using PetscInt(aa), it didn't work.

Is there any way I can make it work?

I think you just define a PetscInt variable and use assignment.

   Matt

On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:


On 31/8/2018 10:21 AM, Matthew Knepley wrote:

On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:
Hi,

Due to my increased grid size, I have to go 64-bit. I compiled the 64-bit
PETSc without error. However, when I tried to compile my code using the
64-bit PETSc, I got the error below. May I know why this is so?

What changes should I make?

Is it possible that you did not declare some inputs as PetscInt, so the 
interface check is failing?

Matt

Hi,

I'm using the standard

integer ::

real(8) ::

for some variables. For some others relating to PETSc, I use PetscInt.

Should I change all to PetscInt and PetscReal?

Currently, I use real(8) for all real values. If I change all to PetscReal, 
will PetscReal be real or real(8) by default?

Thanks!

[tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save  -w
-I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
-I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
global.F90
global.F90(979): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
global.F90(989): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
call
DMDACre



Re: [petsc-users] Problem compiling with 64bit PETSc

2018-09-06 Thread Karl Rupp

Hey,


  In C we have macros MPIU_INT that we use to indicate that the integer 
argument to the MPI call is 64 bit when 64 bit indices are used and 32 bit 
otherwise allowing users to write portable code that can just be reconfigured 
for 32 or 64 bit integers. I see we do not provide such a thing for Fortran; we 
should provide it. Unless Karl has coding time today we won't be able to get to 
it until tomorrow US time since it is already late in the US.


well, I'm not sufficiently experienced with the Fortran bindings (yet) 
to make this happen today...


Best regards,
Karli



Re: [petsc-users] Problem compiling with 64bit PETSc

2018-09-05 Thread Smith, Barry F.


> On Sep 5, 2018, at 11:58 PM, TAY wee-beng  wrote:
> 
> That would be great! I've no problem waiting a few days. Does it mean that I 
> can then use:
> 
> call 
> MPI_ALLGATHER(counter,1,MPIU_INT,counter_global,1,MPIU_INT,MPI_COMM_WORLD,ierr)

   Likely, to be consistent with the Fortran MPI naming (MPI_INTEGER), we would 
call it MPIU_INTEGER rather than MPIU_INT, but otherwise yes.




Re: [petsc-users] Problem compiling with 64bit PETSc

2018-09-05 Thread TAY wee-beng

Hi,

My code has some problems now after converting to 64bit indices.

After debugging, I realised that I'm using:

call 
MPI_ALLGATHER(counter,1,MPI_INTEGER,counter_global,1,MPI_INTEGER,MPI_COMM_WORLD,ierr)


but now counter and counter_global are both 64-bit integers. So should I 
change all MPI routines from MPI_INTEGER to MPI_INTEGER8?


But if I switch back to the 32-bit PETSc, do I have to change them all back 
again? In that case, would I need to keep 2 copies of my code - 
one to compile with 32-bit PETSc, another to compile with 64-bit PETSc?


Is there an easier way?
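
Barry's follow-up in this thread notes that MPIU_INTEGER already exists in 
PETSc and maps to the MPI datatype matching PetscInt, so the same call works 
for both 32-bit and 64-bit index builds. A minimal sketch, assuming the usual 
PETSc Fortran includes and that nprocs (a hypothetical name) holds the 
communicator size:

```fortran
! Sketch only: MPIU_INTEGER resolves to the MPI type matching PetscInt,
! whether PETSc was configured with 32-bit or 64-bit indices.
PetscInt       :: counter
PetscInt       :: counter_global(nprocs)   ! nprocs assumed already set
PetscErrorCode :: ierr

call MPI_ALLGATHER(counter, 1, MPIU_INTEGER, counter_global, 1, &
                   MPIU_INTEGER, MPI_COMM_WORLD, ierr)
```

With this, no second copy of the code is needed; only the PETSc 
configuration changes.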

Thank you very much.

Yours sincerely,


TAY Wee-Beng (Zheng Weiming) 郑伟明
Personal research webpage: http://tayweebeng.wixsite.com/website
Youtube research showcase: 
https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
linkedin: www.linkedin.com/in/tay-weebeng


On 5/9/2018 6:25 PM, Matthew Knepley wrote:
On Wed, Sep 5, 2018 at 3:27 AM TAY wee-beng wrote:



On 31/8/2018 10:43 AM, Smith, Barry F. wrote:
>
>> On Aug 30, 2018, at 9:40 PM, TAY wee-beng wrote:
>>
>>
>> On 31/8/2018 10:38 AM, Smith, Barry F. wrote:
>>>    PetscReal is by default real(8) you can leave those alone
>>>
>>>     Any integer you pass to a PETSc routine needs to be
declared as PetscInt (not integer) otherwise the 64 bit indices
stuff won't work.
>>>
>>>     Barry
>>>
>> Hi,
>>
>> ok, I got it. Btw, is it advisable to change all integer in my
code to PetscInt?
>>
>> Will it cause any conflict or waste a lot of memory?
>>
>> Or should I only change those related to PETSc?
>      That is up to you. Since you probably pass the values
between PETSc and non-PETSc part of the code it is probably easier
just to make all the integer PetscInt instead. No performance
difference that you can measure by keeping a few integer around.
>
>      Barry
Hi,

For some small parts of the code, it is preferred to use integer
instead. Btw, to force variable as integer, I can use int(aa).
However,
I tried to force variable as PetscInt using PetscInt(aa) but it
can't work.

Is there any way I can make it work?


I think you just define a PetscInt variable and use assignment.

   Matt

Thanks.
>> Thanks!
 On Aug 30, 2018, at 9:35 PM, TAY wee-beng wrote:


 On 31/8/2018 10:21 AM, Matthew Knepley wrote:
> On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng wrote:
> Hi,
>
> Due to my increase grid size, I have to go 64bit. I compiled
the 64bit
> PETSc w/o error. However, when I tried to compile my code
using the
> 64bit PETSc, I got the error below. May I know why is this so?
>
> What changes should I make?
>
> Is it possible that you did not declare some inputs as
PetscInt, so the interface check is failing?
>
>     Matt
 Hi,

 I'm using the standard

 integer ::

 real(8) ::

 for some variables. For some others relating to PETSc, I use
PetscInt.

 Should I change all to PetscInt and PetscReal?

 Currently, I use real(8) for all real values. If I change all
to PetscReal, will PetscReal be real or real(8) by default?

 Thanks!
>
> [tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
>

/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
> -g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
>

/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
> -g -ip -ipo -O3 -c -fPIC  -save  -w
>

-I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
>

-I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
> global.F90
> global.F90(979): error #6285: There is no matching specific
subroutine
> for this generic subroutine call.  [DMDACREATE3D]
> call
>

DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(989): error #6285: There is no matching specific
subroutine
> for this generic subroutine call.  [DMDACREATE3D]
>       call
>

DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(997): error #6285: There is no matching specific
subroutine
> for this generic subroutine call.  [DMDACREATE3D]
>       call
>

DM

Re: [petsc-users] Problem compiling with 64bit PETSc

2018-09-05 Thread Matthew Knepley
On Wed, Sep 5, 2018 at 3:27 AM TAY wee-beng  wrote:

>
> On 31/8/2018 10:43 AM, Smith, Barry F. wrote:
> >
> >> On Aug 30, 2018, at 9:40 PM, TAY wee-beng  wrote:
> >>
> >>
> >> On 31/8/2018 10:38 AM, Smith, Barry F. wrote:
> >>>PetscReal is by default real(8) you can leave those alone
> >>>
> >>> Any integer you pass to a PETSc routine needs to be declared as
> PetscInt (not integer) otherwise the 64 bit indices stuff won't work.
> >>>
> >>> Barry
> >>>
> >> Hi,
> >>
> >> ok, I got it. Btw, is it advisable to change all integer in my code to
> PetscInt?
> >>
> >> Will it cause any conflict or waste a lot of memory?
> >>
> >> Or should I only change those related to PETSc?
> >  That is up to you. Since you probably pass the values between PETSc
> and non-PETSc part of the code it is probably easier just to make all the
> integer PetscInt instead. No performance difference that you can measure by
> keeping a few integer around.
> >
> >  Barry
> Hi,
>
> For some small parts of the code, it is preferred to use integer
> instead. Btw, to force variable as integer, I can use int(aa). However,
> I tried to force variable as PetscInt using PetscInt(aa) but it can't work.
>
> Is there any way I can make it work?
>

I think you just define a PetscInt variable and use assignment.

   Matt
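
A short sketch of the suggestion above: Fortran has no PetscInt(aa) 
conversion intrinsic, but assignment between integer kinds converts 
automatically (aa and bb are illustrative names):

```fortran
! Assignment converts between default integer and PetscInt kinds;
! no explicit PetscInt(...) cast exists.
integer  :: aa
PetscInt :: bb

aa = 10
bb = aa        ! widening: always safe
aa = int(bb)   ! narrowing: safe only while the value fits in integer
```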


> Thanks.
> >> Thanks!
>  On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:
> 
> 
>  On 31/8/2018 10:21 AM, Matthew Knepley wrote:
> > On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng 
> wrote:
> > Hi,
> >
> > Due to my increase grid size, I have to go 64bit. I compiled the
> 64bit
> > PETSc w/o error. However, when I tried to compile my code using the
> > 64bit PETSc, I got the error below. May I know why is this so?
> >
> > What changes should I make?
> >
> > Is it possible that you did not declare some inputs as PetscInt, so
> the interface check is failing?
> >
> > Matt
>  Hi,
> 
>  I'm using the standard
> 
>  integer ::
> 
>  real(8) ::
> 
>  for some variables. For some others relating to PETSc, I use PetscInt.
> 
>  Should I change all to PetscInt and PetscReal?
> 
>  Currently, I use real(8) for all real values. If I change all to
> PetscReal, will PetscReal be real or real(8) by default?
> 
>  Thanks!
> >
> > [tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
> >
> /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
> > -g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
> >
> /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
> > -g -ip -ipo -O3 -c -fPIC  -save  -w
> >
> -I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
> >
> -I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
> > global.F90
> > global.F90(979): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> > call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> > -^
> > global.F90(989): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> >   call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> > -^
> > global.F90(997): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> >   call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> > -^
> > global.F90(1005): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> >   call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> > -^
> > global.F90(1013): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> >   call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> > -^
> > global.F90(1021): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> >   call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> > -^
> > global.F90(1029): error #6285: There is no matching specific
> subroutine
> > for this generic subroutine call.   [DMDACREATE3D]
> >   call
> >
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,si

Re: [petsc-users] Problem compiling with 64bit PETSc

2018-09-05 Thread TAY wee-beng



On 31/8/2018 10:43 AM, Smith, Barry F. wrote:



On Aug 30, 2018, at 9:40 PM, TAY wee-beng  wrote:


On 31/8/2018 10:38 AM, Smith, Barry F. wrote:

   PetscReal is by default real(8) you can leave those alone

Any integer you pass to a PETSc routine needs to be declared as PetscInt 
(not integer) otherwise the 64 bit indices stuff won't work.

Barry


Hi,

ok, I got it. Btw, is it advisable to change all integer in my code to PetscInt?

Will it cause any conflict or waste a lot of memory?

Or should I only change those related to PETSc?

 That is up to you. Since you probably pass the values between PETSc and 
non-PETSc part of the code it is probably easier just to make all the integer 
PetscInt instead. No performance difference that you can measure by keeping a 
few integer around.

 Barry

Hi,

For some small parts of the code, it is preferable to use integer 
instead. Btw, to convert a variable to integer, I can use int(aa). However, 
when I tried to convert a variable to PetscInt using PetscInt(aa), it didn't work.


Is there any way I can make it work?

Thanks.

Thanks!

On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:


On 31/8/2018 10:21 AM, Matthew Knepley wrote:

On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:
Hi,

Due to my increased grid size, I have to go 64-bit. I compiled the 64-bit
PETSc without error. However, when I tried to compile my code using the
64-bit PETSc, I got the error below. May I know why this is so?

What changes should I make?

Is it possible that you did not declare some inputs as PetscInt, so the 
interface check is failing?

Matt

Hi,

I'm using the standard

integer ::

real(8) ::

for some variables. For some others relating to PETSc, I use PetscInt.

Should I change all to PetscInt and PetscReal?

Currently, I use real(8) for all real values. If I change all to PetscReal, 
will PetscReal be real or real(8) by default?

Thanks!
  
[tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018

/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save  -w
-I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
-I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
global.F90
global.F90(979): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(989): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(997): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1005): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1013): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1021): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1029): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
compilation aborted for global.F90 (code 1)

--
Thank you very much.

Yours sincerely,






--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/




Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread TAY wee-beng

Hi Randy,

I was about to email to ask why it didn't work even after changing them to 
PetscInt. Then I realised that ierr should be defined as PetscErrorCode.
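
For reference, a hedged sketch of the declaration pattern discussed in this 
thread (variable names other than ierr are illustrative):

```fortran
! ierr must be PetscErrorCode, not PetscInt or integer.
PetscErrorCode :: ierr
! Anything passed to PETSc as an index or size should be PetscInt.
PetscInt       :: size_x, size_y, size_z
! PetscReal is real(8) in a default (double-precision) PETSc build.
PetscReal      :: tol
```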


Thank you very much.

Yours sincerely,




On 31/8/2018 11:36 AM, Randall Mackie wrote:

Don’t forget that not all integers passed into PETSc routines are 64 bit.
For example, the error codes when called from Fortran should be 
defined as PetscErrorCode and not PetscInt.
It’s really good to get into the habit of correctly declaring all 
PETSc variables according to the web pages, so that you can easily 
switch between 32bit and 64bit integers.


Randy


On Aug 30, 2018, at 7:43 PM, Smith, Barry F. wrote:




On Aug 30, 2018, at 9:40 PM, TAY wee-beng wrote:



On 31/8/2018 10:38 AM, Smith, Barry F. wrote:

 PetscReal is by default real(8) you can leave those alone

  Any integer you pass to a PETSc routine needs to be declared as 
PetscInt (not integer) otherwise the 64 bit indices stuff won't work.


  Barry


Hi,

ok, I got it. Btw, is it advisable to change all integer in my code 
to PetscInt?


Will it cause any conflict or waste a lot of memory?

Or should I only change those related to PETSc?


   That is up to you. Since you probably pass the values between 
PETSc and non-PETSc part of the code it is probably easier just to 
make all the integer PetscInt instead. No performance difference that 
you can measure by keeping a few integer around.


   Barry



Thanks!
On Aug 30, 2018, at 9:35 PM, TAY wee-beng wrote:



On 31/8/2018 10:21 AM, Matthew Knepley wrote:
On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng wrote:

Hi,

Due to my increase grid size, I have to go 64bit. I compiled the 
64bit

PETSc w/o error. However, when I tried to compile my code using the
64bit PETSc, I got the error below. May I know why is this so?

What changes should I make?

Is it possible that you did not declare some inputs as PetscInt, 
so the interface check is failing?


  Matt

Hi,

I'm using the standard

integer ::

real(8) ::

for some variables. For some others relating to PETSc, I use PetscInt.

Should I change all to PetscInt and PetscReal?

Currently, I use real(8) for all real values. If I change all to 
PetscReal, will PetscReal be real or real(8) by default?


Thanks!


[tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save  -w
-I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
-I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
global.F90
global.F90(979): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(989): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(997): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1005): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1013): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1021): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1029): error #6285: There is no matching specific 
subroutine

for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
compilation aborted for global.F90 (code 1)

Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread Randall Mackie
Don’t forget that not all integers passed into PETSc routines are 64 bit.
For example, the error codes when called from Fortran should be defined as 
PetscErrorCode and not PetscInt.
It’s really good to get into the habit of correctly declaring all PETSc 
variables according to the web pages, so that you can easily switch between 
32bit and 64bit integers.

Randy


> On Aug 30, 2018, at 7:43 PM, Smith, Barry F.  wrote:
> 
> 
> 
>> On Aug 30, 2018, at 9:40 PM, TAY wee-beng  wrote:
>> 
>> 
>> On 31/8/2018 10:38 AM, Smith, Barry F. wrote:
>>>  PetscReal is by default real(8) you can leave those alone
>>> 
>>>   Any integer you pass to a PETSc routine needs to be declared as PetscInt 
>>> (not integer) otherwise the 64 bit indices stuff won't work.
>>> 
>>>   Barry
>>> 
>> Hi,
>> 
>> ok, I got it. Btw, is it advisable to change all integer in my code to 
>> PetscInt?
>> 
>> Will it cause any conflict or waste a lot of memory?
>> 
>> Or should I only change those related to PETSc?
> 
>That is up to you. Since you probably pass the values between PETSc and 
> non-PETSc part of the code it is probably easier just to make all the integer 
> PetscInt instead. No performance difference that you can measure by keeping a 
> few integer around.
> 
>Barry
> 
>> 
>> Thanks!
 On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:
 
 
 On 31/8/2018 10:21 AM, Matthew Knepley wrote:
> On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:
> Hi,
> 
> Due to my increase grid size, I have to go 64bit. I compiled the 64bit
> PETSc w/o error. However, when I tried to compile my code using the
> 64bit PETSc, I got the error below. May I know why is this so?
> 
> What changes should I make?
> 
> Is it possible that you did not declare some inputs as PetscInt, so the 
> interface check is failing?
> 
>   Matt
 Hi,
 
 I'm using the standard
 
 integer ::
 
 real(8) ::
 
 for some variables. For some others relating to PETSc, I use PetscInt.
 
 Should I change all to PetscInt and PetscReal?
 
 Currently, I use real(8) for all real values. If I change all to 
 PetscReal, will PetscReal be real or real(8) by default?
 
 Thanks!
> 
> [tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
> /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
> -g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
> /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
> -g -ip -ipo -O3 -c -fPIC  -save  -w
> -I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
> -I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
> global.F90
> global.F90(979): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(989): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(997): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(1005): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(1013): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(1021): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> global.F90(1029): error #6285: There is no matching specific subroutine
> for this generic subroutine call.   [DMDACREATE3D]
> call
> DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
> -^
> compilation aborted for global.F90 (code 1)
> 
> -- 
> Thank you very much.
> 
> Yours sincerely,
> 
> 
> TAY Wee-Beng (Zheng

Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread Smith, Barry F.


> On Aug 30, 2018, at 9:40 PM, TAY wee-beng  wrote:
> 
> 
> On 31/8/2018 10:38 AM, Smith, Barry F. wrote:
>>   PetscReal is by default real(8) you can leave those alone
>> 
>>Any integer you pass to a PETSc routine needs to be declared as PetscInt 
>> (not integer) otherwise the 64 bit indices stuff won't work.
>> 
>>Barry
>> 
> Hi,
> 
> ok, I got it. Btw, is it advisable to change all integer in my code to 
> PetscInt?
> 
> Will it cause any conflict or waste a lot of memory?
> 
> Or should I only change those related to PETSc?

That is up to you. Since you probably pass the values between PETSc and 
non-PETSc part of the code it is probably easier just to make all the integer 
PetscInt instead. No performance difference that you can measure by keeping a 
few integer around.

Barry

> 
> Thanks!
>>> On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:
>>> 
>>> 
>>> On 31/8/2018 10:21 AM, Matthew Knepley wrote:
 On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:
 Hi,
 
 Due to my increase grid size, I have to go 64bit. I compiled the 64bit
 PETSc w/o error. However, when I tried to compile my code using the
 64bit PETSc, I got the error below. May I know why is this so?
 
 What changes should I make?
 
 Is it possible that you did not declare some inputs as PetscInt, so the 
 interface check is failing?
 
Matt
>>> Hi,
>>> 
>>> I'm using the standard
>>> 
>>> integer ::
>>> 
>>> real(8) ::
>>> 
>>> for some variables. For some others relating to PETSc, I use PetscInt.
>>> 
>>> Should I change all to PetscInt and PetscReal?
>>> 
>>> Currently, I use real(8) for all real values. If I change all to PetscReal, 
>>> will PetscReal be real or real(8) by default?
>>> 
>>> Thanks!
  
 [tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
 /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
 -g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
 /app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
 -g -ip -ipo -O3 -c -fPIC  -save  -w
 -I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
 -I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
 global.F90
 global.F90(979): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
 call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 global.F90(989): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
  call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 global.F90(997): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
  call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 global.F90(1005): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
  call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 global.F90(1013): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
  call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 global.F90(1021): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
  call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 global.F90(1029): error #6285: There is no matching specific subroutine
 for this generic subroutine call.   [DMDACREATE3D]
  call
 DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
 -^
 compilation aborted for global.F90 (code 1)
 
 -- 
 Thank you very much.
 
 Yours sincerely,
 
 
 
 
 
 

Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread TAY wee-beng



On 31/8/2018 10:38 AM, Smith, Barry F. wrote:

   PetscReal is by default real(8) you can leave those alone

Any integer you pass to a PETSc routine needs to be declared as PetscInt 
(not integer) otherwise the 64 bit indices stuff won't work.

Barry


Hi,

ok, I got it. Btw, is it advisable to change all integer in my code to 
PetscInt?


Will it cause any conflict or waste a lot of memory?

Or should I only change those related to PETSc?

Thanks!

On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:


On 31/8/2018 10:21 AM, Matthew Knepley wrote:

On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:
Hi,

Due to my increased grid size, I have to go 64-bit. I compiled the 64-bit
PETSc without error. However, when I tried to compile my code using the
64-bit PETSc, I got the error below. May I know why this is so?

What changes should I make?

Is it possible that you did not declare some inputs as PetscInt, so the 
interface check is failing?

Matt

Hi,

I'm using the standard

integer ::

real(8) ::

for some variables. For some others relating to PETSc, I use PetscInt.

Should I change all to PetscInt and PetscReal?

Currently, I use real(8) for all real values. If I change all to PetscReal, 
will PetscReal be real or real(8) by default?

Thanks!
  


[tsltaywb@nus02 ibm3d_IIB_mpi]$ make -f makefile_2018
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save kinefunc.F90
/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin/mpif90
-g -ip -ipo -O3 -c -fPIC  -save  -w
-I/home/users/nus/tsltaywb/propeller/lib/petsc-3.9.3_intel_2018_64bit_rel/include
-I/app/intel/xe2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/include
global.F90
global.F90(979): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(989): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(997): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1005): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1013): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1021): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
global.F90(1029): error #6285: There is no matching specific subroutine
for this generic subroutine call.   [DMDACREATE3D]
  call
DMDACreate3d(MPI_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_STAR,size_x,size_y,&
-^
compilation aborted for global.F90 (code 1)

--
Thank you very much.

Yours sincerely,


TAY Wee-Beng (Zheng Weiming) 郑伟明
Personal research webpage: http://tayweebeng.wixsite.com/website
Youtube research showcase: 
https://www.youtube.com/channel/UC72ZHtvQNMpNs2uRTSToiLA
linkedin: www.linkedin.com/in/tay-weebeng




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
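
The Fortran MPIU_INTEGER datatype discussed earlier in this thread would be used along these lines. This is a minimal sketch, not code from the thread: counter, counter_global, and num_procs are illustrative names, and the behavior described assumes the barry/add-mpiu_integer branch Barry mentions.

```fortran
! Sketch, assuming the MPIU_INTEGER Fortran datatype from the
! barry/add-mpiu_integer branch: it matches PetscInt whether PETSc
! was configured with 32-bit or 64-bit indices, so the same
! MPI_Allgather call is portable across both builds.
      PetscInt       :: counter                    ! local value, 32 or 64 bit
      PetscInt       :: counter_global(num_procs)  ! gathered values (num_procs illustrative)
      PetscErrorCode :: ierr

      call MPI_Allgather(counter, 1, MPIU_INTEGER,        &
                         counter_global, 1, MPIU_INTEGER, &
                         MPI_COMM_WORLD, ierr)
```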




Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread Smith, Barry F.

  PetscReal is by default real(8), so you can leave those alone.

   Any integer you pass to a PETSc routine needs to be declared as PetscInt 
(not integer); otherwise the 64-bit indices won't work. 

   Barry
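
As a minimal sketch of Barry's point (the grid-size names are taken from the error messages in this thread; everything else is illustrative and assumes a --with-64-bit-indices build):

```fortran
! Declarations that stay correct under both 32-bit and 64-bit builds.
! With --with-64-bit-indices, PetscInt is an 8-byte integer, so a plain
! "integer" actual argument no longer matches the PETSc Fortran interfaces.
      PetscInt       :: size_x, size_y, size_z  ! pass these to DMDACreate3d
      PetscInt       :: one                     ! even "small" arguments such as
      PetscReal      :: dx                      ! dof must be passed as PetscInt,
      PetscErrorCode :: ierr                    ! not as a default-integer literal
      one = 1
```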


> On Aug 30, 2018, at 9:35 PM, TAY wee-beng  wrote:
> 
> 
> On 31/8/2018 10:21 AM, Matthew Knepley wrote:
>> On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:
>> Hi,
>> 
>> Due to my increased grid size, I have to go 64bit. I compiled the 64bit 
>> PETSc w/o error. However, when I tried to compile my code using the 
>> 64bit PETSc, I got the error below. May I know why is this so?
>> 
>> What changes should I make?
>> 
>> Is it possible that you did not declare some inputs as PetscInt, so the 
>> interface check is failing?
>> 
>>Matt
> Hi,
> 
> I'm using the standard 
> 
> integer ::
> 
> real(8) ::
> 
> for some variables. For some others relating to PETSc, I use PetscInt.
> 
> Should I change all to PetscInt and PetscReal?
> 
> Currently, I use real(8) for all real values. If I change all to PetscReal, 
> will PetscReal be real or real(8) by default?
> 
> Thanks!
> 



Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread TAY wee-beng


On 31/8/2018 10:21 AM, Matthew Knepley wrote:
On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng wrote:


Hi,

Due to my increased grid size, I have to go 64bit. I compiled the
64bit PETSc w/o error. However, when I tried to compile my code using
the 64bit PETSc, I got the error below. May I know why is this so?

What changes should I make?


Is it possible that you did not declare some inputs as PetscInt, so 
the interface check is failing?


   Matt

Hi,

I'm using the standard

integer ::

real(8) ::

for some variables. For some others relating to PETSc, I use PetscInt.

Should I change all to PetscInt and PetscReal?

Currently, I use real(8) for all real values. If I change all to 
PetscReal, will PetscReal be real or real(8) by default?


Thanks!








Re: [petsc-users] Problem compiling with 64bit PETSc

2018-08-30 Thread Matthew Knepley
On Thu, Aug 30, 2018 at 10:17 PM TAY wee-beng  wrote:

> Hi,
>
> Due to my increased grid size, I have to go 64bit. I compiled the 64bit
> PETSc w/o error. However, when I tried to compile my code using the
> 64bit PETSc, I got the error below. May I know why is this so?
>
> What changes should I make?
>

Is it possible that you did not declare some inputs as PetscInt, so the
interface check is failing?

   Matt
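
To illustrate the interface check Matt is referring to, a hypothetical fragment (the variable names are illustrative, not from the user's code):

```fortran
! Under --with-64-bit-indices, PetscInt is integer(kind=8). A default
! "integer" actual argument then matches no specific procedure behind
! the generic DMDACreate3d interface, which is exactly error #6285.
      integer  :: size_x         ! mismatches the interface in 64-bit builds
      PetscInt :: nx, ny, nz     ! matches in both 32-bit and 64-bit builds
```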



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/