On Thu, Mar 15, 2018 at 8:18 AM, Manuel Valera wrote:
Ok so, I went back and erased the old libpetsc.so.3; I think it was the one causing problems. I had --with-shared-libraries=0 and the installation complained of not having that file. I then reinstalled with --with-shared-libraries=1, and it is finally recognizing my system installation with only CUDA.
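For reference, a configure invocation along these lines might look as follows; this is a sketch, and the exact paths and flag set are assumptions rather than something quoted in this thread:

```shell
# Hypothetical sketch of a shared-library, CUDA-only PETSc build.
# The source-tree location is an assumption for illustration.
cd "$PETSC_DIR"
./configure --with-shared-libraries=1 --with-cuda=1
make all
make install
```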
On Thu, Mar 15, 2018 at 4:01 AM, Manuel Valera wrote:
Ok well, it turns out that $PETSC_DIR points to the testpetsc directory, and it makes, installs, and tests without problems (only a problem on ex5f), but trying to reconfigure in the valera/petsc directory asks me to change the $PETSC_DIR variable.
Meanwhile the system installation still points to the val
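A PETSC_DIR mix-up like the one above usually comes down to environment variables; a minimal sketch of pointing the build at a single tree (both values here are illustrative assumptions, not taken verbatim from this thread):

```shell
# PETSc makefiles build against whatever tree these variables name;
# the values below are assumptions for illustration only.
export PETSC_DIR=/home/valera/testpetsc
export PETSC_ARCH=arch-linux2-c-opt
make ex19   # now compiles and links against the tree named above
```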
On Thu, Mar 15, 2018 at 3:25 AM, Manuel Valera wrote:
Yeah, that worked:
[valera@node50 tutorials]$ ./ex19 -dm_vec_type seqcuda -dm_mat_type seqaijcusparse
lid velocity = 0.0625, prandtl # = 1., grashof # = 1.
Number of SNES iterations = 2
[valera@node50 tutorials]$
How do I make sure the other programs refer to this installation? Using the same arguments?
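Since -dm_vec_type and -dm_mat_type are runtime options read from PETSc's options database, any program built against the same installation can pick them up the same way; a sketch (the program name "mysolver" is hypothetical):

```shell
# Runtime type options apply to any PETSc executable whose objects
# call *SetFromOptions(); "mysolver" is a hypothetical program name.
./mysolver -dm_vec_type seqcuda -dm_mat_type seqaijcusparse

# Options can also be placed in ~/.petscrc so every run picks them up:
echo "-dm_vec_type seqcuda" >> ~/.petscrc
```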
On Thu, Mar 15, 2018 at 3:19 AM, Manuel Valera wrote:
Yes, this is the system installation that is being correctly linked (the linear solver and model are not linking the correct installation, I don't know why yet). I configured with only CUDA this time because of the message Karl Rupp posted on my installation thread, where he says only one type of library will
On Thu, Mar 15, 2018 at 3:12 AM, Manuel Valera wrote:
Did you not configure with CUSP? It looks like you have CUDA, so use -dm_vec_type seqcuda.
Thanks,
Matt
Thanks, got this error:
[valera@node50 testpetsc]$ cd src/snes/examples/tutorials/
[valera@node50 tutorials]$ PETSC_ARCH="" make ex19
/usr/lib64/openmpi/bin/mpicc -o ex19.o -c -Wall -Wwrite-strings
-Wno-strict-aliasing -Wno-unknown-pragmas -fstack-protector
-fvisibility=hidden -O2 -I/home/valera
On Thu, Mar 15, 2018 at 2:46 AM, Manuel Valera wrote:
Ok, let's try that. If I go to /home/valera/testpetsc/arch-linux2-c-opt/tests/src/snes/examples/tutorials there is runex19.sh and a lot of other ex19 variants, but if I run that I get:
[valera@node50 tutorials]$ ./runex19.sh
not ok snes_tutorials-ex19_1
# ---
On Thu, Mar 15, 2018 at 2:27 AM, Manuel Valera wrote:
Ah, it appears that not all parts of your problem are taking the type options. If you want the linear algebra object
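The point above — that -vec_type/-mat_type only affect objects that consult the options database — can be sketched against PETSc's C API. This is a minimal illustration (sizes are placeholders, error checking omitted), not runnable here without a PETSc installation:

```c
#include <petscmat.h>

/* Sketch: an object only honors -mat_type/-vec_type if it calls
 * *SetFromOptions() before being used. Sizes are placeholders. */
int main(int argc, char **argv)
{
  Mat A;
  Vec x;

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 25, 25);
  MatSetFromOptions(A);          /* picks up -mat_type aijcusparse */

  VecCreate(PETSC_COMM_WORLD, &x);
  VecSetSizes(x, PETSC_DECIDE, 25);
  VecSetFromOptions(x);          /* picks up -vec_type cusp/seqcuda */

  MatDestroy(&A);
  VecDestroy(&x);
  PetscFinalize();
  return 0;
}
```

Any Mat or Vec created without the SetFromOptions call stays at its default (host) type, which would explain only parts of a solver moving to the GPU.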
Ok, thanks Matt. I made a smaller case with only the linear solver and a 25x25 matrix; the error I have in this case is:
[valera@node50 alone]$ mpirun -n 1 ./linsolve -vec_type cusp -mat_type aijcusparse
laplacian.petsc !
TrivSoln loaded, size: 125 / 125
RHS loaded, size:
On Fri, Mar 9, 2018 at 3:05 AM, Manuel Valera wrote:
Hello all,
I am working on porting a linear solver to GPUs for timing purposes. So far I've been able to compile and run the CUSP libraries and compile PETSc to be used with CUSP and ViennaCL. After the initial runs I noticed some errors; they are different for different flags and I would apprec
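For context, a configure line enabling both GPU backends in a PETSc of that era might have looked like the sketch below; the exact flag names vary by PETSc version and the CUSP path is an assumption:

```shell
# Hypothetical sketch: PETSc configured with CUDA, CUSP, and ViennaCL.
# Flag spellings and the CUSP location are assumptions for illustration.
./configure --with-cuda=1 \
            --with-cusp=1 --with-cusp-dir=$HOME/cusplibrary \
            --download-viennacl
```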