I am a casual user looking to learn about GPU acceleration in R, at a
"new to computers" level (please avoid overly technical language, as I
will not understand it).

I am on macOS Sierra with an Intel Iris Pro GPU.

1. Can someone show a simple example in R that uses GPU processing on
my GPU?

2. Is there a way to use the GPU for an iterative process, such as the
Euler method? I mean something like

x <- 5
x <- sin(x) + 3  # or some other complicated recursion

where the second line is repeated until 10000 steps are reached.
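For clarity, here is a plain-R sketch of the kind of iteration meant above (CPU only; the starting value, update rule, and step count are just the placeholders from the question):

```r
# Repeatedly apply x <- sin(x) + 3, starting from x = 5
x <- 5
n_steps <- 10000
for (i in seq_len(n_steps)) {
  x <- sin(x) + 3
}
print(x)  # value after 10000 iterations
```

Note that because each step depends on the previous one, this kind of recursion is inherently sequential, whereas a GPU is best at doing many independent operations at once.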


______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
