On Mon, Jun 25, 2012 at 11:17 AM, Simon Urbanek wrote:
>
> On Jun 25, 2012, at 10:20 AM, andre zege wrote:
> [...]
>
> dput() is intended to be parsed by R, so the above is not possible
> without massaging the output. But why in the world would you use dput()
> for something that you want to read in Java? Why don't you use a format
> that Java can read easily - such as JSON?
>
> Cheers,
> Simon
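A sketch of the consumer side of that suggestion. The payload and field
names here are made up: the idea is that R emits the dimnames as JSON
(e.g. with toJSON() from a JSON package) and the other side parses it with
any JSON library instead of scraping dput() output. The sketch is C++ with
the header-only nlohmann/json library; in Java, a parser such as org.json
or Jackson plays the same role.

readnames.cpp
==
// sketch: consume matrix dimnames as JSON instead of scraping dput() output
#include <iostream>
#include <string>
#include <nlohmann/json.hpp>

int main() {
    // hypothetical payload, as R might emit it with a JSON package
    std::string payload =
        R"({"rownames":["r1","r2"],"colnames":["c1","c2","c3"]})";

    nlohmann::json j = nlohmann::json::parse(payload);
    for (const auto& name : j["rownames"])
        std::cout << name.get<std::string>() << "\n";   // r1, r2
    std::cout << j["colnames"].size() << " columns\n";  // 3 columns
}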
Yeap. I am reading dput output for a matrix into Java, more specifically
for a file-backed big.matrix. I basically need to lift the dimnames for a
matrix from dput output. It's no big deal, but the code is very 'hackish'
due to the need to get rid of quotes, newlines, parentheses, etc. I was
wondering if i could do this in a cleaner way.
>
> bigmemory matrices are simply arrays of native types (typically doubles,
> but bm supports other types, too) so they are trivially readable/writable
> from both C++ (just read into memory and cast to the array type) and Java
> (e.g., a DoubleBuffer view on a ByteBuffer). So the question is what
> exactly you are trying to do.
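A minimal sketch of the C++ half of that (POSIX mmap; the backing-file
name and dimensions are made up, since in practice both come from the
big.matrix descriptor file):

readbm.cpp
==
// sketch: map a filebacked big.matrix backing file and cast to double*
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    int fd = open("matrix.bin", O_RDONLY);   // hypothetical backing file
    if (fd < 0) { std::perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) != 0) { std::perror("fstat"); return 1; }

    void* p = mmap(nullptr, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
    if (p == MAP_FAILED) { std::perror("mmap"); return 1; }

    // "read into memory and cast to the array type": the file is just a
    // column-major array of native doubles, laid out as in R itself
    const double* m = static_cast<const double*>(p);
    const size_t nrow = 3;                     // assumed dimension
    std::printf("first element of column 2 = %g\n", m[1 * nrow]);

    munmap(p, st.st_size);
    close(fd);
    return 0;
}

The Java route is the analogous one: map the file with FileChannel.map(),
set ByteOrder.nativeOrder(), and view it through asDoubleBuffer().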
I work with problems that have rather large data requirements -- typically
a bunch of multi-gig arrays. Given how generous R is with using memory, the
only way for me to work with R has been to use bigmatrices from the
bigmemory package. One thing that is missing a bit is interoperability of
bigmatrices with other languages.
I am unable to compile the R-2.15.0 source. I configured it without
problems, with options that i have used many times before:

./configure --prefix=/home/andre/R-2.15.0 \
    --enable-byte-compiled-packages=no --with-tcltk --enable-R-shlib=yes

Then when i started making it, it died while making lapack.
> for(j in 1:length(files)){
>    vars <- load(file.path(dump.dir, files[j]))
>    mat.data[[j]] <- data;
>    # Not needed anymore/remove everything loaded
>    rm(list=vars);
> }
>
> data <- abind(mat.data, along=2);
> # Not needed anymore
> rm(mat.data);
>
> save(data, file=file.path(dump.dir, filename))
I recently started using R 2.14.0 on a new machine and i am experiencing
what seems like unusually greedy memory use. It happens all the time, but
to give a specific example, let's say i run the following code:

for(j in 1:length(files)){
   vars <- load(file.path(dump.dir, files[j]))
   mat.data[[j]] <- data
   rm(list=vars)
}
Hi, guys. I posted this by accident at rcpp-dev, although it was meant to
go only to r-devel, so don't flame me here please, the rcpp guys will
do it there, i am sure :).
I have some pretty large arrays in R and i wanted to do some
time-consuming modifications of these arrays in C++ without actually
copying them.
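A minimal sketch of that no-copy approach, using Rcpp attributes (the
function name and the operation are made up for illustration; an
Rcpp::NumericMatrix is a thin proxy for R's own memory, so writes land in
the R object directly instead of in a copy):

scale.cpp
==
// sketch: modify an R matrix in place from C++, no copy
#include <Rcpp.h>
using namespace Rcpp;

// [[Rcpp::export]]
void scale_in_place(NumericMatrix m, double factor) {
    for (int j = 0; j < m.ncol(); ++j)
        for (int i = 0; i < m.nrow(); ++i)
            m(i, j) *= factor;        // writes through to R's memory
}

After Rcpp::sourceCpp("scale.cpp"), calling scale_in_place(x, 2) from R
modifies x itself. The usual caveat: this sidesteps R's copy-on-modify
semantics, so any other variable bound to the same object sees the change
too.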
Hi all. I started looking at Rcpp, which looks pretty great, actually. At
the moment i am just trying to compile a module to get a feel for how it
all works, without fully understanding how all the pieces fit together.
Basically, i took the first example from the Rcpp modules vignette:

fun.cpp
==
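// [the listing is cut off in this snippet; the following is a
//  reconstruction from memory of a minimal module in the style of the
//  vignette's first example, so details may differ from the actual post]
#include <Rcpp.h>

std::string hello() {
    return "hello world";
}

RCPP_MODULE(yada) {
    using namespace Rcpp;
    // expose hello() to R under the same name
    function("hello", &hello);
}

Once this is compiled into a shared object (through a package or the
inline machinery), Module("yada", ...) makes hello() callable from R.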
Hi all, i am trying to compile a test that calls R's LAPACK shared library
from C code. In particular, i am calling the simple LAPACK driver dposv
for solving the linear system A*x = B with a positive definite A. My code
looks like the following:

solve.c
==
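// [the listing is cut off in this snippet; the following is a
//  self-contained sketch of the same kind of call, shown as C++ with
//  made-up matrix values and dposv_ declared by hand rather than via
//  R_ext/Lapack.h; the link line is installation-specific, e.g.
//  g++ solve.cpp -L$R_HOME/lib -lRlapack -lRblas]
#include <cstdio>

extern "C" void dposv_(const char* uplo, const int* n, const int* nrhs,
                       double* a, const int* lda, double* b, const int* ldb,
                       int* info);

int main() {
    // 2x2 symmetric positive definite A, column-major; "U" = upper triangle
    double A[4] = { 4.0, 1.0,
                    1.0, 3.0 };
    double b[2] = { 1.0, 2.0 };   // right-hand side, overwritten with x
    int n = 2, nrhs = 1, info = 0;

    dposv_("U", &n, &nrhs, A, &n, b, &n, &info);

    if (info == 0)
        std::printf("x = (%g, %g)\n", b[0], b[1]);   // expect (1/11, 7/11)
    else
        std::printf("dposv failed, info = %d\n", info);
    return 0;
}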