Re: [R] Exception while using NeweyWest function with doMC

2011-08-30 Thread David Winsemius


On Aug 30, 2011, at 11:29 AM, Simon Zehnder wrote:


Hi David,

thank you very much for your advice! I updated R and all my packages.
Regrettably it still doesn't work. But I think the parallel processing
(using 32-bit) does improve the running time, especially in higher
dimensions:


system.time(simuFunctionSeq(0.03, 0.015, 1, 5, 1000, 100, "/Users/simon/Documents/R/BigMTest"))
system.time(simuFunctionPar(0.03, 0.015, 1, 5, 1000, 100, "/Users/simon/Documents/R/BigMTest"))

[1] Sequential Processing with N =  1000  and K =  100
  user  system elapsed
 5.157   0.086   5.587
[1] Parallel Processing with N =  1000  and K =  100
  user  system elapsed
 6.069   0.220   3.895

system.time(simuFunctionSeq(0.03, 0.015, 1, 5, 1, 100, "/Users/simon/Documents/R/BigMTest"))
system.time(simuFunctionPar(0.03, 0.015, 1, 5, 1, 100, "/Users/simon/Documents/R/BigMTest"))

[1] Sequential Processing with N =  1  and K =  100
  user  system elapsed
 8.129   0.689  12.747
[1] Parallel Processing with N =  1  and K =  100
  user  system elapsed
 8.387   0.772  12.005

system.time(simuFunctionSeq(0.03, 0.015, 1, 5, 1, 1000, "/Users/simon/Documents/R/BigMTest"))
system.time(simuFunctionPar(0.03, 0.015, 1, 5, 1, 1000, "/Users/simon/Documents/R/BigMTest"))

[1] Sequential Processing with N =  1  and K =  1000
  user  system elapsed
71.295   6.330 109.656
[1] Parallel Processing with N =  1  and K =  1000
  user  system elapsed
50.943   6.347  89.115

Or are these time differences negligible?


I would think that for most applications a 20% gain in efficiency would be
considered unworthy of the effort of setting up and maintaining the parallel
machinery. I suppose if a simulation ran for 18 hours in sequential mode, and
you would be happier to leave it overnight and find in the morning that it had
completed in 15 hours, it might be worth the effort.


What happens if I use a supercomputer with several cores and much  
more memory?


Or even a Mac Pro with 4 or 8 cores and 32-64 GB? Generally you hope to see
times halved or quartered when you apply these techniques.
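
(A minimal sketch, assuming a current R with the foreach/doMC stack, of
registering one worker per detected core and confirming how many workers
%dopar% will actually use:)

library(doMC)
library(foreach)

# register one worker per detected core (hypothetical multi-core machine)
registerDoMC(cores = parallel::detectCores())

getDoParWorkers()   # how many workers %dopar% will dispatch to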


--
David.



Thanks again!

Simon



On Aug 29, 2011, at 6:59 PM, David Winsemius wrote:



On Aug 27, 2011, at 3:37 PM, Simon Zehnder wrote:


Dear R users,

I am using R right now for a simulation of a model that needs a lot of
memory. Therefore I use the *bigmemory* package and - to make it faster -
the *doMC* package. See my code posted on http://pastebin.com/dFRGdNrG

Now, if I use the foreach loop with the %do% operator (for a sequential run)
I have no problems at all - only here and there some singularities in the
regressor matrices, which should be ok.
BUT if I run the loop on multiple cores I very often get a bad exception.
I have posted the exception on http://pastebin.com/eMWF4cu0 . The exception
comes from the NeweyWest function loaded from the sandwich library.
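
For reference, a minimal self-contained sketch of this setup (made-up data
and dimensions, not the pastebin code), fitting independent regressions and
calling NeweyWest() inside a %dopar% loop:

library(foreach)
library(doMC)
library(sandwich)
registerDoMC(2)                       # two cores, as on the Core Duo

K <- 8; N <- 200                      # made-up dimensions
res <- foreach(k = 1:K) %dopar% {
  x <- rnorm(N)
  e <- as.numeric(arima.sim(list(ar = 0.3), N))  # autocorrelated errors
  y <- 0.5 * x + e
  fit <- lm(y ~ x)
  NeweyWest(fit)                      # HAC covariance of the coefficients
}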


I have no clue what it is trying to tell me, or why it is printed so weirdly
to the terminal. I am used to getting errors here and there, but the messages
never look like this.

Does anyone have a suggestion where to look for the cause of this weird error?

Here is some additional information:

Hardware: MacBook Pro, 2.66 GHz Intel Core Duo, 4 GB memory, 1067 MHz DDR3
Software System: Mac OS X Lion 10.7.1 (11B26)
Software App: R64 version 2.11.1, run from the Mac terminal


Using the R64 version in a 4GB environment will reduce the  
effective memory capacity since the larger pointers take up more  
space, and using parallel methods is unlikely to improve  
performance very much with only two cores. It also seems likely  
that there have been several bug fixes in the last couple of years  
since that version of R was released, so the package authors are  
unlikely to be very interested in segfault errors thrown by  
outdated software.



I hope someone has a good suggestion!


Update R. Don't use features that only reduce performance and destabilize a
machine with limited resources.


--

David Winsemius, MD
West Hartford, CT





David Winsemius, MD
West Hartford, CT

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Exception while using NeweyWest function with doMC

2011-08-30 Thread Simon Zehnder
Hi Jay,

first: thank you very much for your comments! You made some very important points
clear. I immediately tried to write the output of sample() directly into the big
matrix. Instead of

trade <- as.big.matrix(matrix(sample(c(1,-1), (N+1)*K, replace=TRUE), ncol=K),
    backingpath=backingpath, backingfile="trade.bin", descriptorfile="trade.desc")

I now use:

trade <- big.matrix(sample(c(1,-1), (10+1), replace=TRUE), nrow=(10+1), ncol=10,
    type="double", backingpath="/Users/simon/Documents/R/BigMTest/",
    backingfile="terminaltest.bin", descriptorfile="terminaltest.desc")

But I either get only 1s:

trade[,]
      [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
 [1,]    1    1    1    1    1    1    1    1    1     1
 [2,]    1    1    1    1    1    1    1    1    1     1
 [3,]    1    1    1    1    1    1    1    1    1     1
 [4,]    1    1    1    1    1    1    1    1    1     1
 [5,]    1    1    1    1    1    1    1    1    1     1
 [6,]    1    1    1    1    1    1    1    1    1     1
 [7,]    1    1    1    1    1    1    1    1    1     1
 [8,]    1    1    1    1    1    1    1    1    1     1
 [9,]    1    1    1    1    1    1    1    1    1     1
[10,]    1    1    1    1    1    1    1    1    1     1
[11,]    1    1    1    1    1    1    1    1    1     1

or only -1s:

trade[,]
      [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
 [1,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [2,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [3,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [4,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [5,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [6,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [7,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [8,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
 [9,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
[10,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1
[11,]   -1   -1   -1   -1   -1   -1   -1   -1   -1    -1

Is there another possibility? In addition, I found in the second example under
?as.big.matrix the use of matrix() inside as.big.matrix(). But if I understood
you correctly: that usage is possible but does not save memory? I used the
bigmemory package because, with a plain matrix(), I always got exceptions
telling me that memory had reached its limits. After switching to bigmemory
everything worked fine, BUT running http://pastebin.com/UxSkzrae and
http://pastebin.com/MErGQsQd , I got this: http://pastebin.com/KrEncrSz. It
seems there is a problem with memory allocation inside the underlying C code,
maybe a result of generating the matrix inside the big matrix?

Any suggestions?
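
One possible alternative (just a sketch, assuming the filebacked.big.matrix()
constructor and reusing the file names above) would be to create the filebacked
matrix first and then fill it column by column, so the full random matrix never
has to exist as an ordinary R object:

library(bigmemory)

N <- 10; K <- 10
trade <- filebacked.big.matrix(nrow = N + 1, ncol = K, type = "double",
    backingpath = "/Users/simon/Documents/R/BigMTest/",
    backingfile = "terminaltest.bin",
    descriptorfile = "terminaltest.desc")

# fill one column at a time; only one column of draws is in normal RAM at once
for (j in seq_len(K)) {
  trade[, j] <- sample(c(1, -1), N + 1, replace = TRUE)
}

trade[1:5, 1:5]   # spot-check that the entries really vary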

 

On Aug 29, 2011, at 6:24 PM, Jay Emerson wrote:

 Simon,
 
 Though we're pleased to see another use of bigmemory, it really isn't clear
 that it is gaining you anything in your example; anything like
 as.big.matrix(matrix(...)) still consumes full RAM for both the inner
 matrix() and the new big.matrix -- is the filebacking really necessary?
 It also doesn't appear that you are making use of shared memory, so I'm
 unsure what the gains are. However, I don't have any particular insight
 into the subsequent problem with NeweyWest (which doesn't seem to be using
 the big.matrix objects).
 
 Jay
 
 --
 Message: 32
 Date: Sat, 27 Aug 2011 21:37:55 +0200
 From: Simon Zehnder simon.zehn...@googlemail.com
 To: r-help@r-project.org
 Subject: [R] Exception while using NeweyWest function with doMC
 Message-ID:
   cagqvrp_gk+t0owbv1ste-y0zafmi9s_zwqrxyxugsui18ms...@mail.gmail.com
 Content-Type: text/plain
 
 Dear R users,
 
 I am using R right now for a simulation of a model that needs a lot of
 memory. Therefore I use the *bigmemory* package and - to make it faster -
 the *doMC* package. See my code posted on http://pastebin.com/dFRGdNrG
 
  snip 
 -
 
 -- 
 John W. Emerson (Jay)
 Associate Professor of Statistics
 Department of Statistics
 Yale University
 http://www.stat.yale.edu/~jay





Re: [R] Exception while using NeweyWest function with doMC

2011-08-29 Thread Jay Emerson
Simon,

Though we're pleased to see another use of bigmemory, it really isn't clear
that it is gaining you anything in your example; anything like
as.big.matrix(matrix(...)) still consumes full RAM for both the inner
matrix() and the new big.matrix -- is the filebacking really necessary?
It also doesn't appear that you are making use of shared memory, so I'm
unsure what the gains are. However, I don't have any particular insight
into the subsequent problem with NeweyWest (which doesn't seem to be using
the big.matrix objects).
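
For concreteness, a minimal sketch of the shared-memory pattern mentioned
above (toy dimensions, assuming the describe()/attach.big.matrix() interface),
where each doMC worker re-attaches the same big.matrix rather than copying it:

library(bigmemory)
library(foreach)
library(doMC)
registerDoMC(2)

X <- big.matrix(nrow = 1000, ncol = 100, type = "double")  # shared, not copied
for (j in 1:100) X[, j] <- rnorm(1000)                     # fill column-wise

desc <- describe(X)     # small descriptor object, cheap to pass to workers

col.means <- foreach(j = 1:100, .combine = c) %dopar% {
  x <- attach.big.matrix(desc)   # each worker re-attaches the shared matrix
  mean(x[, j])
}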

Jay

--
Message: 32
Date: Sat, 27 Aug 2011 21:37:55 +0200
From: Simon Zehnder simon.zehn...@googlemail.com
To: r-help@r-project.org
Subject: [R] Exception while using NeweyWest function with doMC
Message-ID:
   cagqvrp_gk+t0owbv1ste-y0zafmi9s_zwqrxyxugsui18ms...@mail.gmail.com
Content-Type: text/plain

Dear R users,

I am using R right now for a simulation of a model that needs a lot of
memory. Therefore I use the *bigmemory* package and - to make it faster -
the *doMC* package. See my code posted on http://pastebin.com/dFRGdNrG

 snip 
-

-- 
John W. Emerson (Jay)
Associate Professor of Statistics
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay



Re: [R] Exception while using NeweyWest function with doMC

2011-08-29 Thread David Winsemius


On Aug 27, 2011, at 3:37 PM, Simon Zehnder wrote:


Dear R users,

I am using R right now for a simulation of a model that needs a lot of
memory. Therefore I use the *bigmemory* package and - to make it faster -
the *doMC* package. See my code posted on http://pastebin.com/dFRGdNrG

Now, if I use the foreach loop with the %do% operator (for a sequential run)
I have no problems at all - only here and there some singularities in the
regressor matrices, which should be ok.
BUT if I run the loop on multiple cores I very often get a bad exception.
I have posted the exception on http://pastebin.com/eMWF4cu0 . The exception
comes from the NeweyWest function loaded from the sandwich library.

I have no clue what it is trying to tell me, or why it is printed so weirdly
to the terminal. I am used to getting errors here and there, but the messages
never look like this.

Does anyone have a suggestion where to look for the cause of this weird error?

Here is some additional information:

Hardware: MacBook Pro, 2.66 GHz Intel Core Duo, 4 GB memory, 1067 MHz DDR3
Software System: Mac OS X Lion 10.7.1 (11B26)
Software App: R64 version 2.11.1, run from the Mac terminal


Using the R64 version in a 4GB environment will reduce the effective  
memory capacity since the larger pointers take up more space, and  
using parallel methods is unlikely to improve performance very much  
with only two cores. It also seems likely that there have been several  
bug fixes in the last couple of years since that version of R was  
released, so the package authors are unlikely to be very interested in  
segfault errors thrown by outdated software.



I hope someone has a good suggestion!


Update R. Don't use features that only reduce performance and destabilize a
machine with limited resources.


--

David Winsemius, MD
West Hartford, CT
