Re: [R] C: drive memory full

2014-06-18 Thread Hiyoshi, Ayako
Dear R users,

Thank you so much for your help. 
I deleted all the files in my C:\xxx\AppData\Local\Temp folder and also ran the 
suggested R code to erase files. 
I list the commands below in case anyone is interested.

All in all, I recovered some space (1 GB), and after compressing the Stata data I 
achieved what I needed to do with Stata. 

The Windows tips for finding hidden files and large files were also very 
helpful. Unfortunately I could not run SequoiaView because of security 
restrictions on my office PC, but it sounded very useful.  

Thank you so much for considering my problem.

Best wishes,
Ayako 

--- some R code I used: 

tempdir()                            # path of the current R session's temporary directory
dirname(tempdir())                   # parent folder that holds all the Rtmp* directories
unlink(tempdir(), recursive = TRUE)  # delete the session's temporary directory and its contents
rm(list = ls(all.names = TRUE))      # remove every object (including hidden ones) from the workspace


From: r-help-boun...@r-project.org on behalf of jim holtman jholt...@gmail.com
Sent: 18 June 2014 15:14
To: David L Carlson
Cc: r-help@r-project.org
Subject: Re: [R] C: drive memory full

One program that I have found valuable in looking at the size of files on
disk is SequoiaView.  It creates a 'treemap' of the sizes of the files and
lets you zoom in to specific directories and such.


Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.


On Wed, Jun 18, 2014 at 10:05 AM, David L Carlson dcarl...@tamu.edu wrote:

 Windows 8 (and earlier versions) has a disk cleanup tool that will let you
 delete temporary files, which might help. At the bottom of the screen on the
 taskbar, click the folder icon to bring up a window showing your drives
 and folders. On the left side, click the drive you are interested in
 (probably C:). On the top title bar of the window showing the folders in
 drive C:, on the left, is an icon showing a sheet of paper with a red
 checkmark. That brings up a properties window showing how much space
 you are using and how much is free. Close that window, then click the
 Manage tab on the line just below the title bar. Click Cleanup, and that
 will let you select temporary files of various kinds for deletion if your
 drive is getting full.
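 A similar check can also be done from within R itself. A minimal sketch, not
 from the original thread (the starting folder "C:/Users" is only an assumed
 example, and scanning a large tree can be slow):

```r
## Hedged sketch: list the ten largest files under a folder so that
## oversized files stand out. "C:/Users" is an assumed example path.
files <- list.files("C:/Users", recursive = TRUE, full.names = TRUE)
info  <- file.info(files)            # sizes (bytes), timestamps, etc.
info  <- info[order(-info$size), ]   # sort, biggest first
head(rownames(info), 10)             # paths of the ten largest files
head(info$size, 10) / 1024^2         # their sizes in megabytes
```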

 -
 David L Carlson
 Department of Anthropology
 Texas A&M University
 College Station, TX 77840-4352

 -Original Message-
 From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
 On Behalf Of jwd
 Sent: Tuesday, June 17, 2014 8:10 PM
 To: r-help@r-project.org
 Subject: Re: [R] C: drive memory full

 On Tue, 17 Jun 2014 12:48:54 +
 Hiyoshi, Ayako ayako.hiyoshi...@alumni.ucl.ac.uk wrote:

  Dear Martyn and Professor Ripley,
 
  Thank you so much for your help. I used Windows' large-file search
  (it was useful! thank you), but no big files were detected on the C:
  drive. Perhaps I will have to reinstall Windows..
 
 The problem may not be R at all, but rather Windows.  How large is your
 drive - not RAM, but the actual drive?

 If you can, examine your drive to see whether there are fragmentation
 problems or hidden files taking up excess space.  Look at the
 drive properties for C:\ and run disk clean-up if possible.  Another
 problem - MS may have fixed this at some point; I don't run Win 8 - is
 that if deleted files are not actually removed, you can run out of space.
 Usually the space occupied by a file marked for deletion is
 scavenged as Windows needs space, but sometimes that does not happen,
 especially with a very full drive. So if there are files in the
 Recycle Bin, remove them completely.  Another space use that can grow
 to excess is the print spool. Even Windows uses a print spool, and it can
 grow too large for the system to function efficiently.

 JWdougherty

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.



[R] C: drive memory full

2014-06-17 Thread Hiyoshi, Ayako
Dear R users,



Hello, I am new to R and confused about my PC's disk space after using R for a 
while.



My PC runs Windows 8 with 8 GB of RAM. I have run some analyses using commands 
like vglm(), aalen(Surv(...)), lm()... some regressions.



I also use Stata, and when I tried to run Stata (on a big file), Stata could not 
do something it used to be able to do because of a lack of disk space. I suspect 
R is involved, because R is the only new activity on this PC recently.



I googled and found 'tempdir()'.

I checked my temporary folder, but it was empty.

Just in case, after running 'unlink(tempdir(), recursive=TRUE)', I restarted my 
computer, but the free space did not change. There still seems to be something 
big on my C: drive - nearly 12 GB is used.



Could it be that R saved something somewhere?

As I have finished my analyses, all I need is to erase everything R stored so 
that I can get my disk space back.



Thank you so much for your help.



Best wishes,

Ayako









Re: [R] C: drive memory full

2014-06-17 Thread Hiyoshi, Ayako
Dear Professor Ripley,

Thank you so much for your quick reply.

I tried 'dirname(tempdir())' and removed several 'Rtmp' folders and all other 
files. Unfortunately it did not change the C: drive space much. 

I am really sorry, but could there be other things stored somewhere on the C: 
drive?

I called IT support, but they could not spot either.

Kind regards,
Ayako


From: Prof Brian Ripley rip...@stats.ox.ac.uk
Sent: 17 June 2014 10:37
To: Hiyoshi, Ayako; r-help@R-project.org
Subject: Re: [R] C: drive memory full

tempdir() is specific to an R session.

Start up R and run

dirname(tempdir())

and look at that directory.  Shut down R, then remove all old
files/folders in that directory, especially those beginning with 'Rtmp'.

An R process tries to clean up after itself but

- it cannot do so if it segfaults 
- Windows has rules which no other OS has which can result in files
being locked and hence not deletable.
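
The advice above can be sketched as a small R snippet (an illustration, not from
the original thread; run it from a fresh session, since the running session's
own temporary directory must be kept):

```r
## Remove leftover Rtmp* folders from crashed or old R sessions.
## Assumes the layout described above: every session's temp dir lives
## under dirname(tempdir()) and is named Rtmp<random>.
tmp_parent <- dirname(tempdir())
stale <- list.files(tmp_parent, pattern = "^Rtmp", full.names = TRUE)
stale <- setdiff(stale, tempdir())   # never delete the running session's dir
unlink(stale, recursive = TRUE)      # may silently fail on Windows-locked files
```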





--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595




Re: [R] C: drive memory full

2014-06-17 Thread Hiyoshi, Ayako
Dear Martyn and Professor Ripley,

Thank you so much for your help. I used Windows' large-file search (it was 
useful! thank you), but no big files were detected on the C: drive.
Perhaps I will have to reinstall Windows..

Thank you so much for your replies.

Best wishes,
Ayako

From: Martyn Byng martyn.b...@nag.co.uk
Sent: 17 June 2014 12:10
To: Hiyoshi, Ayako; Prof Brian Ripley; r-help@R-project.org
Subject: RE: [R] C: drive memory full

Hi,

Try

http://social.technet.microsoft.com/wiki/contents/articles/19295.windows-8-how-to-search-for-large-files.aspx

Martyn



[R] Difference in coefficients in Cox proportional hazard estimates between R and Stata, why?

2014-05-30 Thread Hiyoshi, Ayako
Dear R users,



Hi, thank you so much for your help in advance.

I have been using Stata but am new to R. For my paper revision I need Aalen's 
survival analysis, and the relevant command seems to be available in R (32-bit, 
version 3.1.0 (2014-04-10)) but less ready to use in Stata (version 13/SE).



To make sure I can do the basics, I have fitted a logistic regression and a Cox 
proportional hazards regression in both R and Stata.



The data I used were from the UCLA R textbook example page: 
http://www.ats.ucla.edu/stat/r/examples/asa/asa_ch1_r.htm. I used the same data 
in Stata.



When I fitted the logistic regression as below, the estimates were exactly the 
same between R and Stata.



Example using logistic regression

R:



logistic1 <- glm(censor ~ age + drug, data=, family = binomial)

summary(logistic1)

exp(cbind(OR=coef(logistic1), confint(logistic1)))

                   OR      2.5 %    97.5 %
(Intercept) 1.0373731 0.06358296 16.797896
age         1.0436805 0.96801933  1.131233
drug        0.7192149 0.26042635  1.937502



Stata:



logistic censor age i.drug
        |  OR         CI_lower   CI_upper
  age   |  1.043681   .9662388   1.127329
 drug   |  .719215    .2665194   1.940835
_cons   |  1.037373   .065847    16.3431



However, when I fitted the Cox proportional hazards regression, there were some 
discrepancies in the coefficients (and the exponentiated hazard ratios).



Example using Cox proportional hazards regression

R:



cox1 <- coxph(Surv(time, censor) ~ drug + age, data=)
summary(cox1)

Call:
coxph(formula = Surv(time, censor) ~ drug + age, data = )
  n= 100, number of events= 80
         coef exp(coef) se(coef)     z Pr(>|z|)
drug 1.01670   2.76405  0.25622 3.968 7.24e-05 ***
age  0.09714   1.10202  0.01864 5.211 1.87e-07 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
 exp(coef) exp(-coef) lower .95 upper .95
drug 2.764 0.3618 1.673 4.567
age  1.102 0.9074 1.062 1.143
Concordance= 0.711  (se = 0.042 )
Rsquare= 0.324   (max possible= 0.997 )
Likelihood ratio test= 39.13  on 2 df,   p=3.182e-09
Wald test= 36.13  on 2 df,   p=1.431e-08
Score (logrank) test = 38.39  on 2 df,   p=4.602e-09

Stata:

stset time, f(censor)
stcox drug age
--
  _t | Haz. Ratio   Std. Err.      z   P>|z|    [95% Conf. Interval]
-+
drug |   2.563531    .6550089   3.68   0.000    1.55363     4.229893
 age |   1.095852    .02026     4.95   0.000    1.056854    1.136289
--




The HR estimate for drug was 2.76 from R but 2.56 from Stata.

I searched the internet for an explanation but could not find one.



In parametric survival regression with an exponential distribution, R's and 
Stata's coefficients were opposite in sign while their absolute values were 
exactly the same (e.g. 0.08 in Stata and -0.08 in R). I suspected something like 
this (http://www.theanalysisfactor.com/ordinal-logistic-regression-mystery/) was 
going on, but for Cox proportional hazards regression I could not find any 
resource that helped me.
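
One difference worth checking (a hedged guess, not something established in this 
thread): R's coxph() handles tied event times with the Efron approximation by 
default, while Stata's stcox defaults to the Breslow method. Forcing Breslow in 
R should reproduce Stata's numbers if ties are the cause; the dataset name 
'hmohiv' below is only assumed from the UCLA example.

```r
## Sketch only: refit the Cox model with Breslow tie handling to compare
## against Stata's stcox output. 'hmohiv' is an assumed dataset name.
library(survival)
cox_breslow <- coxph(Surv(time, censor) ~ drug + age,
                     data = hmohiv, ties = "breslow")
summary(cox_breslow)   # compare these HRs against Stata's
```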



I would highly appreciate it if anyone could explain this to me or suggest a 
resource I can read.



Thank you so much for your help.



Best,

Ayako

