Re: [R] Creating Optimization Constraints

2012-10-17 Thread boylangc
Dr. Varadhan:

   Perfect. Package Rsolnp is exactly what I was looking for.  Thank you very 
much.

r/
Greg
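
For context, a minimal sketch of the kind of constrained call Rsolnp provides — a toy objective with one nonlinear inequality, not the thread's actual model (assumes package Rsolnp is installed; names `fn` and `gfun` are mine):

```r
# Toy sketch of Rsolnp::solnp(): minimize (x1-1)^2 + (x2-2)^2 subject to
# the nonlinear inequality 0 <= x1^2 + x2^2 <= 1 (stay inside the unit disk).
library(Rsolnp)

fn   <- function(x) (x[1] - 1)^2 + (x[2] - 2)^2   # toy objective
gfun <- function(x) x[1]^2 + x[2]^2               # constraint value

res <- solnp(pars = c(0.1, 0.1), fun = fn,
             ineqfun = gfun, ineqLB = 0, ineqUB = 1)
res$pars  # should land on the unit circle, in the direction of (1, 2)
```

Since the unconstrained minimum (1, 2) lies outside the disk, the inequality is active at the solution.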


-----Original Message-----
>From: Ravi Varadhan 
>Sent: Oct 17, 2012 10:24 AM
>To: "'boyla...@earthlink.net'" 
>Cc: "r-help@r-project.org" 
>Subject: Re: [R] Creating Optimization Constraints
>
>You have nonlinear constraints on the parameters.  Take a look at constrained 
>optimization algorithms in packages "alabama" or "Rsolnp" that can handle 
>non-linearly constrained optimization problems.
>
>Ravi
>
>Ravi Varadhan, Ph.D.
>Assistant Professor
>The Center on Aging and Health
>Division of Geriatric Medicine & Gerontology
>Johns Hopkins University
>rvarad...@jhmi.edu
>410-502-2619
>
>
>
>__
>R-help@r-project.org mailing list
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.



[R] Creating Optimization Constraints

2012-10-16 Thread boylangc
Good afternoon,

  In the code below, I have a set of functions (m1, m2, m3, s1, s2, and s3) 
representing response-surface designs for the mean and variance of three 
response variables, followed by an objective function that uses the "Big M" 
method to minimize variance (that is, push s1, s2, and s3 as close to 0 as 
possible) and hit targets for each of the three means (0, 10, and 100, 
respectively). The results are OK, but s1 and s3 come out negative. I want to 
incorporate a constraint or set of constraints requiring s1, s2, and s3 to be 
>= 0. I apologize if this is a "dumb" question whose answer is staring me in 
the face, but after several hours of tinkering with min and max functions 
within the objective function and googling "adding constraints to optimization 
functions" and the like, I am at a loss. I am also sure there is a much more 
elegant way to code what I have done, so apologies for any crudeness there.

Thank you for any assistance.  Code follows:

#Define the Response Surface Designs
m1<-function(x) {
x1<-x[1]
x2<-x[2]
2.1754-0.2219*x1-0.1493*x2-0.1656*x1^2-0.2911*x2^2-0.0862*x1*x2}

m2<-function(x) {
x1<-x[1]
x2<-x[2]
10.0005+0.0465*x1+0.0492*x2-0.0139*x1^2-0.0050*x2^2-0.0325*x1*x2}

m3<-function(x) {
x1<-x[1]
x2<-x[2]
95.1074+0.5288*x1+0.6521*x2-0.1746*x1^2-0.1357*x2^2-0.0712*x1*x2}

s1<-function(x) {
x1<-x[1]
x2<-x[2]
0.0311+0.*x1+0.00032*x2-0.01226*x1^2-0.01209*x2^2-0.00075*x1*x2}

s2<-function(x) {
x1<-x[1]
x2<-x[2]
0.003588-0.00022*x1-0.001967*x2+0.001482*x1^2+0.000245*x2^2+0.001375*x1*x2}

s3<-function(x) {
x1<-x[1]
x2<-x[2]
0.17789+0.00683*x1+0.006478*x2-0.07143*x1^2-0.06860*x2^2+0.01338*x1*x2}

# Define the "Big M"
M <- 10

#Defining the Objective Function
objective1<-function(x) {
  x1<-x[1]
  x2<-x[2]
  M*(s1(c(x1,x2)))+M*(s2(c(x1,x2))) + M*(s3(c(x1,x2))) + 
(1/3)*m1(c(x1,x2)) + (1/3)*abs(m2(c(x1,x2))-10) + (1/3)*(100-m3(c(x1,x2)))}

#Optimization
result1 <- nlminb(start = c(-0.3976, 1.5541), objective1, gradient = NULL,
                  hessian = NULL, lower = c(-1.682, -1.682),
                  upper = c(1.682, 1.682))

result1$objective

m1(c(result1$par))
m2(c(result1$par))
m3(c(result1$par))
s1(c(result1$par))
s2(c(result1$par))
s3(c(result1$par))
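
A possible way to impose s1, s2, s3 >= 0 directly is solnp() from package Rsolnp, which accepts nonlinear inequality constraints. A sketch (assumes Rsolnp is installed and that the objective1, s1, s2, s3 functions above have been run; the upper cap of 10 on each constraint is an arbitrary loose bound, and the call is illustrative, not a verified solution):

```r
# Sketch: minimize objective1 subject to s1(x) >= 0, s2(x) >= 0, s3(x) >= 0,
# keeping the same box bounds as the nlminb() call, via Rsolnp::solnp().
library(Rsolnp)

ineq <- function(x) c(s1(x), s2(x), s3(x))   # values that must stay >= 0

res2 <- solnp(pars = c(-0.3976, 1.5541), fun = objective1,
              ineqfun = ineq,
              ineqLB = c(0, 0, 0), ineqUB = c(10, 10, 10),
              LB = c(-1.682, -1.682), UB = c(1.682, 1.682))
res2$pars
ineq(res2$pars)   # all three should now be nonnegative (up to tolerance)
```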


Thanks for any help,
Greg



Re: [R] Plotting skewed normal distribution with a bar plot

2011-11-04 Thread boylangc
Steve,

   I don't profess to be an expert in R (by ANY means), but I've used the 
sn.mle function in package sn to do something similar. So, for example, if you 
have data (y), you can do the following:

y = c(5,4,6,4,5,6,7,7,55,64,23,13,1,4,1,14,1,15,11,12)
sn.mle(y=y)

I just made up this stream of data, so please forgive it if it doesn't make 
sense. The function will produce a number of things, including the MLEs for the 
mean, sd, and skewness, as well as a histogram with the density overlaid on it.
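
The overlay step Steve asked about might look like this — a sketch against the sn API of that era (sn.mle() and cp.to.dp(); newer sn versions use selm() and cp2dp() instead), so treat the function names as assumptions to check against your installed version:

```r
# Sketch: fit a skew-normal by ML, then overlay its density on a type = "h"
# plot of the empirical distribution. API as in the sn package circa 2011.
library(sn)

y   <- c(5,4,6,4,5,6,7,7,55,64,23,13,1,4,1,14,1,15,11,12)
fit <- sn.mle(y = y, plot.it = FALSE)  # centred-parameter estimates
dp  <- cp.to.dp(fit$cp)                # -> location, scale, shape for dsn()

tab <- table(y) / length(y)            # relative frequencies
plot(tab, type = "h", xlab = "y", ylab = "density")
xs <- seq(min(y), max(y), length.out = 200)
lines(xs, dsn(xs, dp[1], dp[2], dp[3]), col = "red")
```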

Maybe this helps? Hope so.

Greg

-----Original Message-----
>From: "R. Michael Weylandt" 
>Sent: Nov 3, 2011 10:39 PM
>To: steve_fried...@nps.gov
>Cc: r-help@r-project.org
>Subject: Re: [R] Plotting skewed normal distribution with a bar plot
>
>It seems like you'll need to apply some sort of MLE to estimate the
>parameters directly from the data before using dsn() to get the
>density. This might help with some of it:
>http://help.rmetrics.org/fGarch/html/snorm.html
>
>Michael
>
>On Thu, Nov 3, 2011 at 2:54 PM,   wrote:
>>
>> Hi,
>>
>> I need to create a plot (type = "h")  and then overlay a skewed-normal
>> curve on this distribution, but I'm not finding a procedure to accomplish
>> this. I want to use the plot function here in order to control the bin
>> distributions.
>>
>> I have explored the sn library and found the dsn function.  dsn uses known
>> location, scaling and shape parameters associated with a given input vector
>> of probabilities.  However, how can I calculate the skewed-normal curve if
>> I don't know these parameters in advance?
>>
>> Is there another function to calculate the skew-normal, perhaps in a
>> different package?
>>
>>
>> I'm working with R 2.13.2 on a windows based machine.
>>
>> Steve Friedman Ph. D.
>> Ecologist  / Spatial Statistical Analyst
>> Everglades and Dry Tortugas National Park
>> 950 N Krome Ave (3rd Floor)
>> Homestead, Florida 33034
>>
>> steve_fried...@nps.gov
>> Office (305) 224 - 4282
>> Fax     (305) 224 - 4147
>>
>



[R] Using MLE Method to Estimate Regression Coefficients

2011-06-14 Thread boylangc
Good Afternoon,

  I am relatively new to R and have been trying to figure out how to estimate 
regression coefficients using the MLE method. Some background: I am trying to 
examine scenarios in which certain estimators might be preferred to others, 
starting with MLE. I understand that MLE will (should) produce the same results 
as ordinary least squares if the assumption of normality holds. That said, in 
the following code (my apologies up front for any lack of elegance), I use the 
data from the printing press study (commonly used in the QE and stats 
literature) to develop first- and second-order models using OLS. Then, using 
some code I found online, I tried to use MLE to do the same thing. However, I 
cannot get it to work: I get an error in my attempt to use the optim function. 
I have been studying the optim function in R, and I have also explored the use 
of MLE via the stats4, MASS, and a few other packages, but to little avail. My 
questions are as follows:

1) Is there a particular error in the MLE code below that I am just not seeing?
2) Is there a simpler, more direct, or otherwise better way to approach it?
3) Which package should I use for MLE regression?

Sorry for the length and thanks in advance for any assistance someone might 
have; I know your time is valuable.  I have pasted my code below but have also 
attached as a .txt file.

v/r,
Greg
Doctoral Student, 
Dept. of Industrial Eng, Clemson University


R-Code

# upload printing press data
# 27 design points with 3 obs each for 81 total points

press  
   first second third  obs
1     -1     -1    -1   34
2      0     -1    -1  115
3      1     -1    -1  192
4     -1      0    -1   82
5      0      0    -1   44
6      1      0    -1  322
7     -1      1    -1  141
8      0      1    -1  259
9      1      1    -1  290
10    -1     -1     0   81
11     0     -1     0   90
12     1     -1     0  319
13    -1      0     0  180
14     0      0     0  372
15     1      0     0  541
16    -1      1     0  288
17     0      1     0  432
18     1      1     0  713
19    -1     -1     1  364
20     0     -1     1  232
21     1     -1     1  408
22    -1      0     1  182
23     0      0     1  507
24     1      0     1  846
25    -1      1     1  236
26     0      1     1  660
27     1      1     1  878
28    -1     -1    -1   10
29     0     -1    -1  116
30     1     -1    -1  186
31    -1      0    -1   88
32     0      0    -1  178
33     1      0    -1  350
34    -1      1    -1  110
35     0      1    -1  251
36     1      1    -1  280
37    -1     -1     0   81
38     0     -1     0  122
39     1     -1     0  376
40    -1      0     0  180
41     0      0     0  372
42     1      0     0  568
43    -1      1     0  192
44     0      1     0  336
45     1      1     0  725
46    -1     -1     1   99
47     0     -1     1  221
48     1     -1     1  415
49    -1      0     1  233
50     0      0     1  515
51     1      0     1  535
52    -1      1     1  126
53     0      1     1  440
54     1      1     1  991
55    -1     -1    -1   28
56     0     -1    -1  130
57     1     -1    -1  263
58    -1      0    -1   88
59     0      0    -1  188
60     1      0    -1  350
61    -1      1    -1   86
62     0      1    -1  259
63     1      1    -1  245
64    -1     -1     0   81
65     0     -1     0   93
66     1     -1     0  376
67    -1      0     0  154
68     0      0     0  372
69     1      0     0  396
70    -1      1     0  312
71     0      1     0  513
72     1      1     0  754
73    -1     -1     1  199
74     0     -1     1  266
75     1     -1     1  443
76    -1      0     1  182
77     0      0     1  434
78     1      0     1  640
79    -1      1     1  168
80     0      1     1  403
81     1      1     1 1161

#define the response and desired predictor variables (first order only)
y<-press$obs
x1<-press$first
x2<-press$second
x3<-press$third

#develop estimators using least squares regression

reg1<-lm(y~x1+x2+x3,data=press)
summary(reg1)

Call:
lm(formula = y ~ x1 + x2 + x3, data = press)

Residuals:
Min  1Q  Median  3Q Max 
-252.56  -83.24   -5.20   57.33  428.44 

Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept)   314.67  12.74  24.696  < 2e-16 ***
x1177.00  15.61  11.342  < 2e-16 ***
x2109.43  15.61   7.012 7.88e-10 ***
x3131.46  15.61   8.424 1.55e-12 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1 

Residual standard error: 114.7 on 77 degrees of freedom
Multiple R-squared: 0.7637, Adjusted R-squared: 0.7544 
F-statistic: 82.93 on 3 and 77 DF,  p-value: < 2.2e-16 

# Now let's fit the Normal Maximum Likelihood model. 
# First, put the data into matrices for the MLE procedure
# I will try the first-order model first to see if I can get it to work...

x<- cbind(1,x1,x2,x3)
y<- as.vector(y)
ones<- x[,1]
 
# Calculate K and n for later use

K <- ncol(x)
K
[1] 4
K1 <- K + 1
n
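
Since the MLE attempt is cut off in the archive, here is a compact base-R sketch of the idea the thread is after — minimizing the negative normal log-likelihood with optim() and checking the coefficients against lm(). The simulated data and all variable names (nn, z1, z2, yy, Z) are mine, purely for illustration:

```r
# MLE regression via optim(), base R only: minimize the negative normal
# log-likelihood over (beta, log sigma); beta should match OLS from lm().
set.seed(1)
nn <- 100
z1 <- rnorm(nn); z2 <- rnorm(nn)
yy <- 2 + 3 * z1 - 1.5 * z2 + rnorm(nn)   # simulated response
Z  <- cbind(1, z1, z2)                     # design matrix with intercept

negll <- function(theta) {
  beta  <- theta[1:3]
  sigma <- exp(theta[4])                   # log-scale keeps sigma positive
  -sum(dnorm(yy, mean = as.vector(Z %*% beta), sd = sigma, log = TRUE))
}

fit <- optim(rep(0, 4), negll, method = "BFGS", hessian = TRUE)
mle <- fit$par[1:3]
ols <- unname(coef(lm(yy ~ z1 + z2)))
round(cbind(mle, ols), 4)                  # columns agree to numerical tolerance
```

With normal errors, the ML estimates of the regression coefficients coincide with OLS at the optimum, so any gap between the two columns reflects only optimizer tolerance.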