In article <[EMAIL PROTECTED]>,
Konrad Den Ende <[EMAIL PROTECTED]> wrote:
>The function
>L = \sum_{i=1}^b \sum_{j=1}^k (y_{ij} - \beta_i - \mu_j)^2
>is given. The task is to differentiate it with respect to all b times k
>parameters (which isn't very difficult) and use it to calculate the
>estimates for all the \beta_i's and \mu_j's.

>Any suggestions on how to do that? I've tried different approaches but the
>\beta somehow always gets canceled out and nothing fun comes out of my
>calculations...

I see only b plus k parameters, not b times k -- and effectively
one fewer, since adding a constant to all the \beta's and
subtracting it from all the \mu's leaves L unchanged.  That is why
the \beta's keep "canceling out": the individual parameters are not
identifiable without a side constraint.
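To see where the estimates come from, set each partial derivative
to zero (a sketch; the bar notation for row and column means is
mine, not from the original post):

```latex
\frac{\partial L}{\partial \beta_i}
  = -2\sum_{j=1}^{k}\bigl(y_{ij} - \beta_i - \mu_j\bigr) = 0
  \quad\Rightarrow\quad
  \beta_i = \bar y_{i\cdot} - \bar\mu,
  \qquad
  \bar y_{i\cdot} = \tfrac{1}{k}\sum_{j} y_{ij},\;
  \bar\mu = \tfrac{1}{k}\sum_{j}\mu_j .

\frac{\partial L}{\partial \mu_j}
  = -2\sum_{i=1}^{b}\bigl(y_{ij} - \beta_i - \mu_j\bigr) = 0
  \quad\Rightarrow\quad
  \mu_j = \bar y_{\cdot j} - \bar\beta,
  \qquad
  \bar\beta = \tfrac{1}{b}\sum_{i}\beta_i .
```

These equations pin down every sum \beta_i + \mu_j but not the
split between the two, which is exactly the one-parameter
deficiency above.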

In fact, this is the classical two-way "analysis of variance"
problem, which writes the fitted value as mean + \beta_i + \mu_j,
where mean is the overall mean and the \beta's and \mu's are each
constrained to sum to zero.
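A quick numeric check of the row/column-mean solution (the variable
names and the random data are mine; one convenient resolution of the
constant-shift freedom is to let the \mu's absorb the overall mean):

```python
import numpy as np

# Simulated b x k table; the fit below does not depend on how y was made.
rng = np.random.default_rng(0)
b, k = 4, 5
y = rng.normal(size=(b, k))

grand = y.mean()            # ybar_{..}
row_means = y.mean(axis=1)  # ybar_{i.}
col_means = y.mean(axis=0)  # ybar_{.j}

# One solution of the normal equations (unique only up to a
# constant shift between the beta's and the mu's):
beta = row_means - grand    # sums to zero
mu = col_means              # absorbs the overall mean

fitted = beta[:, None] + mu[None, :]
resid = y - fitted

# At the minimum, every row sum and column sum of the residuals
# must vanish -- that is exactly what the normal equations say.
print(np.abs(resid.sum(axis=1)).max())  # ~0 up to rounding
print(np.abs(resid.sum(axis=0)).max())  # ~0 up to rounding
```

Shifting any constant c from mu into beta leaves `fitted`, and hence
L, unchanged, which is the non-identifiability in action.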

-- 
This address is for information only.  I do not claim that these views
are those of the Statistics Department or of Purdue University.
Herman Rubin, Department of Statistics, Purdue University
[EMAIL PROTECTED]         Phone: (765)494-6054   FAX: (765)494-0558