Dear Michael,
Please see comments below, interspersed with your questions:
On Wed, 4 Sep 2013 22:18:57 -0400
"Michael Hacker" wrote:
> Dear Colleagues,
>
>
>
> I'm working on a Delphi study comparing perceptions of high school
> technology teachers and university engineering educators about
Hi Michael,
See comments in line.
On Wed, Sep 4, 2013 at 10:18 PM, Michael Hacker wrote:
> Dear Colleagues,
>
>
>
> I'm working on a Delphi study comparing perceptions of high school
> technology teachers and university engineering educators about the
> importance of concepts about engineering f
Dear Colleagues,
I'm working on a Delphi study comparing perceptions of high school
technology teachers and university engineering educators about the
importance of concepts about engineering for HS students to learn as part of
their fundamental education. I'm actually doing this as part of my
Hello List,
I am trying to do the following:
1. Use a BandPassFilter h[n] on a series x[n]
2. Then Decimate the series by a factor D such that y[k]=x[k*D]
The decimation factor is considerable (10,000), so filtering before the
decimation using convolve() seems foolish.
Can you think of an R
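One way to avoid forming the full-length convolution (a sketch only; x, h and D
are placeholder names, since the actual series and filter are not shown) is to
evaluate the FIR output just at the samples that survive decimation:

# compute the band-pass FIR output only at every D-th position, so the
# convolution over the whole series is never formed; h is assumed to be
# a causal finite impulse response of length L
decimate_filtered <- function(x, h, D) {
  L <- length(h)
  keep <- seq(L, length(x), by = D)    # output positions, once the filter has enough history
  vapply(keep, function(k) sum(h * x[k - 0:(L - 1)]), numeric(1))
}
# for moderate sizes, stats::filter(x, h, sides = 1) followed by
# y[seq(length(h), length(x), by = D)] gives the same values

With D = 10,000 this costs only length(x)/10000 inner products of length(h);
decimating in stages (e.g. 100 then 100 again) with shorter filters is another
common way to keep the work down.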
Hi Michael,
If I remember right, this question has been asked several times on
this mailing list in the past. The reference listed in the help page
for poly explains how to get the un-orthogonalized coefficients, but
those coefficients aren't needed for prediction. For more details,
though, sear
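A small illustration of the prediction point, with made-up x and y: the poly()
basis is rebuilt automatically for new data, so the un-orthogonalized
coefficients are never needed.

set.seed(1)
x <- 1:20
y <- 2 + 3 * x - 0.5 * x^2 + rnorm(20)
fit <- lm(y ~ poly(x, 2))
# predict() reconstructs the orthogonal basis at the new x values:
predict(fit, newdata = data.frame(x = c(2.5, 10.25)))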
How can I get the result of, e.g., poly(1:3, degree=2) to give me the
unnormalized integer coefficients
usually used to explain orthogonal polynomial contrasts, e.g.,
-1 1
0 -2
1 1
As I understand things, the columns of x^{1:degree} are first centered
and then are normalized by 1/sqrt(col
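For the small equally spaced case above, one way to get back to the integer
version (a sketch; as far as I know poly() has no option for this) is to
rescale each normalized column by its smallest nonzero magnitude:

p <- poly(1:3, degree = 2)   # orthonormal columns (-1,0,1)/sqrt(2) and (1,-2,1)/sqrt(6)
ints <- apply(p, 2, function(col) col / min(abs(col[abs(col) > 1e-8])))
round(ints)                  # recovers the integer contrasts -1 0 1 and 1 -2 1

This works for the small textbook cases; for longer sequences the rescaled
entries need not come out as integers.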
Hi All,
I was hoping someone could save me the trouble of reading through source code
and answer a quick question of mine regarding poly(). Does the poly() function
use a classical orthogonal polynomial series to fit polynomial models, or does
poly() generate a unique series of orthogonal poly
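As far as I remember, poly() orthogonalizes the powers of the centered x values
(via a QR decomposition), so the basis is specific to the x you supply rather
than one of the classical polynomial families. A quick check with made-up,
unequally spaced points:

x <- c(0.3, 1.1, 2.7, 3.0, 5.9, 8.4)   # arbitrary, unequally spaced values
P <- poly(x, degree = 3)
zapsmall(crossprod(P))    # identity matrix: the columns are orthonormal for these x
zapsmall(colSums(P))      # zeros: each column is also orthogonal to the intercept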
On Jul 17, 2009, at 5:25 PM, Ulrike Grömping wrote:
David Winsemius wrote:
On Jul 17, 2009, at 3:24 PM, Ulrike Grömping wrote:
David,
thanks. Your explanation does not quite fit, though, as it refers to using
function data.frame, while I assigned the new column with $<-. poly() does
retu
David Winsemius wrote:
>
> On Jul 17, 2009, at 3:24 PM, Ulrike Grömping wrote:
>
>> David,
>>
>> thanks. Your explanation does not quite fit, though, as it refers to using
>> function data.frame, while I assigned the new column with $<-. poly() does
>> return an object of classe
On Jul 17, 2009, at 3:24 PM, Ulrike Grömping wrote:
David,
thanks. Your explanation does not quite fit, though, as it refers to using
function data.frame, while I assigned the new column with $<-. poly() does
return an object of classes poly and matrix, not model.matrix,
But model.matr
David,
thanks. Your explanation does not quite fit, though, as it refers to using
function data.frame, while I assigned the new column with $<-. poly() does
return an object of classes poly and matrix, not model.matrix, and handing a
poly object to function data.frame does behave like I would ex
Dataframes are lists. Look at dat with str and you will see that the
third column (actually the third list element) is a matrix. It's not
hard to find the documentation. If you read the documentation on the
help page for data.frame you should see this:
"If a list or data frame or matrix is
Dear UseRs,
I just learnt that the number of columns of a data frame is not always what
I thought it to be, and I wonder where I should have learnt about this.
Consider the following example:
dat <- data.frame(X1=1:10, X2=LETTERS[1:10])
ncol(dat) ## evaluates to 2 (of course)
dat$X1poly
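A minimal reconstruction of what seems to be going on here (the assignment line
is guessed, since the post is cut off just before it):

dat <- data.frame(X1 = 1:10, X2 = LETTERS[1:10])
dat$X1poly <- poly(dat$X1, 2)   # guessed line: assigns a 10 x 2 matrix as ONE element
ncol(dat)                       # 3, not 4: the matrix counts as a single column
str(dat)                        # shows X1poly as a matrix stored inside that column
dim(dat$X1poly)                 # 10 2: the two basis columns live inside it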
They have different coefficients because their model matrices
are different but they both lead to the same predictions:
> fitted(lm(y~1+x+I(x^2)))
  1   2   3   4   5   6   7   8   9  10 
  1   4   9  16  25  36  49  64  81 100 
> fitted(lm(y~poly(x,2)))
  1   2   3   4   5   6   7   8   9  10 
  1
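To see the equivalence directly on the same toy data, compare predictions from
the two parameterizations at new x values:

x <- 1:10; y <- x^2
f1 <- lm(y ~ 1 + x + I(x^2))    # raw powers of x
f2 <- lm(y ~ poly(x, 2))        # orthogonal basis
newx <- data.frame(x = c(2.5, 7.25))
all.equal(predict(f1, newx), predict(f2, newx))   # TRUE: same fit, different basis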
Hi,
I want to do a polynomial regression of y on x of degree 2, as follows:
> x<-1:10
> y<-x^2
> lm(y~poly(x,2))
Call:
lm(formula = y ~ poly(x, 2))
Coefficients:
(Intercept)  poly(x, 2)1  poly(x, 2)2 
      38.50        99.91        22.98 
Which is not what I had expected.
If I wrote the expre
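If the goal is coefficients on the familiar 1, x, x^2 scale, poly() has a raw
argument for that (shown on the same toy data; the fitted values are unchanged
either way):

x <- 1:10
y <- x^2
coef(lm(y ~ poly(x, 2, raw = TRUE)))
# the intercept and linear term come out (numerically) zero and the
# quadratic term comes out 1, i.e. y = x^2 as expected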