Re: [R] Need help on nnet

2011-06-28 Thread arun
Hi Georg,

I am new to R and I am curious if there is a simple way to do the feature
selection you described:

"feature selection is essentially an exhaustive approach which tries
every possible subset of your predictors, trains a network and sees what
the prediction error is. The subset which is best (lowest error) is then
chosen in the end. It normally (as a side-effect) also gives you something
like an importance ranking of the variables when using backward or forward
feature selection. But be careful of interactions between variables. "

Is this an option within nnet, or should I use leaps in conjunction with nnet?

Thanks,
Arun





Re: [R] Need help on nnet

2010-12-13 Thread Georg Ruß
On 10/12/10 02:56:13, jothy wrote:
> I am working on a neural network.
> Below is the code and the output [...]

> > summary(uplift.nn)
>
> a 3-3-1 network with 16 weights
>
> options were -
>
>   b->h1  i1->h1  i2->h1  i3->h1
>   16.64    6.62  149.93    2.24
>   b->h2  i1->h2  i2->h2  i3->h2
>  -42.79  -17.40 -507.50   -5.14
>   b->h3  i1->h3  i2->h3  i3->h3
>    3.45    1.87   18.89    0.61
>    b->o   h1->o   h2->o   h3->o
>  402.81   41.29  236.76    6.06

> Q1: How do I interpret the above output?

The summary above lists the internal weights that were learned during
the neural network training in nnet(). From my point of view, I wouldn't
try to read any meaning into those weights, especially if you have
multiple predictor variables.

> Q2: My objective is to know the contribution of each independent variable.

You may try something like variable importance (VI) approaches or
feature selection approaches.

1) In VI you have a training and a test set, as in normal
cross-validation. You train your network on the training set and use
the trained network to predict the test values. The key step in VI is
then to pick one variable at a time, permute its values in the test set
only (!), and see how much the prediction error deviates from the
original prediction error on the unpermuted test set. Repeat this many
times to get a meaningful result, and also be sure to use plenty of
cross-validation splits. The more the prediction error rises, the more
important the respective variable is. This approach accounts for
interactions between variables.
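
A rough sketch of this permutation-based VI idea in R (the data frame
name mydata and the 70/30 split are made up for illustration; the
predictor and response names are taken from your nnet() call):

library(nnet)

set.seed(1)
train.idx <- sample(nrow(mydata), 0.7 * nrow(mydata))
train <- mydata[train.idx, ]
test  <- mydata[-train.idx, ]

fit <- nnet(PVU ~ ConsumerValue + Duration + PromoVolShare,
            data = train, size = 3, linout = TRUE, trace = FALSE)

## baseline error on the unpermuted test set
base.err <- mean((predict(fit, test) - test$PVU)^2)

## permute one predictor at a time in the test set only and
## record the increase in prediction error
vi <- sapply(c("ConsumerValue", "Duration", "PromoVolShare"), function(v) {
  errs <- replicate(100, {
    perm <- test
    perm[[v]] <- sample(perm[[v]])
    mean((predict(fit, perm) - perm$PVU)^2)
  })
  mean(errs) - base.err
})
vi   # larger values = more important variables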

2) feature selection is essentially an exhaustive approach which tries
every possible subset of your predictors, trains a network and sees what
the prediction error is. The subset which is best (lowest error) is then
chosen in the end. It normally (as a side-effect) also gives you something
like an importance ranking of the variables when using backward or forward
feature selection. But be careful of interactions between variables.
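
Here is a rough sketch of the exhaustive variant, reusing the train/test
split and error measure from the sketch above (with only three
predictors the 7 subsets are cheap to try; with many predictors you
would switch to forward or backward selection instead):

predictors <- c("ConsumerValue", "Duration", "PromoVolShare")

## all non-empty subsets of the predictors
subsets <- unlist(lapply(seq_along(predictors),
                         function(k) combn(predictors, k, simplify = FALSE)),
                  recursive = FALSE)

## test error of a network trained on each subset
errs <- sapply(subsets, function(vars) {
  f <- reformulate(vars, response = "PVU")
  fit <- nnet(f, data = train, size = 3, linout = TRUE, trace = FALSE)
  mean((predict(fit, test) - test$PVU)^2)
})

subsets[[which.min(errs)]]   # best subset = lowest test error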

> Q3: Which neural network package provides the AIC or BIC values?

You may try training with the multinom() function, as pointed out in
msg09297:
http://www.mail-archive.com/r-help@stat.math.ethz.ch/msg09297.html
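
For example, if your response were categorical (which is what multinom()
expects), something like the following should work, since multinom
objects have a logLik() method:

library(nnet)

fit <- multinom(PVU ~ ConsumerValue + Duration + PromoVolShare,
                data = train, trace = FALSE)
AIC(fit)   # AIC of the fitted multinomial log-linear model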

I hope this gives you some useful keywords and places to look.

Regards,
Georg.
-- 
Research Assistant
Otto-von-Guericke-Universität Magdeburg
resea...@georgruss.de
http://research.georgruss.de



[R] Need help on nnet

2010-12-10 Thread jothy

Hi,


I am working on a neural network.

Below is the code and the output:

> library(nnet)
> uplift.nn <- nnet(PVU ~ ConsumerValue + Duration + PromoVolShare, data = y, size = 3)
# weights:  16
initial  value 4068.052704
final  value 3434.194253
converged
> summary(uplift.nn)
a 3-3-1 network with 16 weights
options were -
  b->h1  i1->h1  i2->h1  i3->h1
  16.64    6.62  149.93    2.24
  b->h2  i1->h2  i2->h2  i3->h2
 -42.79  -17.40 -507.50   -5.14
  b->h3  i1->h3  i2->h3  i3->h3
   3.45    1.87   18.89    0.61
   b->o   h1->o   h2->o   h3->o
 402.81   41.29  236.76    6.06

I have a few questions and would appreciate some help:
Q1: How do I interpret the above output?
Q2: My objective is to know the contribution of each independent variable.
Q3: Which neural network package provides the AIC or BIC values?


Regards
jothy
