Hi all,

I've tried to make a decision tree for the following data set:

    Level         X Response
279     C 2.4728646   -9.445
341     B 0.5986398   -9.413
343     B 1.1786271   -9.413
384     D 1.4797870   -9.413
390     C 2.0364569   -9.133
391     D 0.9365739   -9.133
452     A 1.2858741  -11.480
455     C 1.3256245   -9.413
510     C 0.5758865   -9.413
537     D 1.9289431   -9.413
540     C 1.8646144   -9.413
554     B 1.3903752  -10.080

Using these commands:

> library(rpart)
> fit <- rpart(Response ~ X + Level, data = decTree, method = "anova",
+              control = rpart.control(minsplit = 1))
> printcp(fit)  # display the cp table

Regression tree:
rpart(formula = Response ~ X + Level, data=decTree, method = "anova", 
    control = rpart.control(minsplit = 1))

Variables actually used in tree construction:
character(0)

Root node error: 4.4697/12 = 0.37247

n= 12 

    CP nsplit rel error
1 0.01      0         1


I don't get a tree, and plotting fails:

> plot(fit) # plot decision tree  
Error in plot.rpart(fit) : fit is not a tree, just a root
> text(fit) # label the decision tree plot  
Error in text.rpart(fit) : fit is not a tree, just a root
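
One guess on my side (untested): the cp table above shows only the default complexity parameter cp = 0.01, so perhaps every candidate split is being pruned away because its improvement falls below that threshold. Something like the following sketch, lowering cp in rpart.control, might be what's needed, but I'm not sure:

```r
library(rpart)

# Sketch (assumption, not verified): allow splits with a very small
# improvement by lowering cp from its default of 0.01.
fit <- rpart(Response ~ X + Level, data = decTree, method = "anova",
             control = rpart.control(minsplit = 2, cp = 0.001))
printcp(fit)   # check whether any splits survive now
```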


Can anyone tell me what's going wrong and give me a hint on how to fix it?


Best regards,


Joel


______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.