On Thu, Nov 25, 2010 at 1:07 AM, ivek gimmick <[email protected]> wrote:

> Hi guys,
>
>    I have a question regarding Bayes classifiers.  My use case is to
> classify text as good or bad.
>
>    My question is, do I have to train the classifier with both good and bad
> examples?




Yes. Training with both good and bad examples will give you better results.

> Or is it enough to train with just one of them, good or bad?
>
>

Otherwise it would be like trying to teach a person to identify 'sweet' by
giving them only 'salt' as a sample.
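
As a rough sketch of the two-class setup (assuming NLTK's
NaiveBayesClassifier here; the sample texts and the bag_of_words feature
function are just illustrative placeholders, not your data):

import nltk

# Toy training data -- replace with your own 'good' and 'bad' texts.
train_texts = [
    ("this product is great and works well", "good"),
    ("excellent service, very happy", "good"),
    ("terrible quality, broke after a day", "bad"),
    ("awful experience, do not buy", "bad"),
]

def bag_of_words(text):
    # Very simple bag-of-words feature extractor.
    return {word: True for word in text.lower().split()}

train_set = [(bag_of_words(text), label) for text, label in train_texts]
classifier = nltk.NaiveBayesClassifier.train(train_set)

print(classifier.classify(bag_of_words("great service, very happy")))  # expected: 'good'
print(classifier.classify(bag_of_words("terrible, do not buy")))       # expected: 'bad'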


>    For example, suppose I have only bad data.  I train the classifier with
> this data and generate a model.  If I try the model on a new set of data,
> will it classify something that is not bad?  If so, what would its label be?
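
If you train with only 'bad' examples, the model only ever learns that one
label, so everything you classify with it will come back as 'bad' -- it has
no other label to choose from. Continuing the toy sketch above (again, just
an illustration with the same assumed bag_of_words helper):

bad_only = [(bag_of_words(text), "bad") for text in [
    "terrible quality, broke after a day",
    "awful experience, do not buy",
]]
one_class = nltk.NaiveBayesClassifier.train(bad_only)

print(one_class.labels())                                     # ['bad']
print(one_class.classify(bag_of_words("great, very happy")))  # 'bad' -- the only label it knows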
>



-- 
**********************************
JAGANADH G
http://jaganadhg.freeflux.net/blog
