Hi Maksym.

If you only want the loss to be reweighted by class, you can simply pass `sample_weight` to `fit` to give more emphasis to the samples of that class. If you want some other loss function, you might need to implement your own splitting criterion.
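A minimal sketch of the `sample_weight` approach, assuming a binary problem where errors on the positive class are deemed costlier (the 5:1 cost ratio and the synthetic dataset are illustrative choices, not recommendations):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Illustrative imbalanced binary dataset.
X, y = make_classification(n_samples=200, weights=[0.9, 0.1], random_state=0)

# Weight each sample by its class: here a mistake on class 1 counts
# 5x as much as a mistake on class 0 (hypothetical cost ratio).
cost = {0: 1.0, 1: 5.0}
sample_weight = np.array([cost[label] for label in y])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y, sample_weight=sample_weight)
```

The tree-growing criterion then accounts for the weights when evaluating splits, which pushes the forest toward fewer false negatives on the up-weighted class.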

Cheers,
Andy


On 09/16/2014 08:39 AM, Maksym Ganenko wrote:
Dear community,

Is there any hope that random forests with asymmetric misclassification costs will be implemented in scikit-learn? I mean different costs for false positives and false negatives. Like this:

http://stats.stackexchange.com/questions/18938/how-to-make-a-randomforest-algorithm-cost-sensitive

I'm quite new to machine learning, so maybe there's some simple trick to implement different costs using existing methods?

Maksym




_______________________________________________
Scikit-learn-general mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
