Re: Is it relevant to use BinaryClassificationMetrics.aucROC / aucPR with LogisticRegressionModel ?

2015-11-25 Thread filthysocks
jmvllt wrote:
> Here, because the predicted class will always be 0 or 1, there is no way to vary the threshold to get the aucROC, right? Or am I totally wrong?
No, you are right. If you pass a (score, label) pair to BinaryClassificationMetrics, then the score has to be the class probability.
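
To illustrate that point, here is a minimal sketch (not from the original message; it assumes an existing SparkContext `sc`, and the scores are made up) of feeding (probability, label) pairs to BinaryClassificationMetrics:

  // Sketch only: BinaryClassificationMetrics expects an RDD[(Double, Double)]
  // of (score, label) pairs, where the score is the class probability,
  // not a hard 0.0/1.0 prediction.
  import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics

  val scoreAndLabels = sc.parallelize(Seq(
    (0.92, 1.0),  // high probability, positive label
    (0.61, 1.0),
    (0.35, 0.0),
    (0.10, 0.0)
  ))

  val metrics = new BinaryClassificationMetrics(scoreAndLabels)
  println(s"AUC ROC = ${metrics.areaUnderROC()}")
  println(s"AUC PR  = ${metrics.areaUnderPR()}")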

Re: Is it relevant to use BinaryClassificationMetrics.aucROC / aucPR with LogisticRegressionModel ?

2015-11-25 Thread jmvllt
Hi filthysocks, Thanks for the answer. Indeed, using the clearThreshold() function solved my problem :). Regards, Jean.

Is it relevant to use BinaryClassificationMetrics.aucROC / aucPR with LogisticRegressionModel ?

2015-11-24 Thread jmvllt
Hi guys, this may be a stupid question, but I'm facing an issue here. I found the class BinaryClassificationMetrics and wanted to compute the aucROC or aucPR of my model. The thing is that the predict method of a LogisticRegressionModel only returns the predicted class, and not the class probability. Because the predicted class will always be 0 or 1, there is no way to vary the threshold to get the aucROC, right? Or am I totally wrong?

Re: Is it relevant to use BinaryClassificationMetrics.aucROC / aucPR with LogisticRegressionModel ?

2015-11-24 Thread Sean Owen
Your reasoning is correct; you need probabilities (or at least some score) out of the model and not just a 0/1 label in order for a ROC / PR curve to have meaning. But you just need to call clearThreshold() on the model to make it return a probability.
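
To make that concrete, a small sketch (not from the thread; the training data is invented and an existing SparkContext `sc` is assumed; for brevity the training set itself is scored) of calling clearThreshold() before computing the metrics:

  // Sketch only: clearThreshold() makes predict() return the raw class
  // probability instead of a thresholded 0.0/1.0 label, which is what
  // BinaryClassificationMetrics needs.
  import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
  import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
  import org.apache.spark.mllib.linalg.Vectors
  import org.apache.spark.mllib.regression.LabeledPoint

  val data = sc.parallelize(Seq(
    LabeledPoint(1.0, Vectors.dense(2.0, 1.5)),
    LabeledPoint(1.0, Vectors.dense(1.8, 1.1)),
    LabeledPoint(0.0, Vectors.dense(0.2, 0.3)),
    LabeledPoint(0.0, Vectors.dense(0.4, 0.1))
  ))

  val model = new LogisticRegressionWithLBFGS().setNumClasses(2).run(data)

  model.clearThreshold()  // predict() now returns probabilities, not 0/1

  val scoreAndLabels = data.map(p => (model.predict(p.features), p.label))
  val metrics = new BinaryClassificationMetrics(scoreAndLabels)
  println(s"AUC ROC = ${metrics.areaUnderROC()}")

Note that clearThreshold() modifies the model in place and returns it, so it can also be chained directly onto the training call.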