GitHub user sethah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16149#discussion_r91021502
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/regression/GeneralizedLinearRegression.scala ---
    @@ -479,7 +479,12 @@ object GeneralizedLinearRegression extends DefaultParamsReadable[GeneralizedLine
             numInstances: Double,
             weightSum: Double): Double = {
           -2.0 * predictions.map { case (y: Double, mu: Double, weight: Double) =>
    -        weight * dist.Binomial(1, mu).logProbabilityOf(math.round(y).toInt)
    +        val wt = math.round(weight).toInt
    +        if (wt == 0) {
    +          0.0
    +        } else {
    +          dist.Binomial(wt, mu).logProbabilityOf(math.round(y * weight).toInt)
    --- End diff --
    
    So I think the real issue here is that we don't currently allow users to
specify a binomial GLM using grouped successes/trials data. One way to mash
that kind of grouped data into the format Spark requires is the process
@actuaryzhang described above: encode each group of k successes out of n
trials as a row with label k / n and weight n. But then the log-likelihood
computation needs to be adjusted accordingly, as was also noted.
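    For concreteness, here is a minimal sketch of that encoding (my own
illustration, not Spark API; GroupedRow, WeightedRow, and toWeightedBernoulli
are hypothetical names): a group of k successes out of n trials becomes a
single row with label k / n and weight n, which is exactly what the patched
code decodes via math.round(y * weight).

        case class GroupedRow(successes: Double, trials: Double)
        case class WeightedRow(label: Double, weight: Double)

        // k successes out of n trials -> one row: label = k / n, weight = n
        def toWeightedBernoulli(g: GroupedRow): WeightedRow =
          WeightedRow(label = g.successes / g.trials, weight = g.trials)

        // e.g. 3 successes out of 5 trials -> WeightedRow(0.6, 5.0);
        // the patch then recovers math.round(0.6 * 5.0).toInt == 3 successes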
    
    So @srowen is correct in saying that this is inaccurate for non-integer
weights. I checked with R's glmnet, and it seems to treat data weights for a
binomial GLM as the number of trials, which is the semantics this patch
adopts: it logs a warning when you supply non-integer weights, then proceeds
with the method proposed in this patch.
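    Something like the sketch below would mirror that glmnet behavior. This is
just my illustration (binomialLogLik is a hypothetical name, and println
stands in for Spark's Logging.logWarning), but the body matches the diff:

        import breeze.stats.{distributions => dist}

        // Per-observation term of the patched log-likelihood, warning
        // glmnet-style when a non-integer weight has to be rounded.
        def binomialLogLik(y: Double, mu: Double, weight: Double): Double = {
          if (weight != math.floor(weight)) {
            println(s"Rounding non-integer weight $weight; binomial " +
              "weights are interpreted as trial counts")
          }
          val wt = math.round(weight).toInt
          if (wt == 0) {
            0.0
          } else {
            dist.Binomial(wt, mu).logProbabilityOf(math.round(y * weight).toInt)
          }
        }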
    
    So, this actually _does_ match R's behavior and I am in favor of the 
change. But we need to log appropriate warnings and write good unit tests. What 
are others' thoughts?
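    If we do write those tests, one concrete check is that the patched term
equals the closed-form binomial log-pmf, log C(n, k) + k log(mu) +
(n - k) log(1 - mu). A sketch of that check (my own, not from the patch):

        import breeze.stats.{distributions => dist}

        // 3 successes out of 5 trials, mu = 0.4; C(5, 3) = 10
        val expected = math.log(10.0) + 3 * math.log(0.4) + 2 * math.log(0.6)
        val actual = dist.Binomial(5, 0.4).logProbabilityOf(3)
        assert(math.abs(actual - expected) < 1e-8)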

