Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11982#discussion_r57528218

    --- Diff: core/src/main/scala/org/apache/spark/partial/SumEvaluator.scala ---
    @@ -56,9 +56,12 @@ private[spark] class SumEvaluator(totalOutputs: Int, confidence: Double)
         val confFactor = {
           if (counter.count > 100) {
             new NormalDistribution().inverseCumulativeProbability(1 - (1 - confidence) / 2)
    -      } else {
    +      } else if (counter.count > 1) {
             val degreesOfFreedom = (counter.count - 1).toInt
             new TDistribution(degreesOfFreedom).inverseCumulativeProbability(1 - (1 - confidence) / 2)
    +      } else {
    +        // No way to meaningfully estimate confidence, so we signal no particular confidence interval
    +        Double.PositiveInfinity
    --- End diff --

    Still don't think this works, since you're multiplying infinity by NaN to get the intervals. It's too odd to begin with, and it results in NaN anyway.
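The reviewer's objection rests on IEEE 754 arithmetic: with a single sample the sample variance is 0/0 = NaN, and NaN propagates through multiplication, so an infinite confidence factor never produces an infinite interval. A minimal sketch (not Spark's actual `SumEvaluator` code; the object and method names here are hypothetical) illustrating the concern:

```scala
// Sketch of the reviewer's point: an infinite confidence factor
// multiplied by a NaN standard-error term still yields NaN.
object NaNTimesInfinity {
  // Models the half-width of a confidence interval:
  // confFactor * sqrt(sampleVariance). Not Spark's exact formula.
  def intervalHalfWidth(confFactor: Double, sampleVariance: Double): Double =
    confFactor * math.sqrt(sampleVariance)

  def main(args: Array[String]): Unit = {
    // With one sample, sample variance is 0/0 = NaN.
    val sampleVariance = 0.0 / 0.0
    val halfWidth = intervalHalfWidth(Double.PositiveInfinity, sampleVariance)
    // Infinity * NaN is NaN under IEEE 754, not an infinite interval.
    println(halfWidth.isNaN) // true
  }
}
```

This is why returning `Double.PositiveInfinity` as the confidence factor does not, by itself, signal "no confidence interval": the downstream multiplication collapses to NaN regardless.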