Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16013#discussion_r89686241

    --- Diff: core/src/main/scala/org/apache/spark/util/random/SamplingUtils.scala ---
    @@ -67,20 +67,23 @@ private[spark] object SamplingUtils {
       }

       /**
    -   * Returns a sampling rate that guarantees a sample of size >= sampleSizeLowerBound 99.99% of
    -   * the time.
    +   * Returns a sampling rate that guarantees a sample of size greater than or equal to
    +   * sampleSizeLowerBound 99.99% of the time.
        *
        * How the sampling rate is determined:
    +   *
    +   * {{{
        * Let p = num / total, where num is the sample size and total is the total number of
        * datapoints in the RDD. We're trying to compute q > p such that
    -   *   - when sampling with replacement, we're drawing each datapoint with prob_i ~ Pois(q),
    -   *     where we want to guarantee Pr[s < num] < 0.0001 for s = sum(prob_i for i from 0 to total),
    +   *   - when sampling with replacement, we're drawing each datapoint with prob_i ~ Pois(q), where
    +   *     we want to guarantee Pr[s < num] < 0.0001 for s = sum(prob_i for i from 0 to total),
        *     i.e. the failure rate of not having a sufficiently large sample < 0.0001.
        *     Setting q = p + 5 * sqrt(p/total) is sufficient to guarantee 0.9999 success rate for
        *     num > 12, but we need a slightly larger q (9 empirically determined).
        *   - when sampling without replacement, we're drawing each datapoint with prob_i
        *     ~ Binomial(total, fraction) and our choice of q guarantees 1-delta, or 0.9999 success
        *     rate, where success rate is defined the same as in sampling with replacement.
    +   * }}}
    --- End diff --

    I see, there are lots of ">" in here and they're necessary for clarity. Is there any other option for escaping this besides making the whole thing a code block?
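For concreteness, the with-replacement rate described in the quoted Scaladoc can be sketched as follows. This is an illustrative standalone sketch, not Spark's actual `SamplingUtils` implementation; the object and method names here are hypothetical.

```scala
// Hedged sketch of the oversampling-rate formula from the Scaladoc above:
// q = p + c * sqrt(p / total), with c = 9 (the Scaladoc notes that c = 5
// suffices only for num > 12, and 9 was determined empirically).
// Illustrative names; not Spark's actual API.
object SamplingRateSketch {

  /**
   * For sampling with replacement: given a desired sample size `num` out of
   * `total` datapoints, return a Poisson rate q > p = num / total intended to
   * make Pr[sample size < num] < 0.0001.
   */
  def computeFraction(num: Long, total: Long): Double = {
    require(num > 0 && total >= num, "need 0 < num <= total")
    val p = num.toDouble / total
    // Oversample, but a sampling rate can never exceed 1.0.
    math.min(1.0, p + 9.0 * math.sqrt(p / total))
  }
}
```

For example, `computeFraction(1000L, 1000000L)` returns a rate slightly above the naive p = 0.001, so that the Poisson draws undershoot the target sample size only rarely.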