> If, based on its value, OpenSSL may decide that it now has “enough” entropy
> and doesn’t need to pull more from other sources before serving randomness
> to requestors – then it is harmful. “Over-confidence” in this value by the
> caller can negatively impact the quality of the produced random numbers.
    
    As long as you have sources that don't provide 1 bit of randomness
    per bit of data, you need an estimate of how much randomness the
    input really contains. And you should probably seriously
    underestimate it, so that you're sure you collect enough.

So let me underestimate it to 0. ;-)  
  
    The problem with ignoring an existing parameter is that people
    could be calling it with, for instance, a value of 0, knowing the
    input contains as good as no entropy. Or they could feed in the
    unwhitened output of a TRNG together with an estimate of the
    randomness it provides. And OpenSSL used to do the right thing
    with that.

I *don’t want* OpenSSL to make *any* estimation of the amount of provided 
entropy. All I want it to do is to mix these bits into the RNG state. It’s *my* 
business how much entropy I’m providing – but I don’t want OpenSSL to make any 
decision regarding pull from other entropy sources based on my input.

Does it sound reasonable? (Heh, it does to me ;)
    
    But now we just ignore it and assume every bit we get contains 1
    bit of randomness, and we're suddenly seriously overestimating the
    amount of randomness we're getting.

If I had my way, you’d assume that every bit contains 0 bits of entropy, but 
mix it in regardless because that’s what the user is asking you to do.


    This is a documented public API, you can't just go and ignore this
    parameter.
    
:-)


-- 
openssl-dev mailing list
To unsubscribe: https://mta.openssl.org/mailman/listinfo/openssl-dev
