This doesn't help either; there are still errors in the accuracy. Isn't there an exact way to do such calculations?
Karsten Goen <[email protected]> wrote:
> hey all,
>
> I got a problem with floats and calculations. I made a mini-application
> where you get random questions with some science calculations in them,
> so the user can type in his result for the randomly generated values,
> and the user's value is compared against the computer's value. The
> problem is that the user input has only 2 digits behind the '.', like
> 1.42 or 1.75.
>
> Here is the example: http://dpaste.com/hold/158698/
>
> Without Decimal it would be very inaccurate. Decimal is very accurate
> when I have to compare d with the user's calculation from a, b, c, var.
> But when I ask the user for "a", the result gets inaccurate when
> calculating with the same values given before (b, c, d, var).
>
> Maybe someone can help me with this problem; I don't want to generate a
> separate formula for every possible user input. It should also be
> possible for a computer: my calculator at home does the same and is
> much smaller and slower.

d = (a * b) / (c * var)
d = Decimal(d).quantize(Decimal('0.01'))

By quantizing d, the above equality no longer holds. You've got to drop
that line (your calculator doesn't quantize either).

Stefan Krah
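A minimal sketch of Stefan's suggestion: keep d at full precision and round only at comparison time. The sample values for a, b, c, var and the helper name `matches` are illustrative, not taken from the original program. For rational inputs, `fractions.Fraction` is exact, which also answers the "perfect way" question above.

```python
# Sketch: never quantize the stored result, only the comparison.
# Sample values and the helper `matches` are illustrative assumptions.
from decimal import Decimal
from fractions import Fraction

def matches(exact, user_text):
    """True if the exact result, rounded to 2 places, equals the user's answer."""
    return exact.quantize(Decimal('0.01')) == Decimal(user_text)

a, b, c, var = Decimal('3'), Decimal('7'), Decimal('4'), Decimal('5')
d = (a * b) / (c * var)              # keep full precision; do NOT quantize d

print(matches(d, '1.05'))            # 21/20 == 1.05, so this prints True
print((d * c * var) / b == a)        # inverting the formula for `a` still works: True

# For rational inputs, Fraction carries no rounding error at all,
# so every rearrangement of the formula holds exactly:
fa, fb, fc, fvar = Fraction(3), Fraction(7), Fraction(4), Fraction(5)
fd = (fa * fb) / (fc * fvar)
print((fd * fc * fvar) / fb == fa)   # True
```

The point is the same as Stefan's: rounding is a display/comparison step, not something to bake into the value you keep calculating with.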
-- http://mail.python.org/mailman/listinfo/python-list
