trip...@gmail.com writes:

> I am trying to round off values in a dict to 2 decimal places but
> have been unsuccessful so far. The input I have is like this:
> 
>     y = [{'a': 80.0, 'b': 0.0786235, 'c': 10.0, 'd': 10.6742903},
>          {'a': 80.73246, 'b': 0.0, 'c': 10.780323, 'd': 10.0},
>          {'a': 80.7239, 'b': 0.7823640, 'c': 10.0, 'd': 10.0},
>          {'a': 80.7802313217234, 'b': 0.0, 'c': 10.0, 'd': 10.9762304}]
> 
> I want to round off all the values to two decimal places using the
> ceil function. Here's what I have:
> 
>     def roundingVals_toTwoDeci():
>         global y
>         for d in y:
>             for k, v in d.items():
>                 v = ceil(v*100)/100.0
>         return
>     roundingVals_toTwoDeci()
> 
> But it is not working - I am still getting the old values.

You are assigning to a local variable, v, which holds only a copy of
the value; rebinding it never touches the dict. Instead, store the new
value back into the dict like this:

   d[k] = ceil(v*100)/100.0
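
Putting it together, a minimal corrected version might look like this
(assuming the ceil you're calling comes from the math module, since
your snippet doesn't show the import):

    from math import ceil

    def roundingVals_toTwoDeci(data):
        # Store each rounded value back into the dict it came from;
        # rebinding the loop variable v never touches the dict.
        for d in data:
            for k, v in d.items():
                d[k] = ceil(v * 100) / 100.0

    roundingVals_toTwoDeci(y)

Taking the list as a parameter also sidesteps the global question
below.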

And you don't need to declare y global. It would only be needed if you
assigned directly to it, as in y = ... (usually not a good idea).

The rounding may not work the way you expect, because float values are
stored in binary, and most two-decimal fractions (0.01, 0.03, ...)
have no exact binary representation. You may need the decimal module
instead, or you may just need to format the values when printing.
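
If the values themselves (not just their printed form) need to carry
exactly two decimal places, a sketch using the decimal module with
ceiling rounding would be:

    from decimal import Decimal, ROUND_CEILING

    for d in y:
        for k, v in d.items():
            # Going through str(v) avoids inheriting the float's
            # full binary representation
            d[k] = Decimal(str(v)).quantize(Decimal('0.01'),
                                            rounding=ROUND_CEILING)

If formatting on output is enough, '%.2f' % v (or format(v, '.2f'))
avoids the issue entirely, though those round to nearest rather than
up.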