Hi Vibhath,

that is the usual problem with floating-point numbers. The only ways I can imagine to fix it:

* store them as numeric/decimal values in your Postgres DB (might be hard to apply at this stage), or
* round and cast them to numeric/decimal in your Postgres query (i.e. fix the number of digits) and apply an appropriate (Avro) schema, or
* in NiFi, round to the exact number of digits and store as a string or numeric/decimal before writing the CSV
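For background, here is a minimal Python sketch (my own illustration, not from your flow) of why the numeric/decimal route works: binary doubles cannot represent most decimal fractions exactly, while a decimal type keeps the value exactly as written.

```python
from decimal import Decimal

# Binary doubles cannot hold most decimal fractions exactly,
# so arithmetic and re-printing can expose tiny errors:
print(0.1 + 0.2 == 0.3)        # False
print(repr(0.1 + 0.2))         # 0.30000000000000004

# A decimal type keeps the textual value exactly:
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```

The same reasoning applies to Postgres: `double precision` columns store binary doubles, `numeric` columns store decimal digits.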

For the latter to work reliably, you will likely need a single processor (e.g. ExecuteScript/ScriptedTransformRecord, or a custom processor) so that you can round and write out the numbers in one step. If you split that across multiple processors, you might end up in the same situation again.
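As a sketch of that round-and-format step (a hypothetical helper of my own, not NiFi API; a real ExecuteScript/ScriptedTransformRecord body would call something like this per record), assuming you know the target number of digits:

```python
from decimal import Decimal, ROUND_HALF_UP

def to_fixed_text(value: float, digits: int) -> str:
    """Round `value` to `digits` decimal places and return it as text,
    so the CSV writer never re-serializes the raw binary double."""
    step = Decimal(10) ** -digits  # e.g. digits=1 -> Decimal('0.1')
    # Go through str() first: the repr of a double is the shortest
    # decimal that round-trips, which is usually what was typed.
    return str(Decimal(str(value)).quantize(step, rounding=ROUND_HALF_UP))

print(to_fixed_text(4313681553.2999992, 1))  # 4313681553.3
```

The key point is that rounding and string conversion happen together, so no later processor sees the float again.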

This will, of course, only work if you know the exact number of digits in advance, since that cannot be deduced from a floating-point representation on its own.
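To illustrate that last point (again my own example): two different decimal texts can land on the very same double, so the stored bits cannot tell you how many digits the source value originally had.

```python
# "0.3" and a 19-digit truncation of its exact binary expansion parse
# to the same double; the intended digit count is gone once it's a float.
print(float("0.3") == float("0.2999999999999999889"))  # True
```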

I'm sorry if I have missed something.

Best,
Lars

On 8/26/21 5:12 PM, Vibhath Ileperuma wrote:
Hi All,

I have created a NiFi flow to query a PostgreSQL database and write the data into CSV files. However, I noticed that the floating-point values (double values) can change slightly when written to the CSV files. For example, the value 4313681553.3 was written as 4313681553.2999992. Since some of the values I'm extracting are very sensitive, I'm wondering if someone can suggest a way to extract the data exactly as it is.

Thank You
Vibhath


