[
https://issues.apache.org/jira/browse/AVRO-2198?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17929126#comment-17929126
]
Oscar Westra van Holthe - Kind commented on AVRO-2198:
------------------------------------------------------
The issue here is that the JSON value of "amount" is not parsed as a decimal
number. Instead, the string is taken as a UTF-8 encoded sequence of bytes, and
those bytes are then interpreted as the binary encoding of the decimal value
(its two's-complement unscaled integer).
This means that the code works "correctly", but not as expected. Closing this
as a duplicate of AVRO-2087 ("Allow specifying default values for logical types
in human-readable form").
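A minimal sketch (using only java.math, nothing Avro-specific) reproduces the
reported number: the eleven UTF-8 bytes of the string "20000000.11", read as a
two's-complement unscaled integer with scale 2, are exactly
606738534879530359915932.65:
{code:java}
import java.math.BigDecimal;
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;

public class DecimalBytesDemo {
    public static void main(String[] args) {
        // the JSON string taken as raw UTF-8 bytes, as happens in the report
        byte[] utf8 = "20000000.11".getBytes(StandardCharsets.UTF_8);
        // decoded the way the decimal logical type decodes bytes:
        // two's-complement unscaled integer plus the schema's scale
        System.out.println(new BigDecimal(new BigInteger(utf8), 2));
        // prints 606738534879530359915932.65
    }
}
{code}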
> BigDecimal (logical type=Decimal / type = bytes) GenericRecord to
> SpecificRecord Conversion Issue
> -------------------------------------------------------------------------------------------------
>
> Key: AVRO-2198
> URL: https://issues.apache.org/jira/browse/AVRO-2198
> Project: Apache Avro
> Issue Type: Bug
> Components: java
> Affects Versions: 1.8.2
> Reporter: ABourg
> Priority: Major
> Fix For: 1.8.2
>
>
> There seems to be an issue with the conversion process from a byte array to a
> BigDecimal when converting a GenericRecord object to a SpecificRecord.
> Below is a simple Avro definition with "amount" defined as logical type
> *decimal* and type *bytes*. The avroTrans specific class has been generated
> with enablebigdecimal = true. See the example below.
> An amount value of *20000000.11* is converted to the BigDecimal value
> *606738534879530359915932.65*.
> {code:java}
> import java.io.ByteArrayOutputStream;
>
> import org.apache.avro.Schema;
> import org.apache.avro.generic.GenericDatumWriter;
> import org.apache.avro.generic.GenericRecord;
> import org.apache.avro.io.Decoder;
> import org.apache.avro.io.DecoderFactory;
> import org.apache.avro.io.Encoder;
> import org.apache.avro.io.EncoderFactory;
> import org.apache.avro.specific.SpecificDatumReader;
>
> String schema =
>     "{\"type\":\"record\",\"name\":\"avroTrans\",\"namespace\":\"com.demo.KafkaStream\",\"fields\":[{\"name\":\"amount\",\"type\":{\"type\":\"bytes\",\"logicalType\":\"decimal\",\"precision\":5,\"scale\":2}}]}";
> String json = "{\"amount\": \"20000000.11\"}";
> Schema avroSchema = new Schema.Parser().parse(schema);
>
> // parse the JSON payload into a GenericRecord (Utils.jsonToAvro is sketched below)
> GenericRecord obj = Utils.jsonToAvro(json, avroSchema);
> System.out.println("GenericRecord Object Value ->" + obj);
>
> // serialize the GenericRecord to Avro binary
> GenericDatumWriter<GenericRecord> writer =
>     new GenericDatumWriter<>(avroTrans.getClassSchema());
> ByteArrayOutputStream out = new ByteArrayOutputStream();
> Encoder encoder = EncoderFactory.get().binaryEncoder(out, null);
> writer.write(obj, encoder);
> encoder.flush();
> byte[] avroData = out.toByteArray();
> out.close();
>
> // deserialize the binary data into the generated SpecificRecord class
> SpecificDatumReader<avroTrans> reader2 = new SpecificDatumReader<>(avroTrans.class);
> Decoder decoder2 = DecoderFactory.get().binaryDecoder(avroData, null);
> avroTrans customRecord = reader2.read(null, decoder2);
> System.out.println("SpecificRecord Object Value -> " + customRecord);
> {code}
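> The snippet relies on a Utils.jsonToAvro helper that the report does not
> include; a plausible sketch of it, assuming it simply wraps Avro's
> JsonDecoder, is:
> {code:java}
> import java.io.IOException;
>
> import org.apache.avro.Schema;
> import org.apache.avro.generic.GenericDatumReader;
> import org.apache.avro.generic.GenericRecord;
> import org.apache.avro.io.Decoder;
> import org.apache.avro.io.DecoderFactory;
>
> public class Utils {
>     // hypothetical reconstruction: decode a JSON string into a GenericRecord
>     public static GenericRecord jsonToAvro(String json, Schema schema) throws IOException {
>         GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
>         Decoder decoder = DecoderFactory.get().jsonDecoder(schema, json);
>         return reader.read(null, decoder);
>     }
> }
> {code}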
> *Output:*
> GenericRecord Object Value ->{"amount": {"bytes": "20000000.11"}}
> SpecificRecord Object Value -> {"amount": 606738534879530359915932.65}
> Within *org.apache.avro.Conversion* there is a fromBytes conversion method
> that takes a ByteBuffer as input (see below).
>
> {code:java}
> @Override
> public BigDecimal fromBytes(ByteBuffer value, Schema schema, LogicalType type) {
>   int scale = ((LogicalTypes.Decimal) type).getScale();
>   // always copy the bytes out because BigInteger has no offset/length ctor
>   byte[] bytes = new byte[value.remaining()];
>   value.get(bytes);
>   return new BigDecimal(new BigInteger(bytes), scale);
> }
> {code}
> The BigInteger(byte[]) constructor used here "*_Translates a byte array
> containing the two's-complement binary representation of a BigInteger into a
> BigInteger_*."
> Could the use of BigInteger(byte[]) be causing the incorrect conversion to a
> huge number?
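> For comparison, here is a minimal sketch (not from the report; it assumes
> Avro 1.8.x with org.apache.avro.Conversions on the classpath) of the byte
> layout fromBytes expects: the two's-complement bytes of the unscaled value,
> rather than UTF-8 text.
> {code:java}
> import java.math.BigDecimal;
> import java.nio.ByteBuffer;
>
> import org.apache.avro.Conversions;
> import org.apache.avro.LogicalTypes;
> import org.apache.avro.Schema;
>
> public class DecimalRoundTrip {
>     public static void main(String[] args) {
>         // bytes schema carrying decimal(5, 2), matching the report's field
>         Schema bytesSchema = LogicalTypes.decimal(5, 2)
>                 .addToSchema(Schema.create(Schema.Type.BYTES));
>         Conversions.DecimalConversion conversion = new Conversions.DecimalConversion();
>
>         // encode: 123.45 -> unscaled 12345 -> two's-complement bytes {0x30, 0x39}
>         ByteBuffer encoded = conversion.toBytes(new BigDecimal("123.45"),
>                 bytesSchema, bytesSchema.getLogicalType());
>
>         // decode: the inverse, performed by the fromBytes method quoted above
>         BigDecimal decoded = conversion.fromBytes(encoded,
>                 bytesSchema, bytesSchema.getLogicalType());
>         System.out.println(decoded); // prints 123.45
>     }
> }
> {code}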