[ https://issues.apache.org/jira/browse/AVRO-2471?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16918639#comment-16918639 ]

Nandor Kollar commented on AVRO-2471:
-------------------------------------

I think this brings up a broader question: how should we handle logical types when 
more than one logical type mapping is registered for the same Java type (in this 
case for {{Instant}})? For generic and specific data the conversion to use can be 
selected based on the logical type in the schema, but how do we resolve the 
ambiguity for reflective data, where the schema is inferred from the Java types and 
no additional logical type information is available? Should it default to the first 
match among the registered conversions, or should it throw an error and require an 
explicit {{AvroSchema}} annotation?
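
To make the ambiguity concrete, here is a minimal sketch (assuming Avro 1.9.x's 
{{java.time}}-based {{TimeConversions}}; the class name below is made up for 
illustration): both {{TimestampMillisConversion}} and {{TimestampMicrosConversion}} 
report {{Instant}} as their converted type, so a lookup that knows the schema's 
logical type can pick the right one, while a class-only lookup, which is all 
reflective inference has to go on, cannot.

{code:java}
import java.time.Instant;

import org.apache.avro.Conversion;
import org.apache.avro.LogicalTypes;
import org.apache.avro.data.TimeConversions;
import org.apache.avro.reflect.ReflectData;

public class InstantConversionAmbiguity {
  public static void main(String[] args) {
    ReflectData model = new ReflectData();
    // Two conversions registered for the same Java type (java.time.Instant).
    model.addLogicalTypeConversion(new TimeConversions.TimestampMillisConversion());
    model.addLogicalTypeConversion(new TimeConversions.TimestampMicrosConversion());

    // Generic/specific case: the schema carries the logical type, so the
    // lookup is unambiguous.
    Conversion<Instant> byLogicalType =
        model.getConversionByClass(Instant.class, LogicalTypes.timestampMicros());
    System.out.println(byLogicalType.getLogicalTypeName()); // timestamp-micros

    // Reflect case: the schema is inferred from the Java type, so only the
    // class is available; the lookup just returns whichever of the two
    // registered conversions it happens to find first.
    Conversion<Instant> byClassOnly = model.getConversionByClass(Instant.class);
    System.out.println(byClassOnly.getLogicalTypeName()); // ambiguous
  }
}
{code}

The explicit alternative would presumably be to pin the schema on the reflect class 
itself, e.g. by annotating the {{Instant}} field with 
{{org.apache.avro.reflect.AvroSchema}} and the intended {{timestamp-micros}} schema, 
so that inference never has to guess.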

> Java maven plugin code generation doesn't add conversion for timestamp-micros
> -----------------------------------------------------------------------------
>
>                 Key: AVRO-2471
>                 URL: https://issues.apache.org/jira/browse/AVRO-2471
>             Project: Apache Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.9.0
>            Reporter: Marek Tracz
>            Priority: Major
>
> Field in the schema (note: no field anywhere in the schema uses the timestamp-millis logical type):
> {code:java}
> {
>   "name": "RECORDING_TIME",
>   "type": [
>     "null",
>     {
>       "type": "long",
>       "logicalType": "timestamp-micros"
>     }
>   ],
>   "default": null
> }
> {code}
> Maven plugin configuration:
> {code:xml}
> <plugin>
>   <groupId>org.apache.avro</groupId>
>   <artifactId>avro-maven-plugin</artifactId>
>   <version>1.9.0</version>
>   <executions>
>     <execution>
>       <goals>
>         <goal>schema</goal>
>       </goals>
>       <configuration>
>         <stringType>String</stringType>
>         <enableDecimalLogicalType>true</enableDecimalLogicalType>
>         <sourceDirectory>${project.basedir}/src/main/resources/</sourceDirectory>
>       </configuration>
>     </execution>
>   </executions>
> </plugin>
> {code}
> Part of the generated class:
> {code:java}
>   private static SpecificData MODEL$ = new SpecificData();
>   static {
>     MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.DateConversion());
>     MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.TimestampMillisConversion()); // <--- this should be TimestampMicrosConversion
>     MODEL$.addLogicalTypeConversion(new org.apache.avro.Conversions.DecimalConversion());
>   }
> {code}
> For example, this code:
> {code:java}
> Data data = Data.newBuilder()
>     .setRECORDINGTIME(Instant.now())
>     .build();
> {code}
> fails during serialization:
> {noformat}
> org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
> Caused by: org.apache.avro.AvroRuntimeException: Unknown datum type java.time.Instant: 2019-07-12T14:24:47.322Z
>     at org.apache.avro.generic.GenericData.getSchemaName(GenericData.java:887)
>     at org.apache.avro.specific.SpecificData.getSchemaName(SpecificData.java:420)
>     at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:850)
>     at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:249)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:142)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:98)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:195)
>     at org.apache.avro.specific.SpecificDatumWriter.writeRecord(SpecificDatumWriter.java:83)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:130)
>     at org.apache.avro.specific.SpecificDatumWriter.writeField(SpecificDatumWriter.java:98)
>     at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:195)
>     at org.apache.avro.specific.SpecificDatumWriter.writeRecord(SpecificDatumWriter.java:83)
>     at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:130)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:82)
>     at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:72)
>     at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:92)
>     at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53)
>     at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65)
>     at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55)
>     at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:841)
>     at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803)
>     at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:690)
> {noformat}
> When the registered conversion is manually changed to 
> *org.apache.avro.data.TimeConversions.TimestampMicrosConversion*, everything 
> works properly.
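> For reference, a minimal sketch of what the corrected static block would look like 
> (assuming only the registered conversion changes; the rest of the generated class 
> stays as-is):
> {code:java}
>   private static SpecificData MODEL$ = new SpecificData();
>   static {
>     MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.DateConversion());
>     MODEL$.addLogicalTypeConversion(new org.apache.avro.data.TimeConversions.TimestampMicrosConversion());
>     MODEL$.addLogicalTypeConversion(new org.apache.avro.Conversions.DecimalConversion());
>   }
> {code}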



