Hi Timo,

Thanks for your response!

I have defined my Avro schema in "toKafka.avsc" and generated my "toKafka.java"
file with:

# java -jar avro-tools-1.8.2.jar compile schema toKafka.avsc .

Then I import my AvroDeserializationSchema and the class generated from "toKafka.java":

import com.nybble.alpha.AvroDeserializationSchema;
import com.nybble.alpha.toKafka;
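
For reference, I wire them into my job roughly like this (a simplified sketch;
the topic name, the broker address and the constructor of my own
AvroDeserializationSchema class are placeholders/assumptions here):

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;

import com.nybble.alpha.AvroDeserializationSchema;
import com.nybble.alpha.toKafka;

public class ToKafkaJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder Kafka properties
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "toKafka-consumer");

        // Assumption: my AvroDeserializationSchema is constructed from the
        // Avro-generated specific record class.
        FlinkKafkaConsumer011<toKafka> consumer = new FlinkKafkaConsumer011<>(
                "toKafka-topic",                                        // placeholder topic
                new AvroDeserializationSchema<toKafka>(toKafka.class),
                props);

        DataStream<toKafka> stream = env.addSource(consumer);
        stream.print();

        env.execute("toKafka job");
    }
}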

Am I missing something?

Regards,
Sebastien.


2018-04-25 11:32 GMT+02:00 Timo Walther <twal...@apache.org>:

> Hi Sebastien,
>
> for me this seems to be more an Avro issue than a Flink issue. You can ignore
> the shaded exception; we shade the Google utilities to avoid dependency
> conflicts.
>
> The root cause is this:
>
> java.lang.NullPointerException
>     at org.apache.avro.specific.SpecificData.getSchema(SpecificData.java:227)
>
> And the corresponding lines look like this:
>
>   /** Find the schema for a Java type. */
>   public Schema getSchema(java.lang.reflect.Type type) {
>     try {
>       return schemaCache.get(type);
>     } catch (Exception e) {
>       throw (e instanceof AvroRuntimeException) ? // line 227
>           (AvroRuntimeException)e.getCause() : new AvroRuntimeException(e);
>     }
>   }
>
> So I guess your schema is missing.
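>
> A quick way to verify that (just a sketch) is to check whether the generated
> class actually exposes its schema, for example:
>
>   // Minimal check: getClassSchema() returns the SCHEMA$ constant that
>   // avro-tools code generation puts into the generated class.
>   public class SchemaCheck {
>     public static void main(String[] args) {
>       org.apache.avro.Schema schema = com.nybble.alpha.toKafka.getClassSchema();
>       System.out.println(schema.toString(true));
>
>       // This is the call that currently ends in the NullPointerException:
>       System.out.println(
>           org.apache.avro.specific.SpecificData.get().getSchema(
>               com.nybble.alpha.toKafka.class));
>     }
>   }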
>
> I hope this helps.
>
> Regards,
> Timo
>
> On 25.04.18 at 10:57, Lehuede sebastien wrote:
>
>> java.lang.NullPointerException
>>     at org.apache.avro.specific.SpecificData.getSchema
>>
>
>
>
