Re: Json.ObjectWriter - "Not the Json schema"

2016-02-24 Thread Prajwal Tuladhar
Can you paste your avro IDL schema?

On Wed, Feb 24, 2016 at 7:46 AM, tl  wrote:

> Hi again,
>
> I still haven’t found a solution to this problem. Does this look like some
> beginner’s Java mistake (because that may well be…)? Is it okay to ask the
> same question on Stack Overflow, or would that count as crossposting/spamming?
>
> Cheers,
> Thomas
>
>
> > On 23.02.2016, at 02:22, tl  wrote:
> >
> > Hi,
> >
> >
> > I want to convert incoming data to Avro and JSON (and later Parquet).
> Avro conversion is working okay, but JSON conversion throws the following
> error that I don’t understand:
> >
> > Exception in thread "main" java.lang.RuntimeException: Not the Json
> schema:
> {"type":"record","name":"Torperf","namespace":"converTor.torperf","fields":[{"name":"descriptor_type","type":"string","default":"torperf
> 1.0"},
> > [ … omitted for brevity …]
> >
> {"name":"circ_id","type":["null","int"],"doc":"metrics-lib/TorperfResult:
> int
> getCircId()"},{"name":"used_by","type":["null","int"],"doc":"metrics-lib/TorperfResult:
> int getUsedBy()"}],"aliases":["torperfResult"]}
> >   at org.apache.avro.data.Json$ObjectWriter.setSchema(Json.java:117)
> >   at converTor.WriterObject.<init>(WriterObject.java:116)
> >   at converTor.TypeWriter.get(TypeWriter.java:31)
> >   at converTor.ConverTor.main(ConverTor.java:249)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >   at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:606)
> >   at
> com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
> >
> > … but that schema is indeed the schema that I want to use.
> >
> >
> > This is a snippet of my code:
> >
> > File schemaFile = new File("schema/jsonSchema.avsc");
> > Schema.Parser parser = new Schema.Parser();
> > Schema mySchema = parser.parse(schemaFile) ;
> >
> > Json.ObjectWriter jsonDatumWriter = new Json.ObjectWriter();
> > jsonDatumWriter.setSchema(mySchema);
> > OutputStream out = new FileOutputStream(outputFile);
> > Encoder encoder = EncoderFactory.get().jsonEncoder(mySchema, out);
> >
> >
> > Can somebody give me a hint?
> >
> >
> > Thanks,
> > Thomas
>
>
>
>
>
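
The "Not the Json schema" error suggests that Json$ObjectWriter is hard-wired to
Avro's built-in JSON schema (it serializes arbitrary JSON values, not records of a
user-defined schema), so setSchema() rejects anything else. A GenericDatumWriter
paired with a jsonEncoder handles custom schemas instead. A minimal sketch, using a
hypothetical one-field schema in place of the real Torperf schema:

```java
import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

public class JsonWriteSketch {
    // Hypothetical stand-in for the real schema in schema/jsonSchema.avsc
    static final String SCHEMA = "{\"type\":\"record\",\"name\":\"Torperf\","
        + "\"fields\":[{\"name\":\"descriptor_type\",\"type\":\"string\"}]}";

    static String writeJson() throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA);
        GenericRecord rec = new GenericData.Record(schema);
        rec.put("descriptor_type", "torperf 1.0");

        // GenericDatumWriter accepts any schema, unlike Json.ObjectWriter
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Encoder encoder = EncoderFactory.get().jsonEncoder(schema, out);
        new GenericDatumWriter<GenericRecord>(schema).write(rec, encoder);
        encoder.flush();
        return out.toString("UTF-8");
    }

    public static void main(String[] args) throws Exception {
        System.out.println(writeJson());
    }
}
```

The same DatumWriter/Encoder pairing covers the Avro binary output as well, by
swapping jsonEncoder for binaryEncoder.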


-- 
Cheers,
Praj


Re: Avro schema doesn't honor backward compatibility

2016-02-01 Thread Prajwal Tuladhar
Hi,

I think the default for field "agentType" is invalid here: for a union type, the
default value must belong to the *first* branch of the union, so ["null","string"]
requires "default": null rather than a string.

When generating code from such a schema, avro-maven-plugin warns:

[INFO]
> [INFO] --- avro-maven-plugin:1.7.6-cdh5.4.4:schema (default) @ test-app ---
> [WARNING] Avro: Invalid default for field agentType: "APP_AGENT" not a
> ["null","string"]


Try:

{
>  "namespace": "xx..x.x",
>  "type": "record",
>  "name": "MyPayLoad",
>  "fields": [
>  {"name": "filed1",  "type": "string"},
>  {"name": "filed2", "type": "long"},
>  {"name": "filed3",  "type": "boolean"},
>  {
>   "name" : "metrics",
>   "type":
>   {
>  "type" : "array",
>  "items":
>  {
>  "name": "MyRecord",
>  "type": "record",
>  "fields" :
>  [
>{"name": "min", "type": "long"},
>{"name": "max", "type": "long"},
>{"name": "sum", "type": "long"},
>{"name": "count", "type": "long"}
>  ]
>  }
>   }
>  },
>  {"name": "agentType",  "type": ["null", "string"], "default": null}
>   ]
> }
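
One more thing worth noting: a valid default alone won't make the EOFException go
away, because the default is only applied during schema resolution, i.e. when the
reader is given both the writer's (old) schema and the reader's (new) schema.
`new SpecificDatumReader<>(MyPayLoad.class)` decodes the bytes with the new schema
only, which is what blows up inside BinaryDecoder on old data. A minimal sketch
with GenericDatumReader and two hypothetical cut-down schemas (the full schemas
resolve the same way):

```java
import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

public class ResolveSketch {
    // Cut-down stand-ins for the old and new MyPayLoad schemas
    static final String OLD = "{\"type\":\"record\",\"name\":\"MyPayLoad\",\"fields\":["
        + "{\"name\":\"filed1\",\"type\":\"string\"}]}";
    static final String NEW = "{\"type\":\"record\",\"name\":\"MyPayLoad\",\"fields\":["
        + "{\"name\":\"filed1\",\"type\":\"string\"},"
        + "{\"name\":\"agentType\",\"type\":[\"null\",\"string\"],\"default\":null}]}";

    static String resolve() throws Exception {
        Schema oldSchema = new Schema.Parser().parse(OLD);
        Schema newSchema = new Schema.Parser().parse(NEW);

        // Write a record with the OLD schema (no agentType field)
        GenericRecord rec = new GenericData.Record(oldSchema);
        rec.put("filed1", "hello");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Encoder enc = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(oldSchema).write(rec, enc);
        enc.flush();

        // Read it back with BOTH schemas: writer's first, reader's second.
        // This two-schema constructor is what makes the default take effect.
        Decoder dec = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord result =
            new GenericDatumReader<GenericRecord>(oldSchema, newSchema).read(null, dec);
        return result.get("filed1") + " / " + result.get("agentType");
    }

    public static void main(String[] args) throws Exception {
        System.out.println(resolve());
    }
}
```

SpecificDatumReader has the same two-schema constructor, so the parse code below
would need the old schema available alongside the generated MyPayLoad class.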





On Mon, Feb 1, 2016 at 8:31 PM, Raghvendra Singh wrote:

>
> I have this avro schema
>
> {
>  "namespace": "xx..x.x",
>  "type": "record",
>  "name": "MyPayLoad",
>  "fields": [
>  {"name": "filed1",  "type": "string"},
>  {"name": "filed2", "type": "long"},
>  {"name": "filed3",  "type": "boolean"},
>  {
>   "name" : "metrics",
>   "type":
>   {
>  "type" : "array",
>  "items":
>  {
>  "name": "MyRecord",
>  "type": "record",
>  "fields" :
>  [
>{"name": "min", "type": "long"},
>{"name": "max", "type": "long"},
>{"name": "sum", "type": "long"},
>{"name": "count", "type": "long"}
>  ]
>  }
>   }
>  }
>   ]}
>
> Here is the code which we use to parse the data
>
> public static final MyPayLoad parseBinaryPayload(byte[] payload) {
> DatumReader payloadReader = new 
> SpecificDatumReader<>(MyPayLoad.class);
> Decoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
> MyPayLoad myPayLoad = null;
> try {
> myPayLoad = payloadReader.read(null, decoder);
> } catch (IOException e) {
> logger.log(Level.SEVERE, e.getMessage(), e);
> }
>
> return myPayLoad;
> }
>
> Now I want to add one more field in the schema, so the schema looks like
> below:
>
>  {
>  "namespace": "xx..x.x",
>  "type": "record",
>  "name": "MyPayLoad",
>  "fields": [
>  {"name": "filed1",  "type": "string"},
>  {"name": "filed2", "type": "long"},
>  {"name": "filed3",  "type": "boolean"},
>  {
>   "name" : "metrics",
>   "type":
>   {
>  "type" : "array",
>  "items":
>  {
>  "name": "MyRecord",
>  "type": "record",
>  "fields" :
>  [
>{"name": "min", "type": "long"},
>{"name": "max", "type": "long"},
>{"name": "sum", "type": "long"},
>{"name": "count", "type": "long"}
>  ]
>  }
>   }
>  }
>  {"name": "agentType",  "type": ["null", "string"], "default": 
> "APP_AGENT"}
>   ]}
>
> Note the added field and the default defined for it. The problem is that
> if we receive data which was written using the older schema, I get this
> error:
>
> java.io.EOFException: null
> at org.apache.avro.io.BinaryDecoder.ensureBounds(BinaryDecoder.java:473) 
> ~[avro-1.7.4.jar:1.7.4]
> at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:128) 
> ~[avro-1.7.4.jar:1.7.4]
> at org.apache.avro.io.BinaryDecoder.readIndex(BinaryDecoder.java:423) 
> ~[avro-1.7.4.jar:1.7.4]
> at 
> org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:229) 
> ~[avro-1.7.4.jar:1.7.4]
> at org.apache.avro.io.parsing.Parser.advance(Parser.java:88) 
> ~[avro-1.7.4.jar:1.7.4]
> at 
> org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:206) 
> ~[avro-1.7.4.jar:1.7.4]
> at 
> org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152) 
> ~[avro-1.7.4.jar:1.7.4]
> at 
> org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:177)
>  ~[avro-1.7.4.jar:1.7.4]
> at 
> org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:148) 
>