So I've narrowed the issue down to the enum fields in the schema:
{
  "name": "action",
  "type": {
    "name": "ActionEnum",
    "type": "enum",
    "symbols": ["nextPage", "finish", "onethousandRegister", "userUpdate", "register"]
  }
},
If I create a test schema with just a few string and int fields, everything works fine.
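
To make the isolation concrete, a minimal reproduction would be something along these lines (a rough sketch using the Avro Java API from Scala; the record name, output path, and single-field schema are made up just for this test):

import java.io.File
import org.apache.avro.Schema
import org.apache.avro.file.DataFileWriter
import org.apache.avro.generic.{GenericData, GenericDatumWriter, GenericRecord}

// Made-up one-field schema: just the enum on its own.
val schemaJson = """
{
  "type": "record", "name": "EnumTest", "fields": [
    {"name": "action", "type": {"name": "ActionEnum", "type": "enum",
      "symbols": ["nextPage", "finish", "onethousandRegister", "userUpdate", "register"]}}
  ]
}
"""
val schema = new Schema.Parser().parse(schemaJson)

// One record whose enum value is a declared symbol.
val rec = new GenericData.Record(schema)
rec.put("action", new GenericData.EnumSymbol(schema.getField("action").schema(), "nextPage"))

// Write it out, then try loading just this file through spark-avro.
val writer = new DataFileWriter[GenericRecord](new GenericDatumWriter[GenericRecord](schema))
writer.create(schema, new File("/tmp/enum-test.avro"))  // arbitrary local path
writer.append(rec)
writer.close()

If SparkSQL chokes on that one file but is fine once the enum is swapped for a plain string, that would pin it on the enum handling.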
The problem is that Avro has its own representation for encoding unions in JSON, so your "experience" value would encode to {"int": 50} rather than plain 50.
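
To make that concrete, here is a small sketch against a made-up two-field schema (the "Person" and "experience" names are just for illustration) showing what Avro's own JSON decoder expects:

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory

// Made-up schema with a nullable int, just to show the union tagging.
val schema = new Schema.Parser().parse(
  """{"type": "record", "name": "Person", "fields": [
      {"name": "name", "type": "string"},
      {"name": "experience", "type": ["null", "int"]}
    ]}""")

val reader = new GenericDatumReader[GenericRecord](schema)

// Plain JSON like {"name": "Bob", "experience": 50} is rejected by Avro's
// JSON decoder; the union member has to be wrapped in its type name:
val avroJson = """{"name": "Bob", "experience": {"int": 50}}"""

val decoder = DecoderFactory.get().jsonDecoder(schema, avroJson)
val record = reader.read(null, decoder)
println(record)  // decodes cleanly once the value is tagged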
In a recent project we ended up writing a slightly modified JSON parser to be able to use Avro schemas with existing JSON REST calls.
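
The gist of it was a pre-processing step along these lines (a simplified sketch, not the actual code; it only handles records and simple nullable unions, and skips arrays, maps and proper type matching):

import com.fasterxml.jackson.databind.{JsonNode, ObjectMapper}
import com.fasterxml.jackson.databind.node.ObjectNode
import org.apache.avro.Schema
import scala.collection.JavaConverters._

val mapper = new ObjectMapper()

// Walk a record schema and wrap every non-null union value in {"<type>": value},
// so plain REST JSON becomes something Avro's JSON decoder will accept.
def tagUnions(schema: Schema, node: JsonNode): JsonNode = (schema.getType, node) match {
  case (Schema.Type.RECORD, obj: ObjectNode) =>
    schema.getFields.asScala.foreach { f =>
      if (obj.has(f.name)) obj.set(f.name, tagUnions(f.schema, obj.get(f.name)))
    }
    obj
  case (Schema.Type.UNION, _) if !node.isNull =>
    // Picks the first non-null branch; real code would match on the value's type.
    val branch = schema.getTypes.asScala.find(_.getType != Schema.Type.NULL).get
    val wrapped = mapper.createObjectNode()
    wrapped.set(branch.getFullName, tagUnions(branch, node))
    wrapped
  case _ => node
}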
2016-02-29 9:20 GMT+01:00 Chris Miller
Hi,
I have a small Hadoop cluster with Hive, SparkSQL, etc. set up on Amazon EMR.
I have Avro files stored on S3 that I want to be able to access from
SparkSQL. I have confirmed the files are valid and I am able to decode them
using avro-tools.
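
For reference, the kind of access I'm attempting looks roughly like this (the spark-avro version, table name and S3 path below are just placeholders):

// spark-shell with the spark-avro package on the classpath, e.g.
//   --packages com.databricks:spark-avro_2.10:2.0.1
val df = sqlContext.read
  .format("com.databricks.spark.avro")
  .load("s3://my-bucket/path/to/avro/")

df.printSchema()
df.registerTempTable("events")
sqlContext.sql("SELECT action, COUNT(*) FROM events GROUP BY action").show()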
Here's the full schema for reference:
Did you ever figure this out? I was having the same problem.
--
Chris Miller
On Fri, Feb 19, 2016 at 2:53 AM, Siva wrote:
> Can someone help with this? Has anyone faced a similar issue?
>
> Thanks,
> Sivakumar Bhavanari.
>
> On Wed, Feb 17, 2016 at 4:21 PM, Siva