You can retrieve JSON fields in Drill using the CONVERT_FROM function.
See the following doc:

https://drill.apache.org/docs/data-type-conversion/#convert_to-and-convert_from
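
As a rough sketch (assuming your JDBC storage plugin is named `postgres` and the jsonb column comes through the JDBC reader as text, as in your example below), the query could look something like this:

```sql
-- Hypothetical Drill query: parse the jsonb column (read as text via JDBC)
-- into a Drill map with CONVERT_FROM, then reach nested fields with dot
-- notation in an outer query. Note that `case` and `type` are reserved
-- words in Drill SQL, so they need backtick quoting.
SELECT j.pkey,
       j.parsed.loadCaseResult.`case`.`type` AS case_type,
       j.parsed.loadCaseResult.`case`.caseId AS case_id
FROM (
  SELECT pkey,
         CONVERT_FROM(data, 'JSON') AS parsed
  FROM postgres.mytable
) j;
```

This is untested against your schema; since the JSON objects are uneven, missing keys should simply come back as NULL rather than failing the query.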


On Wed, May 25, 2016 at 9:26 AM, Andrew Evans <andrew.ev...@hygenicsdata.com
> wrote:

> Drill Members,
>
> I have an intriguing problem: I have hundreds of thousands, or even
> millions, of records stored as jsonb in uneven JSON objects in a
> PostgreSQL database. I would like to be able to implicitly grab column
> names and data types using your tool, since neither PostgreSQL nor Pentaho
> has this function. My current tool for this task is Scala, but I am not
> dedicated to writing and keeping up with the JSON format specs, etc., in
> real time.
>
> Is it possible by conversion or otherwise to read the jsonb or a text
> string from PostgreSQL as if it were being queried from a "json" type
> storage instead of a "jdbc" type storage? If so, could I pull in different
> columns from PostgreSQL without some sort of key (with the original query)?
> Is it possible to do something like SELECT pkey,
> split_to_columns(convert_to(field, 'JSON')) FROM postgres.mytable?
>
> For context, I have posted an example record below:
>
> pkey       ,  data
> 1423234, "{"loadCaseResult": {"case": {"type": "Adoption (Family)",
> "judge": null, "style": "Confidential", "caseId": "12", "events": {"event":
> [{"date": "08/06/2014", "type": "Request: Action", "comment":
> "900125-Request for Action                \n Filed/Issued By"}]}}}}"
>
> Thank you for your time,
>
> Andrew Evans
> Java/Scala/Python Programmer at Hygenics Data, LLC
>
