Yeah, I was able to get json into the db by using strings.

Unfortunately, I have some escape characters in my strings, and it looks like I 
have to escape my escapes, which ends up requiring either a few text processors 
or a Groovy script.

To paraphrase the meme, "yo dawg I hear you like escapes with your escapes..."
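For what it's worth, the double-escaping is easy to reproduce outside NiFi. Here's a minimal Python sketch (the "attributes" field name is just borrowed from my original example below) of why the inner quotes have to be escaped once the JSON document rides along as a plain string field:

```python
import json

# The JSON document we want to land in the database column.
inner = '{"material": "plastic"}'

# Carrying it as a string field of an outer record forces the serializer
# to escape every quote inside it -- escapes of escapes.
wrapped = json.dumps({"attributes": inner})
print(wrapped)  # {"attributes": "{\"material\": \"plastic\"}"}

# Round-tripping recovers the original document unchanged.
assert json.loads(wrapped)["attributes"] == inner
```
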

But you are correct, I was able to make it happen. I was just hoping for 
something a little more record-oriented (or something).  I guess if it works, 
don't complain...

-----Original Message-----
From: Mike Thomsen [mailto:mikerthom...@gmail.com] 
Sent: Friday, December 16, 2022 10:59 AM
To: users@nifi.apache.org
Subject: [EXTERNAL] Re: json into a json-enabled DB

To Matt's point, I've tested inserts by sending a record field of String to a 
JSON/JSONB column in Postgres and MySQL, and that worked just fine.
I'm not sure if we're at a point where we can use a reader with one schema and 
a writer with another, but it should be pretty straightforward to fix; worst 
case, the flow is ConvertRecord -> PutDatabaseRecord.

On Thu, Dec 15, 2022 at 10:21 PM Matt Burgess <mattyb...@apache.org> wrote:
>
> Geoffrey,
>
> The biggest problem with JSON columns across the board is that the 
> JDBC and java.sql.Types specs don't handle them natively, and NiFi 
> records don't recognize JSON as a particular type; we are only 
> interested in the overall datatype such as String since NiFi records 
> can be in any supported format. In my experience these are handled by 
> setting the JSON column to type java.sql.OTHER (like PostgreSQL) and 
> they are willing to accept the value as a String (see NIFI-5901 [1]), 
> and we put in code to handle it as such (see NIFI-5845 [2]). For NiFi 
> it's been more of an ad-hoc type of support where maybe if the SQL 
> type is custom and unique we can handle such things (like sql_variant 
> in MSSQL via NIFI-5819 [3]), but due to the nature of the custom type 
> it's difficult to handle in any sort of consistent way. Happy to hear 
> your thoughts and input; perhaps we can add some ad-hoc support for 
> your use case?
>
> Regards,
> Matt
>
> [1] https://issues.apache.org/jira/browse/NIFI-5901
> [2] https://issues.apache.org/jira/browse/NIFI-5845
> [3] https://issues.apache.org/jira/browse/NIFI-5819
>
> On Wed, Dec 14, 2022 at 3:55 PM Greene (US), Geoffrey N 
> <geoffrey.n.gre...@boeing.com> wrote:
> >
> > Some databases (Postgres, SQL Server, and others) support native JSON columns.
> >
> > With Postgres there's a native jsonb type; with SQL Server it's a string 
> > type that you can treat as JSON.
> >
> >
> >
> > In any event, once you have the JSON in the database, you can then query 
> > it, e.g.:
> >
> >
> >
> > SELECT id, product_name,
> >        JSON_VALUE(attributes, '$.material') AS material
> > FROM jsontest;
> >
> >
> >
> > So, here’s my question:
> >
> >
> >
> > If you have a flow file that contains JSON, what's the best way to insert 
> > that into a database?
> >
> > The only thing I’ve thought of so far is if you have the json string
> >
> > {"material" : "plastic"}
> >
> > You then use a TEXT processor to turn that into
> >
> > {"attributes": "{\"material\" : \"plastic\"}"}
> >
> > And then use a PutDatabaseRecord to actually write the entry.
> >
> >
> >
> > Is there a better, or more efficient way to do it?
> >
