Hi Jens,
I have a flow that does exactly that work, and it also has to handle removal of
special characters; the only difference is that I use a stored proc so that I
can fix the content before storing it. The ReplaceText-->PutSQL pattern has
worked extremely well, even for large (50 MB) text chunks.
The syntax for calling the Postgres stored proc/function is similar to this
(I pass in many more metadata params); note the $1 text content:
select * from docs."DocTextSet" ('${original_filename_fixed}',
'${original_file_extension}', '$1');
The ReplaceText processor looks like this:
[screenshot attached: image001.png]
Within the stored proc, I use a variety of Replace and other functions to fix
the content and push a success/fail back to the flow to disposition the source
file.
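For anyone following along, here is a rough sketch of what such a function might look like. The name docs."DocTextSet" matches the call above, but the parameter names, the regexp_replace cleanup rule, and the target table/column (docs.documents, a jsonb content column) are illustrative assumptions, not Mike's actual code:

```sql
-- Illustrative sketch only: parameter names, the cleanup rule, and the
-- target table/column are assumptions, not the actual production code.
CREATE OR REPLACE FUNCTION docs."DocTextSet"(
    p_filename  text,
    p_extension text,
    p_content   text
) RETURNS text AS $$
DECLARE
    v_clean text;
BEGIN
    -- Strip control characters that commonly break JSON/JSONB storage
    v_clean := regexp_replace(p_content,
                              '[\x01-\x08\x0B\x0C\x0E-\x1F]', '', 'g');

    INSERT INTO docs.documents (filename, extension, content)
    VALUES (p_filename, p_extension, to_jsonb(v_clean));

    -- Success/fail string is pushed back to the flow to
    -- disposition the source file
    RETURN 'success';
EXCEPTION WHEN OTHERS THEN
    RETURN 'fail: ' || SQLERRM;
END;
$$ LANGUAGE plpgsql;
```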
Mike
From: Jens M. Kofoed
Sent: Wednesday, June 9, 2021 5:20 AM
To: users@nifi.apache.org
Subject: Need help to insert complete content of a file into JSONB field in
postgres
Dear community
I'm struggling with inserting files into a Postgres DB. The whole content of a
file has to be inserted into one field of one record, not each line as its own
record.
The PutSQL processor expects the content of an incoming FlowFile to be the SQL
command to execute, not data to add to the database.
I managed to use a ReplaceText processor to rewrite the content into:
INSERT INTO tablename (content)
VALUES ('$1')
where $1 is the whole content. But the content contains special characters and
the insert fails.
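The failure with special characters is typical of splicing raw text into the statement; binding the content as a statement parameter avoids the escaping problem entirely (NiFi's PutSQL can read parameters from the sql.args.N.value and sql.args.N.type FlowFile attributes). A minimal sketch of the principle, using Python's stdlib sqlite3 as a stand-in for Postgres:

```python
import sqlite3

# Content full of characters that break a naively spliced INSERT:
# quotes, backslashes, newlines.
content = "line one\nit's got 'quotes', backslashes \\ and \"doubles\""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (content TEXT)")

# Splicing the text into the statement breaks on the embedded quotes:
try:
    conn.execute(f"INSERT INTO docs (content) VALUES ('{content}')")
except sqlite3.OperationalError as e:
    print("spliced insert failed:", e)

# Binding it as a parameter stores the content verbatim:
conn.execute("INSERT INTO docs (content) VALUES (?)", (content,))
stored = conn.execute("SELECT content FROM docs").fetchone()[0]
assert stored == content
```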
Does anyone have advice on how to insert the complete content of a file into
one record/field?
kind regards
Jens