Hello,
We are working with NiFi 1.9.0 and have an issue with the web page rendering. I
have a process group whose configuration includes a DBCPConnectionPoolLookup,
and that DBCPConnectionPoolLookup is associated with 170+ DBCPConnectionPool
configurations on the same page, so the page cannot render in the browser.
What about ValidateCsv, could that do what you want?
Sent from my iPhone
> On Jan 6, 2020, at 6:10 PM, Shawn Weeks wrote:
>
> I’m poking around to see if I can make the csv parsers fail on a schema
> mismatch like that. A stream command would be a good option though.
>
> Thanks
> Shawn
Are you seeing the screwups happening consistently?
On Mon, Jan 6, 2020 at 6:10 PM Shawn Weeks wrote:
> I’m poking around to see if I can make the csv parsers fail on a schema
> mismatch like that. A stream command would be a good option though.
>
> Thanks
> Shawn
I’m poking around to see if I can make the csv parsers fail on a schema
mismatch like that. A stream command would be a good option though.
Thanks
Shawn
From: Mike Thomsen
Reply-To: "users@nifi.apache.org"
Date: Monday, January 6, 2020 at 4:35 PM
To: "users@nifi.apache.org"
Subject: Re:
We have a lot of the same issues where I work, and our solution is to use
ExecuteStreamCommand to pass CSVs off to Python scripts that read stdin line
by line and check whether the export is screwed up. Some of our sources are
good and we don't have to do that, but others are minefields.
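A rough sketch of that kind of check (not our actual scripts — the function name, expected column count, and exit convention here are illustrative). Wired into ExecuteStreamCommand, the script would read sys.stdin and exit non-zero when any record has the wrong width, which routes the flowfile to failure:

```python
import csv

def short_rows(lines, expected_fields, delimiter=","):
    """Return (record_number, field_count) for every CSV record
    whose field count differs from the expected width."""
    return [
        (recno, len(row))
        for recno, row in enumerate(csv.reader(lines, delimiter=delimiter), start=1)
        if len(row) != expected_fields
    ]

# In the ExecuteStreamCommand wrapper: read sys.stdin, print any offending
# record numbers to stderr, and sys.exit(1) if short_rows(...) is non-empty.
```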
That's the challenge: the values can be null, but I want to know when fields
are missing (i.e. not enough delimiters). I run into a common scenario where
line feeds end up in the data, making a short row. Currently the reader just
ignores the fact that there aren't enough delimiters and makes them null.
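To illustrate the short-row scenario with toy data (not a real export): a raw line feed inside an unquoted field turns one three-column record into two short rows, and a lenient reader would then pad the missing fields with null:

```python
# A three-column record whose middle value contains a raw line feed
# and was exported without quoting the field:
broken_export = "id1,some\ntext,2020-01-06"

# Naive line-by-line splitting now sees two short rows instead of one record:
rows = [line.split(",") for line in broken_export.split("\n")]
field_counts = [len(r) for r in rows]  # [2, 2] instead of [3]
```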
Shawn,
Your schema indicates that the fields are optional because of the
"type" : ["null", "string"] , so IIRC they won't be marked as invalid
because they are treated as null (I'm not sure there's a difference in
the code between missing and null fields).
You can try "type": "string" instead, to make the field required.
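Concretely, the difference between the nullable union and a required field looks something like this (the field name is just a placeholder):

```json
{ "name": "col1", "type": ["null", "string"] }
```

versus the required form, which should not silently accept a missing value:

```json
{ "name": "col1", "type": "string" }
```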
I’m trying to validate that a CSV file has the number of fields defined in its
Avro schema. Consider the following schema and CSVs. I would like to be able to
reject the invalid CSV as missing fields.
{
  "type" : "record",
  "namespace" : "nifi",
  "name" : "nifi",
  "fields" : [
    {
Alec,
How large is your provenance repository? How long did you wait before
attempting to query the repo? How powerful is the machine that you're running
on?
You are correct that NiFi must rebuild the Lucene indices before they can be
queried. This can take quite a while for a large
Hi,
We use the WriteAheadProvenanceRepository implementation:
nifi.provenance.repository.implementation=org.apache.nifi.provenance.WriteAheadProvenanceRepository
Thanks,
Alec
--
Sent from: http://apache-nifi-users-list.2361937.n4.nabble.com/
Thanks Pierre!
On Mon 6 Jan 2020, 17:06 Pierre Villard,
wrote:
> Hi Emanuel,
>
> The PR is currently under review, so it would not be included in NiFi
> 1.10.0 (which is already released). We recently discussed releasing a
> new NiFi version (1.10.1 or 1.11.0), and if the PR is merged
Hi Emanuel,
The PR is currently under review, so it would not be included in NiFi
1.10.0 (which is already released). We recently discussed releasing a
new NiFi version (1.10.1 or 1.11.0), and if the PR is merged before such a
release, it would certainly be included in that version.
Hope that helps!
Thanks Matt and Mark!
We are still on version 1.8.0 (tagged nifi-1.8.0-RC3, 10/22/2018 23:48:30
EDT); the current version is 1.10.
Out of curiosity, when could we expect this fix to be available? Would it mean
we upgrade to 1.10? Thanks.
Thanks/Regards,
Emanuel Oliveira