Github user trkurc commented on the pull request:
https://github.com/apache/nifi/pull/218#issuecomment-196112326
@apiri - I attached a patch to the ticket, which should be the last two
commits off my branch https://github.com/trkurc/nifi/commits/NIFI-1481
---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.
Github user apiri commented on the pull request:
https://github.com/apache/nifi/pull/218#issuecomment-196111289
A patch to apply on top of the PR is probably simplest, but I'm good with
whatever is easiest for you.
Github user trkurc commented on the pull request:
https://github.com/apache/nifi/pull/218#issuecomment-196104511
@apiri: how do you think it would be best to review the changes I made based
on your reviews? Another PR? A patch on the ticket?
Github user apiri commented on the pull request:
https://github.com/apache/nifi/pull/218#issuecomment-196102613
@trkurc I think it might be a fair concession to have both this and Windows
punted. Maybe we just roll in the check piggybacking off of the already
existing $cygwin and
Github user trkurc commented on the pull request:
https://github.com/apache/nifi/pull/218#issuecomment-196092752
@apiri - cygwin is proving to be a challenge, and not necessarily due to
the changes in this patch.
On my setup, the ':' separator on this line seems to break
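The separator problem described above is the classic Unix-versus-Windows classpath delimiter mismatch. A minimal sketch of the "piggyback off the existing $cygwin check" idea might look like the following; the variable names and jar paths are illustrative, not the actual nifi.sh code:

```shell
# Sketch only: choose the classpath separator based on a cygwin check,
# in the spirit of the $cygwin detection already present in the launch
# script. Names here are illustrative, not the real nifi.sh variables.
cygwin=false
case "$(uname)" in
    CYGWIN*) cygwin=true ;;
esac

if $cygwin; then
    CP_SEP=';'   # Windows JVMs expect semicolons between classpath entries
else
    CP_SEP=':'   # Unix-style JVMs use colons
fi

CLASSPATH="lib/core.jar${CP_SEP}lib/utils.jar"
echo "$CLASSPATH"
```

On a plain Unix shell this prints the colon-separated form; under cygwin the same script would emit semicolons, which is what a Windows JVM needs.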
Github user joewitt commented on the pull request:
https://github.com/apache/nifi/pull/272#issuecomment-196089886
I was reviewing this earlier today and frankly had a similar concern about
this as Adam. I didn't reply because I hadn't really figured out what to
think. First, I agree
Github user trkurc commented on the pull request:
https://github.com/apache/nifi/pull/218#issuecomment-196080641
After an out-of-band discussion with @markap14, it seems that Windows PIDs
might be challenging to get, so maybe we should leave the Windows env batch
script out for 0.6.0.
Github user taftster commented on the pull request:
https://github.com/apache/nifi/pull/272#issuecomment-196079414
I'm not entirely sure if this is a good idea. Any web service which
_disallows_ a standard HTTP header is arguably broken. Quoting RFC 2616:
> Any HTTP/1.1
I think it makes total sense that POST/PUT requests read from the flowfile
content. Therefore, the problem should be fixed further up in the flow
design. For example, try these solutions:
GenerateFlowFile -> ReplaceText -> InvokeHTTP (or)
GetFile -> InvokeHTTP
The problem you're describing
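As a loose illustration of the flow suggested above, with a temp file standing in for flowfile content and a placeholder endpoint, staging the POST body before the HTTP step could be sketched as:

```shell
# Illustrative sketch of GenerateFlowFile -> ReplaceText -> InvokeHTTP:
# a temp file stands in for the flowfile content, and the HTTP step simply
# reads that content as the request body. The endpoint is a placeholder.
body_file=$(mktemp)

: > "$body_file"                           # GenerateFlowFile: empty content
printf '{"name":"nifi"}' > "$body_file"    # ReplaceText: set the POST body

# InvokeHTTP (POST) would then send the content verbatim, e.g.:
#   curl -X POST --data-binary @"$body_file" https://example.invalid/api
cat "$body_file"
```

The point of the suggestion is exactly this staging: the request body lives in the flowfile content before InvokeHTTP runs, rather than being injected via a nonstandard header.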
Github user apiri commented on the pull request:
https://github.com/apache/nifi/pull/224#issuecomment-196043097
@mans2singh thanks for getting this updated, will start looking over it
tonight/tomorrow
Devin,
We do realize that we have some work to do to make it possible for
a single Processor to buffer hundreds of thousands of FlowFiles or more.
The SplitText processor is very popular and suffers from this exact same
problem.
We want to have a mechanism for swapping those out of the Java
Github user asfgit closed the pull request at:
https://github.com/apache/nifi/pull/274
Github user asfgit closed the pull request at:
https://github.com/apache/nifi/pull/268
Hi Thad,
Thank you very much for your advice. Kettle can certainly do the job,
but the metadata I was talking about is the metadata of the job descriptions
used by Kettle itself. The only option left for Kettle is multiple instances,
but that also means we would need to develop a master
Yan,
Pentaho Kettle (PDI) can certainly handle your needs as well. But using 10K
jobs to accomplish this is not the proper way to set up Pentaho. Also,
using MySQL to store the metadata is where you made a wrong choice.
PostgreSQL with data silos on SSD drives would be a better choice, while