Hi,

I am a NiFi n00b, although I have worked with Storm; that is currently how
we handle our data-flow logic.

I am evaluating NiFi for a logging/auditing use case (and might move it to
other uses if this works out) on a ~50-machine cluster. I am wondering
whether it would be a good fit to ingest messages from various sources and
emit a graph of how each message was handled at each stage, by each of the
services.

Anyway, I was wondering how we feed data into a NiFi cluster to kick off
the first stage. Also, how does data exit the system if I want it forwarded
to another service after NiFi is done with its processing (similar to
spouts and bolts in Apache Storm)?

Thanks.
--
κρισhναν
