Hello,

I think NiFi can do most of the things you are looking to do. Although NiFi
is not built to be a workflow orchestration tool, many of its features do
allow for the type of orchestration you are describing. I suggest taking a
look at all of the available processors [1] and reading their documentation
to see which ones make sense for your use-cases. There is also a repository
of templates [2] that shows many common examples.
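
As a rough sketch (processor names are from memory and may differ slightly
across NiFi versions, so please verify them against the docs), the pieces
you described could map to processors like this:

  Watch a folder for CSV files and land them in HDFS:
    GetFile (or ListFile -> FetchFile) -> PutHDFS

  Kick off a Sqoop job from within a flow:
    ExecuteProcess or ExecuteStreamCommand wrapping the sqoop command line

  Call an external API:
    InvokeHTTP

  Publish to a message bus like Kafka:
    PutKafka (PublishKafka in newer releases)

Each processor has its own scheduling (timer- or cron-driven), and data
provenance in the UI gives you a single place to see and monitor where the
data is going.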

Thanks,

Bryan

[1] https://nifi.apache.org/docs.html
[2] https://github.com/hortonworks-gallery/nifi-templates

On Sat, Aug 20, 2016 at 2:39 AM, <manikan...@pinnacleseven.com> wrote:

> Hi Bryan,
>
> Thanks for your email!!
>
> I need to set up NiFi as the main orchestrator for all data movement and
> transformation in my project. It would call Sqoop, call an API, call a
> message bus (like Kafka or whatever) and handle scheduling, or even
> integrate with an external scheduling program. It would be a single place
> where we can easily see everywhere data is going, schedule it and monitor
> it, and I think it will make management much easier. For example, I could
> drop a CSV file into a folder, have NiFi watch that folder, grab the file
> and put it in Hive or HDFS. If this is possible, could you give some
> suggestions on how such a data flow would work, to help me understand?
>
>
> Thanks & Regards,
>
> Manikandan Kolanjinathan
> Junior Software Engineer
> Pinnacle Seven Technologies
> 0422-4208736, 4506535
> www.pinnacleseven.com
> Delivering your business apps on cloud!
>
> ------------------------------
> From: Bryan Bende <bbe...@gmail.com>
> To: dev@nifi.apache.org; manikan...@pinnacleseven.com
> Sent: Thursday, 18 August 2016 8:35 PM
> Subject: Re: Apache NiFi Clarification
>
> Hello,
>
> NiFi can gather data from as many sources as needed. When data is brought
> into NiFi it is written to NiFi's internal repositories and stored there as
> it moves through the graph of processors, and it can then be delivered to
> any desired system.
>
> To bring data from a database into HDFS you would likely use the ExecuteSQL
> or QueryDatabaseTable processor to get the data out of the database, then
> some intermediary processors if you need to convert the data, and finally
> PutHDFS to write the data to HDFS.
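>
> As a rough sketch (exact processor and property names can vary between
> NiFi versions, so treat this as an assumption to verify against the docs),
> such a flow could look something like:
>
>   QueryDatabaseTable  (scheduled timer- or cron-driven, incremental fetch)
>     -> ConvertAvroToJSON   (optional, if you need a different format)
>     -> MergeContent        (optional, to batch up small results)
>     -> PutHDFS             (point Hadoop Configuration Resources at your
>                             core-site.xml / hdfs-site.xml)
>
> ExecuteSQL also produces Avro, so the same downstream processors apply.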
>
> Thanks,
>
> Bryan
>
> On Thu, Aug 18, 2016 at 6:17 AM, <manikan...@pinnacleseven.com> wrote:
>
> Hi,
>
> Good day!
>
> I saw your blogs and videos, which are pretty interesting, and I am new to
> Apache NiFi. I would like to know whether data can only be fetched into
> NiFi, or whether it can also be exported to other databases or APIs. I am
> working with NiFi in my organization and need your guidance on how a flow
> connects to SQL Server or other databases, and how I can create a data flow
> to move data from such a server into Hadoop. In my case it is a bit
> different: we need to bring data from multiple APIs into NiFi and transfer
> it via Sqoop to Hive. Is NiFi a good choice for this? Please give me some
> guidance on Apache NiFi and how such a flow would work.
>
> Thanks & Regards,
> Manikandan Kolanjinathan
> Junior Software Engineer
> Pinnacle Seven Technologies
> 0422-4208736, 4506535
> www.pinnacleseven.com
> Delivering your business apps on cloud!
>
