Another processor that may be of interest to you is the QueryDatabaseTable 
processor, which has just been released in 0.6.0. This provides incremental 
load capabilities similar to Sqoop's.
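As a rough sketch of what an incremental load looks like there (property names 
are from the 0.6.0 docs; the pool name, table, and column are made-up examples):

```
QueryDatabaseTable
  Database Connection Pooling Service : <your DBCPConnectionPool>
  Table Name                          : orders
  Maximum-value Columns               : order_id
```

The processor remembers the largest `order_id` it has seen and only fetches 
newer rows on each run, much like `sqoop import --incremental append`.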

If you’re looking for the schema-typing functionality, bear in mind that 
ExecuteSQL (and the new QueryDatabaseTable processor) preserve the schema by 
writing Avro.

Sqoop also allows import to HBase, which you can do with PutHBaseJSON (use the 
ConvertAvroToJSON processor to feed it).

Distributed partitioned queries aren’t in there yet, but I believe they are on 
the way, so Sqoop may have the edge for that use case today.

Granted, NiFi doesn’t have much by way of HCatalog integration at the moment, 
but most of the functionality you’ll find in Sqoop is in NiFi. Unless you are 
looking to move terabytes at a time, NiFi should be able to handle most of 
what you would use Sqoop for, so it would be very interesting to hear more 
detail on your use case and why you needed Sqoop on top of NiFi.

Simon

On 29 Mar 2016, at 09:06, prabhu Mahendran <prabhuu161...@gmail.com> wrote:

Hi,

Yes, in my case I have created a custom processor with the Sqoop API which 
accommodates the complete functionality of Sqoop.
As you point out, we are able to move data only from HDFS to SQL or vice 
versa, but Sqoop has more functionality, which we can achieve via 
Sqoop.runTool() in org.apache.sqoop.Sqoop. The Sqoop Java client works well on 
its own, but implementing that API inside a new Sqoop NiFi processor doesn't 
work!
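For anyone following along, the runTool() route boils down to building the 
same argument array the sqoop CLI would receive. A minimal sketch (the JDBC 
URL, table, and target directory below are made-up examples; actually invoking 
it needs the sqoop jar and a Hadoop Configuration on the classpath, so the 
call itself is left as a comment):

```java
import java.util.ArrayList;
import java.util.List;

public class SqoopArgsSketch {

    // Build the argument array for a basic table import, exactly as the
    // sqoop command line would receive it. Pass the result to
    // org.apache.sqoop.Sqoop.runTool(args, conf) when the sqoop jar is
    // on the classpath.
    static String[] importArgs(String jdbcUrl, String table, String targetDir) {
        List<String> args = new ArrayList<>();
        args.add("import");
        args.add("--connect");
        args.add(jdbcUrl);
        args.add("--table");
        args.add(table);
        args.add("--target-dir");
        args.add(targetDir);
        return args.toArray(new String[0]);
    }

    public static void main(String[] unused) {
        String[] args = importArgs("jdbc:mysql://dbhost:3306/sales",
                "orders", "/data/orders");
        // int exit = org.apache.sqoop.Sqoop.runTool(args,
        //         new org.apache.hadoop.conf.Configuration());
        System.out.println(String.join(" ", args));
    }
}
```

One caveat worth noting: runTool() calls into Hadoop's tool runner and assumes 
it owns the JVM's Hadoop configuration, which is part of why embedding it in a 
long-running NiFi processor is awkward.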

On Tue, Mar 29, 2016 at 12:49 PM, Conrad Crampton 
<conrad.cramp...@secdata.com> wrote:
Hi,
If you could explain exactly what you are trying to achieve, i.e. what part of 
the data pipeline you are looking to use NiFi for and where you wish to retain 
Sqoop, I could perhaps have a more informed input (although I have only been 
using NiFi myself for a few weeks). Sqoop obviously can move data from RDBMS 
systems through to HDFS (and vice versa), as can NiFi, so I'm not sure why you 
would want the mix (or at least I can’t see it from the description you have 
provided thus far).
I have limited knowledge of Sqoop, but either way, I am sure you could ‘drive’ 
Sqoop from a custom NiFi processor if you so choose, and you can ‘drive’ NiFi 
externally (using the REST API) - if Sqoop can consume it.
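To give a flavour of driving NiFi externally, here is a sketch that composes 
(but does not send) the REST call that flips a processor to RUNNING. The 
endpoint path shown is from later NiFi releases and the 0.x layout differs, so 
check the /nifi-api docs for your version; the processor id, client id, and 
revision are placeholders:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class NiFiRestSketch {

    // Build the PUT request that asks NiFi to start a processor.
    // Path and body shape are from later NiFi versions (an assumption here);
    // to actually send it, use
    // HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString()).
    static HttpRequest startProcessorRequest(String baseUrl, String processorId,
                                             String clientId, int version) {
        String body = "{\"revision\":{\"clientId\":\"" + clientId
                + "\",\"version\":" + version + "},\"state\":\"RUNNING\"}";
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/nifi-api/processors/"
                        + processorId + "/run-status"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = startProcessorRequest("http://localhost:8080",
                "1234-abcd", "sqoop-client", 0);
        System.out.println(req.method() + " " + req.uri());
    }
}
```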
Regards
Conrad


From: prabhu Mahendran <prabhuu161...@gmail.com>
Reply-To: "users@nifi.apache.org" <users@nifi.apache.org>
Date: Tuesday, 29 March 2016 at 07:55
To: "users@nifi.apache.org" <users@nifi.apache.org>
Subject: Re: Sqoop Support in NIFI

Hi Conrad,

Thanks for Quick Response.

Yeah, the combination of ExecuteSQL and PutHDFS works well instead of Sqoop. 
But is it possible to use the Sqoop client to do something like this?

Prabhu Mahendran

On Tue, Mar 29, 2016 at 12:04 PM, Conrad Crampton 
<conrad.cramp...@secdata.com> wrote:
Hi,
Why use Sqoop at all? Use a combination of ExecuteSQL [1] and PutHDFS [2].
I have just replaced the use of Flume with a combination of ListenSyslog and 
PutHDFS, which I guess is a similar architectural pattern.
HTH
Conrad


[1] http://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.ExecuteSQL/index.html
[2] http://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.hadoop.PutHDFS/index.html
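As a rough picture of the two-processor flow described above (property names 
are from the docs linked; all the values are placeholder examples):

```
ExecuteSQL       Database Connection Pooling Service : <your DBCPConnectionPool>
                 SQL select query : SELECT * FROM orders
      |
      v   (FlowFiles containing Avro records)
PutHDFS          Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml
                 Directory : /data/orders
```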

From: prabhu Mahendran <prabhuu161...@gmail.com>
Reply-To: "users@nifi.apache.org" <users@nifi.apache.org>
Date: Tuesday, 29 March 2016 at 07:27
To: "users@nifi.apache.org" <users@nifi.apache.org>
Subject: Sqoop Support in NIFI

Hi,

I am new to nifi.

       I would like to know: is there any support for Sqoop with the help of 
NiFi processors?

And how can the following case be done with the help of Sqoop:

    Move data from Oracle, SQL Server, and MySQL into HDFS and vice versa.


Thanks,
Prabhu Mahendran




