Thanks Abe. This clarifies all my doubts. Now I have sufficient information :)
On Mon, May 11, 2015 at 1:46 PM, Abraham Elmahrek <[email protected]> wrote:

> It should. Those FIFO files are instructed to delete on exit, so when the
> Sqoop CLI finishes, it should clean up after itself.
>
> -Abe
>
> On Sat, May 9, 2015 at 2:31 AM, Suraj Nayak <[email protected]> wrote:
>
>> Thanks Abraham.
>>
>> So, once a direct Sqoop import completes, the external table (which is
>> created as a FIFO file) is dropped automatically?
>>
>> On Fri, May 8, 2015 at 5:30 PM, Abraham Elmahrek <[email protected]> wrote:
>>
>>> You should be able to use a normal table with the --direct option.
>>>
>>> The external table is created as a FIFO file. The file is then put
>>> directly into Hadoop. This is much faster than transferring on a
>>> record-by-record basis.
>>>
>>> On Fri, May 8, 2015 at 1:51 PM, Suraj Nayak <[email protected]> wrote:
>>>
>>>> Thanks Venkat,
>>>>
>>>> Two more questions (as I am not aware of the internals, I am
>>>> interested to know the process):
>>>>
>>>> - Just to reconfirm: if the table in Netezza is not created as an
>>>>   external table, can we still use the --direct option? Will Sqoop
>>>>   try to create a replica of the table as an external table for
>>>>   transferring data in direct mode?
>>>> - Is this external table, which Sqoop creates with the --direct
>>>>   option, used only to store meta information, or is all the table
>>>>   data copied into the external table?
>>>>
>>>> As my user ID does not have write access, I am using non-direct mode
>>>> to import data for now. There are no issues getting the data from
>>>> Netezza into HDFS in this case.
>>>>
>>>> Thanks!
>>>>
>>>> On Fri, May 8, 2015 at 11:40 AM, Venkat Ranganathan
>>>> <[email protected]> wrote:
>>>>
>>>>> Yes. --direct mode says that Sqoop should use an external table to
>>>>> do the import.
>>>>> If you don't want to create external tables, import without
>>>>> --direct.
>>>>>
>>>>> See the Sqoop user guide section on Netezza support.
>>>>>
>>>>> Thanks
>>>>>
>>>>> Venkat
>>>>>
>>>>> On 5/6/15, 5:52 AM, "Suraj Nayak" <[email protected]> wrote:
>>>>>
>>>>>> Hi Sqoop Users and Developers,
>>>>>>
>>>>>> I am trying to run a Sqoop import in direct mode against Netezza
>>>>>> 7.2, using Sqoop 1.4.4 (HDP-2.1.2.0).
>>>>>>
>>>>>> export HADOOP_CLASSPATH=/path/to/nzjdbc.jar
>>>>>>
>>>>>> Sqoop command:
>>>>>>
>>>>>> sqoop import -D mapreduce.job.queuename=some_queue \
>>>>>>   --connect jdbc:netezza://some.server.com:5480/DB1 \
>>>>>>   --username abcd --password xxxxxx --table TABLE_1 \
>>>>>>   --target-dir /tmp/NETEZZA_DIRECT_IMPORT --direct -m 1
>>>>>>
>>>>>> The job never fails, but it runs endlessly. When I looked into the
>>>>>> logs of the running map task in the Resource Manager, below is
>>>>>> what I found:
>>>>>>
>>>>>> 2015-05-06 07:21:48,159 ERROR [Thread-15]
>>>>>> org.apache.sqoop.mapreduce.db.netezza.NetezzaJDBCStatementRunner:
>>>>>> Unable to execute external table export
>>>>>> org.netezza.error.NzSQLException: ERROR: CREATE EXTERNAL TABLE:
>>>>>> permission denied.
>>>>>>
>>>>>>   at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:279)
>>>>>>   at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:73)
>>>>>>   at org.netezza.sql.NzConnection.execute(NzConnection.java:2715)
>>>>>>   at org.netezza.sql.NzStatement._execute(NzStatement.java:849)
>>>>>>   at org.netezza.sql.NzPreparedStatament.execute(NzPreparedStatament.java:152)
>>>>>>   at org.apache.sqoop.mapreduce.db.netezza.NetezzaJDBCStatementRunner.run(NetezzaJDBCStatementRunner.java:75)
>>>>>>
>>>>>> Why is the user trying to create an external table? Is this how
>>>>>> direct import works?
>>>>>> So should the user running the direct import have CREATE EXTERNAL
>>>>>> TABLE access in the Netezza DB?
>>>>>>
>>>>>> Any pointers and suggestions are appreciated!
>>>>>>
>>>>>> --
>>>>>> Thanks
>>>>>> Suraj Nayak M

--
Thanks
Suraj Nayak M
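[Editor's note] Two possible ways out of the "CREATE EXTERNAL TABLE: permission denied" error discussed in this thread, sketched as shell commands. These are assumptions, not verified fixes: the `nzsql` invocation, admin account, and GRANT syntax should be checked against your Netezza version's documentation, and the user/database names (`abcd`, `DB1`, `TABLE_1`) are the placeholders from the thread.

```shell
# Option 1 (requires a Netezza admin): grant the privilege the --direct
# connector needs to create its external (FIFO-backed) table.
nzsql -d DB1 -u admin -c "GRANT CREATE EXTERNAL TABLE TO abcd;"

# Option 2: fall back to a plain JDBC import by dropping --direct,
# as Venkat suggests above. Slower, but needs no extra privilege.
sqoop import -D mapreduce.job.queuename=some_queue \
  --connect jdbc:netezza://some.server.com:5480/DB1 \
  --username abcd --password xxxxxx --table TABLE_1 \
  --target-dir /tmp/NETEZZA_DIRECT_IMPORT -m 1
```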

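[Editor's note] The "delete on exit" behaviour Abe describes for the connector's FIFO files can be illustrated with a small, self-contained sketch. This is not Sqoop's actual implementation (the Netezza connector is Java); it is a hypothetical Python analogue of the pattern: create a named pipe, register a cleanup hook, and the pipe disappears when the process finishes. Requires a POSIX system for `os.mkfifo`.

```python
import atexit
import os
import tempfile

def cleanup(path):
    """Remove the FIFO if it still exists (safe to call more than once)."""
    if os.path.exists(path):
        os.remove(path)

def make_fifo(path):
    """Create a named pipe and arrange for it to be deleted on process exit."""
    os.mkfifo(path)              # writer and reader rendezvous on this path
    atexit.register(cleanup, path)
    return path

workdir = tempfile.mkdtemp()
fifo = make_fifo(os.path.join(workdir, "sqoop-demo.fifo"))
print(os.path.exists(fifo))      # prints True: the pipe exists while the "job" runs
cleanup(fifo)                    # atexit would do this automatically at interpreter exit
print(os.path.exists(fifo))      # prints False: the pipe cleaned up after itself
```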