Thanks a lot for the reply. I tried with the -skipTrash option, but the sqoop import output still ends up in Trash. What's happening is: even when I delete the dir "euc_inform_temp" with -skipTrash and then run the sqoop import, after the import I can't see the dir "euc_inform_temp" in HDFS, but I can see it in Trash (HDFS).
Any idea?

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:[email protected]]
Sent: Monday, May 05, 2014 3:09 AM
To: [email protected]
Subject: Re: Sqoop shell script data goes to .trash

> hadoop fs -rm -r -f ./euc_inform_temp

I believe that this command will move the directory into trash by default unless you use the -skipTrash option.

Jarcec

On Mon, Apr 21, 2014 at 03:35:06PM +0000, Sandipan.Ghosh wrote:
> Hi,
>
> I have a shell script to run a sqoop import:
>
> #!/bin/bash
> echo "doing cleanup"
> hadoop fs -rm -r -f ./euc_inform_temp
> echo "done cleanup"
> sqoop --options-file /home_dir/z070061/sqoop_import_param_td.txt
> --fields-terminated-by ',' --query "select * from INOVBIDT.PROD_ACCTDATE
> where 1=1 and \$CONDITIONS" -m 1 --target-dir ./euc_inform_temp
> echo "sqoop export done"
>
> Once I run it, the data extracted by sqoop ends up in .Trash.
>
> This is very strange. Does anyone know how to resolve this?
>
> Thanks
> Sandipan
