> hadoop fs -rm -r -f ./euc_inform_temp

I believe this command moves the directory into the trash by default 
unless you use the -skipTrash option.
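
For example, the cleanup step in your script could bypass the trash entirely 
(a minimal sketch based on the command you quoted above):

  # delete the directory permanently instead of moving it to .Trash
  hadoop fs -rm -r -f -skipTrash ./euc_inform_temp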

Jarcec

On Mon, Apr 21, 2014 at 03:35:06PM +0000, Sandipan.Ghosh wrote:
> Hi,
> 
> I have a shell script that runs a Sqoop job:
> 
> #!/bin/bash
> echo "doing cleanup"
> hadoop fs -rm -r -f ./euc_inform_temp
> echo "done cleanup"
> sqoop --options-file /home_dir/z070061/sqoop_import_param_td.txt 
> --fields-terminated-by ',' --query "select * from INOVBIDT.PROD_ACCTDATE 
> where 1=1 and \$CONDITIONS" -m 1 --target-dir ./euc_inform_temp
> echo " sqoop export done"
> 
> Once I run it, the data extracted by Sqoop ends up in .Trash.
> 
> This is very strange. Does anyone know how to resolve this?
> 
> Thanks
> Sandipan
