HDFS uses three slashes:

hdfs:///someDirectory
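So if the default filesystem on the machine running sqoop already points at your NameNode, something like this should land on HDFS (a sketch based on your command below; adjust the path as needed):

  sqoop job --verbose --create job_import_0 -- import \
    --connect jdbc:mysql://db.mysql.com:3306/DB \
    --table TABLE_TEST \
    --target-dir hdfs:///db_import \
    --username xxx --password xxx \
    --incremental append --check-column id --last-value 1

The "expected: file:///" part of your error usually means the Hadoop config on the machine running sqoop still defaults to the local filesystem. Assuming a Hadoop 1.x-style setup (on newer releases the property is fs.defaultFS), core-site.xml on that machine should contain something like:

  <property>
    <name>fs.default.name</name>
    <value>hdfs://my.hdfs.com:54310</value>
  </property>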



On Jul 2, 2013, at 5:03 AM, "corbacho anthony" <[email protected]> wrote:

Hi!

I am trying to create a Sqoop job with the incremental option.
I want to save it into my HDFS, so I use the --target-dir option,
but Sqoop throws me an error: tool.ImportTool: Imported Failed: Wrong FS:
hdfs://my.hdfs.com:54310/job_import_incrt, expected: file:///

My sqoop job:
sqoop job --verbose --create job_import_0 -- import --connect
jdbc:mysql://db.mysql.com:3306/DB --table TABLE_TEST
--target-dir hdfs://my.hdfs.com:54310/db_import
--username xxx --password xxx --incremental append --check-column id
--last-value 1

I run Sqoop on machine A, the sqoop-metastore on machine B, and my HDFS
on machine C.

What should I do to force Sqoop to save it into my HDFS and not on my local
machine?

PS: If I change --target-dir to a local directory, it works like a charm.

Thank you
Anthony
