Hi.

If I run this Sqoop import as a plain import (no saved job) and remove the
incremental options, I can use --target-dir with an hdfs:// path.

In fact, I notice that whenever I leave out the incremental options, I can
use --target-dir with hdfs:// (saved job or plain import).
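
For reference, here is the contrast I mean (a sketch using the same connect
string, table, and credentials as in my original mail below):

    # works: plain import, no incremental, HDFS target-dir
    sqoop import --connect jdbc:mysql://db.mysql.com:3306/DB \
      --table TABLE_TEST --username xxx --password xxx \
      --target-dir hdfs://my.hdfs.com:54310/db_import

    # fails with "Wrong FS ... expected: file:///": saved job with incremental
    sqoop job --create job_import_0 -- import \
      --connect jdbc:mysql://db.mysql.com:3306/DB \
      --table TABLE_TEST --username xxx --password xxx \
      --target-dir hdfs://my.hdfs.com:54310/db_import \
      --incremental append --check-column id --last-value 1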
On Jul 2, 2013 11:47 PM, "Jarek Jarcec Cecho" <[email protected]> wrote:

> Hi sir,
> Sqoop requires the Hadoop configuration files to be available on the machine
> where you run Sqoop. I'm wondering if the config files from machine "C" (the
> HDFS gateway, I suppose) are also available on machine "A" where Sqoop is
> running.
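>
> A quick way to check (a minimal sketch; HADOOP_CONF_DIR and the config
> paths are assumptions, adjust them to your layout) is to make sure the
> client config from machine "C" is on machine "A", and that the default
> filesystem it names is the namenode rather than file:///:
>
>     # on machine A: copy the Hadoop client config from machine C (hypothetical paths)
>     scp -r user@machine-c:/etc/hadoop/conf /etc/hadoop/conf
>     export HADOOP_CONF_DIR=/etc/hadoop/conf
>
>     # the default FS Sqoop will see; should print hdfs://my.hdfs.com:54310,
>     # not file:/// (fs.default.name here; newer Hadoop uses fs.defaultFS)
>     grep -A1 fs.default.name $HADOOP_CONF_DIR/core-site.xml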
>
> Jarcec
>
> On Tue, Jul 02, 2013 at 07:02:41PM +0900, corbacho anthony wrote:
> > Hi!
> >
> > I am trying to create a sqoop job with the incremental option.
> > I want it to save into my HDFS, so I use the option --target-dir,
> > but Sqoop throws an error: tool.ImportTool: Imported Failed: Wrong FS:
> > hdfs://my.hdfs.com:54310/job_import_incrt, expected: file:///
> >
> > My sqoop job:
> > sqoop job --verbose --create job_import_0 -- import \
> >   --connect jdbc:mysql://db.mysql.com:3306/DB \
> >   --table TABLE_TEST \
> >   --target-dir hdfs://my.hdfs.com:54310/db_import \
> >   --username xxx --password xxx \
> >   --incremental append --check-column id --last-value 1
> >
> > I run Sqoop on machine A, the sqoop-metastore on machine B, and my
> > HDFS on machine C.
> >
> > What should I do to force Sqoop to save into my HDFS instead of onto
> > my local machine?
> >
> > PS: If I change --target-dir to a local directory, it works like a
> > charm.
> >
> > Thank you
> > Anthony
>
