I'm wondering whether you are actually calling the right binary. I would assume that after downloading and unpacking the tarball you would call ./bin/sqoop (a path relative to the unpacked tarball), rather than a system-wide installation reached by running just "sqoop" (no path specified).
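The difference can be sketched as follows. This is a minimal demonstration with a stand-in launcher script (all paths here are hypothetical, not the poster's actual layout): a relative invocation runs the unpacked copy regardless of PATH, while a bare "sqoop" is resolved through PATH and may find an older install.

```shell
# Sketch: why "./bin/sqoop" and bare "sqoop" can run different binaries.
set -e
demo=$(mktemp -d)
mkdir -p "$demo/sqoop-1.4.4.bin__hadoop-1.0.0/bin"
# Stand-in for the freshly unpacked launcher script:
printf '#!/bin/sh\necho "tarball sqoop"\n' \
  > "$demo/sqoop-1.4.4.bin__hadoop-1.0.0/bin/sqoop"
chmod +x "$demo/sqoop-1.4.4.bin__hadoop-1.0.0/bin/sqoop"
# The relative invocation always executes the unpacked copy:
(cd "$demo/sqoop-1.4.4.bin__hadoop-1.0.0" && ./bin/sqoop)
# A bare "sqoop" would instead be resolved through PATH;
# "command -v sqoop" shows which binary would win.
```

"command -v sqoop" is a quick way to confirm which installation a bare invocation would pick up.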
Jarcec

On Wed, Mar 26, 2014 at 02:33:53PM +0530, chandra kant wrote:
> hi,
> I re-downloaded "sqoop-1.4.4.bin__hadoop-1.0.0.tar.gz" and ran the command
> "sqoop import --connect jdbc:mysql://localhost/checking --table testtble -m 1". Here is the output:
>
> 14/03/26 08:57:45 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
> 14/03/26 08:57:45 INFO tool.CodeGenTool: Beginning code generation
> 14/03/26 08:57:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `testtble` AS t LIMIT 1
> 14/03/26 08:57:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `testtble` AS t LIMIT 1
> 14/03/26 08:57:46 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/ckanth/files/hadoop-1.2.1
> Note: /tmp/sqoop-ckanth/compile/4c722c797618961f802c8b71bffd0087/testtble.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 14/03/26 08:57:49 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-ckanth/compile/4c722c797618961f802c8b71bffd0087/testtble.jar
> 14/03/26 08:57:49 WARN manager.MySQLManager: It looks like you are importing from mysql.
> 14/03/26 08:57:49 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
> 14/03/26 08:57:49 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
> 14/03/26 08:57:49 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
> 14/03/26 08:57:49 INFO mapreduce.ImportJobBase: Beginning import of testtble
> 14/03/26 08:57:50 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/tmp/hadoop-ckanth/mapred/staging/ckanth/.staging/job_201403211217_0008
> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected
>         at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:53)
>         at com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
>         at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:121)
>         at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1054)
>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1071)
>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
>         at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:239)
>         at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
>         at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>
>
> On Tue, Mar 25, 2014 at 8:53 PM, Jarek Jarcec Cecho <[email protected]> wrote:
> > Hi Chandra,
> > for Hadoop 1.2.1 you should use the archive sqoop-1.4.4.bin__hadoop-1.0.0.tar.gz.
> >
> > Would you mind re-downloading the archive, running Sqoop, and sharing the entire output generated with the parameter --verbose?
> >
> > Jarcec
> >
> > On Tue, Mar 25, 2014 at 07:39:55PM +0530, chandra kant wrote:
> > > Hi,
> > > I am using hadoop-1.2.1. I tried sqoop-1.4.4.bin__hadoop-1.0.0, sqoop-1.4.4.bin__hadoop-0.23, and sqoop-1.4.4.bin__hadoop-0.20 for importing a MySQL database, but none of them worked.
> > > They all show the error "Exception in thread "main" java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.JobContext, but interface was expected", which I read is a version-mismatch issue.
> > > Can you please suggest which Sqoop version to use with hadoop-1.2.1?
> > >
> > > --
> > > Chandra kant
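For readers hitting the same trace: the IncompatibleClassChangeError occurs because org.apache.hadoop.mapreduce.JobContext is a concrete class in Hadoop 1.x but an interface in Hadoop 2.x, so a Sqoop jar compiled against one major line fails at runtime on the other. A rough sanity check can be sketched as below; both version strings are assumptions you would fill in yourself, from the Sqoop tarball name and from the first line of "hadoop version" output.

```shell
# Sketch of a major-version sanity check (values are placeholders).
sqoop_built_for="1.0.0"   # from the name sqoop-1.4.4.bin__hadoop-1.0.0.tar.gz
cluster_hadoop="1.2.1"    # from: hadoop version | head -1
# Compare only the major version component of each string:
if [ "${sqoop_built_for%%.*}" = "${cluster_hadoop%%.*}" ]; then
  echo "Hadoop major lines match"
else
  echo "Hadoop major lines differ: expect IncompatibleClassChangeError"
fi
```

A matching major line does not guarantee compatibility, but a mismatch is the usual cause of this particular exception.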
