Hey.

Turns out I ran into this Hadoop issue: 
https://issues.apache.org/jira/browse/MAPREDUCE-6289

After applying the workaround mentioned in the comments on the issue (changes to 
the mapred-site.xml and yarn-site.xml files), I no longer get the error.
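
In case it helps anyone else hitting this, I won't repeat the JIRA comments here, 
but for illustration only, a minimal single-node YARN configuration (standard 
Hadoop property names; the hostname value is just a placeholder for your own 
ResourceManager) looks something like:

  <!-- mapred-site.xml: submit MapReduce jobs to YARN -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>

  <!-- yarn-site.xml -->
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>your.resourcemanager.host</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>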


Paavo


From: Parkkinen Paavo [mailto:[email protected]]
Sent: Tuesday, June 16, 2015 11:33 AM
To: [email protected]
Subject: Trying to Run Job: MySQL Connector JAR File Does Not Exist

Hi.


I'm trying to set up Sqoop to move data from a MySQL database to HDFS, and I'm 
running into some problems.

I have two servers: one hosts the MySQL database, the other runs a single-node 
HDFS setup. MySQL works. HDFS works.

I downloaded Sqoop 1.99.6 and set up the server. I configured catalina.properties 
to point to my Hadoop jars, and sqoop.properties to point to my Hadoop 
configuration directory. Finally, I copied the MySQL connector jar into the lib 
directory inside the Sqoop install directory.
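
For concreteness, the two edits were along these lines (the Hadoop paths are just 
examples for my layout, adjust for yours):

  # catalina.properties: append the Hadoop jar directories to common.loader
  common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,\
  /usr/lib/hadoop/*.jar,/usr/lib/hadoop/lib/*.jar,\
  /usr/lib/hadoop-hdfs/*.jar,/usr/lib/hadoop-mapreduce/*.jar,/usr/lib/hadoop-yarn/*.jar

  # sqoop.properties: point Sqoop at the Hadoop configuration directory
  org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/etc/hadoop/conf/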

After that I started the server. Nothing obviously wrong in the logs.

Next I started the shell on the same server. I followed the 5 minute guide to 
set the server connection, create the links, and create a job. When I ran the 
job, I got the following error:

2015-06-15 12:50:42 JST: FAILURE_ON_SUBMIT
Exception: java.io.FileNotFoundException: File does not exist: 
hdfs://xxx.xxx.xxx.xxx:9000/path_to_sqoop_install/server/webapps/sqoop/WEB-INF/lib/mysql-connector-java-5.1.23.jar

I grabbed the latest sqoop2 branch from GitHub and built it to see if the issue 
had already been fixed there. Instead, I now get the following:

2015-06-15 16:11:36 JST: FAILURE_ON_SUBMIT
Exception: java.io.FileNotFoundException: File does not exist: 
hdfs://xxx.xxx.xxx.xxx:9000/path_to_sqoop_install/server/webapps/sqoop/WEB-INF/lib/sqoop-common-2.0.0-SNAPSHOT.jar


Some people online suggest copying the jar files into HDFS manually, but I 
haven't done that: it's not mentioned in the docs, and the fact that the error 
points to my local Sqoop install path under an hdfs:// URI seems suspect to me.
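
If I were to try it, I assume it would be something along the lines of the 
following, mirroring the path the error message expects (paths as in my install):

  hdfs dfs -mkdir -p /path_to_sqoop_install/server/webapps/sqoop/WEB-INF/lib/
  hdfs dfs -put /path_to_sqoop_install/server/webapps/sqoop/WEB-INF/lib/mysql-connector-java-5.1.23.jar \
      /path_to_sqoop_install/server/webapps/sqoop/WEB-INF/lib/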


Should I just place the files in HDFS (if so, the documentation should probably 
be updated to mention it), or is there another way to fix this?


Thanks,

Paavo
