Not sure. It could be a YARN configuration thing. Sqoop shouldn't be explicitly creating directories there unless you're running as the yarn user.
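If the launcher really is running as yarn, the missing HDFS home directory can be created by the HDFS superuser along these lines (a rough sketch; the group name here is an assumption and may differ on your cluster):

    # run as the HDFS superuser (often 'hdfs')
    hdfs dfs -mkdir -p /user/yarn
    # hand the directory to the yarn user; the 'hadoop' group is assumed
    hdfs dfs -chown yarn:hadoop /user/yarn

You can confirm which user the launcher actually runs as from the USER[...] field in the Oozie logs or from the job owner shown in the ResourceManager UI.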
On Wed, Apr 8, 2015 at 4:15 AM, Vishwakarma, Chhaya <[email protected]> wrote:

> I created the /user/yarn directory, but why is it referring to this directory?
> I have received a user ID from the Hadoop admin that I'm using for running Sqoop jobs.
>
> Thanks,
>
> *From:* Abraham Elmahrek [mailto:[email protected]]
> *Sent:* Tuesday, April 07, 2015 11:23 PM
> *To:* [email protected]
> *Subject:* Re: Error in Sqoop workflow using Oozie
>
> Seems like you need to create the "/user/yarn" directory. Also, what user are you running Sqoop as?
>
> On Tue, Apr 7, 2015 at 3:21 AM, Vishwakarma, Chhaya <[email protected]> wrote:
>
> Thanks Abe,
>
> Below is the error I'm getting in the MapReduce logs:
>
> error: error reading /usr/lib/hadoop/lib/smore.jar; /usr/lib/hadoop/lib/smore.jar (Permission denied)
> error: error reading /usr/lib/hadoop/lib/janusclient.jar; /usr/lib/hadoop/lib/janusclient.jar (Permission denied)
> error: error reading /usr/lib/hadoop/lib/aster-networking.jar; /usr/lib/hadoop/lib/aster-networking.jar (Permission denied)
> error: error reading /usr/lib/hadoop/lib/adfs-api-loader.jar; /usr/lib/hadoop/lib/adfs-api-loader.jar (Permission denied)
> error: error reading /usr/lib/hadoop/lib/aftp.jar; /usr/lib/hadoop/lib/aftp.jar (Permission denied)
> error: error reading /usr/lib/hadoop/lib/adfs.jar; /usr/lib/hadoop/lib/adfs.jar (Permission denied)
> Note: /tmp/sqoop-yarn/compile/f3aca99f37fa19e505b47bab1499bf5d/EXT_EVNT_ERR_ARC.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> Intercepting System.exit(1)
> Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
> che.sqoop.hive.HiveImport - Caused by: java.io.FileNotFoundException: File does not exist: hdfs:/user/yarn
>
> Please help.
>
> ------------------------------
> *From:* Abraham Elmahrek [[email protected]]
> *Sent:* Tuesday, April 07, 2015 2:12 AM
> *To:* [email protected]
> *Subject:* Re: Error in Sqoop workflow using Oozie
>
> A couple of thoughts:
>
> 1. Oozie may run the Sqoop job as the oozie user, so "hive-site.xml" won't be picked up. Try putting it in a shared location and adding a symlink: <file>/tmp/hive-site.xml#hive-site.xml</file>. The same is true for terajdbc4.jar.
> 2. The Oozie launcher task logs should have more information. It will be easier to debug if you drill down into the MapReduce job launching Sqoop.
>
> -Abe
>
> On Mon, Apr 6, 2015 at 7:40 AM, Vishwakarma, Chhaya <[email protected]> wrote:
>
> I have written a Sqoop import script to import data from Teradata to Hive.
> It's working fine when I run it from the command line, but when I put it in an Oozie workflow and try to execute it through Oozie, I get the error below:
>
> 2015-04-02 08:50:55,440 INFO ActionEndXCommand:539 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[0000069-150114201015959-oozie-oozi-W@sqoop-shell] end executor for wf action 0000069-150114201015959-oozie-oozi-W with wf job 0000069-150114201015959-oozie-oozi-W
> 2015-04-02 08:50:55,459 INFO ActionEndXCommand:539 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[0000069-150114201015959-oozie-oozi-W@sqoop-shell] ERROR is considered as FAILED for SLA
> 2015-04-02 08:50:55,505 INFO ActionStartXCommand:539 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[0000069-150114201015959-oozie-oozi-W@fail] Start action [0000069-150114201015959-oozie-oozi-W@fail] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
> 2015-04-02 08:50:55,505 WARN ActionStartXCommand:542 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[0000069-150114201015959-oozie-oozi-W@fail] [0000069-150114201015959-oozie-oozi-W@fail]Action status=DONE
> 2015-04-02 08:50:55,505 WARN ActionStartXCommand:542 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[0000069-150114201015959-oozie-oozi-W@fail] [0000069-150114201015959-oozie-oozi-W@fail]Action updated in DB!
> 2015-04-02 08:50:55,522 INFO ActionEndXCommand:539 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[0000069-150114201015959-oozie-oozi-W@fail] end executor for wf action 0000069-150114201015959-oozie-oozi-W with wf job 0000069-150114201015959-oozie-oozi-W
> 2015-04-02 08:50:55,556 WARN CoordActionUpdateXCommand:542 - USER[qjdht93] GROUP[-] TOKEN[] APP[sqoop-shell-wf] JOB[0000069-150114201015959-oozie-oozi-W] ACTION[-] E1100: Command precondition does not hold before execution, [, coord action is null], Error Code: E1100
>
> Below is my workflow.xml:
>
> <workflow-app name="sqoop-to-hive" xmlns="uri:oozie:workflow:0.4">
>     <start to="sqoop2hive"/>
>     <action name="sqoop2hive">
>         <sqoop xmlns="uri:oozie:sqoop-action:0.2">
>             <job-tracker>${jobTracker}</job-tracker>
>             <name-node>${nameNode}</name-node>
>             <command>import --connect "jdbc:teradata://server.co/database=TS" --driver com.teradata.jdbc.TeraDriver --username sqoop --password sqoop --table test --hive-import --hive-table test</command>
>             <archive>tdgssconfig.jar</archive>
>             <archive>terajdbc4.jar</archive>
>             <file>hive-site.xml</file>
>         </sqoop>
>         <ok to="end"/>
>         <error to="kill"/>
>     </action>
>     <kill name="kill">
>         <message>Action failed</message>
>     </kill>
>     <end name="end"/>
> </workflow-app>
>
> Please suggest.
>
> Regards,
> Chhaya
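For reference, applying the symlink suggestion from the quoted thread, the sqoop action might look roughly like this. It is only a sketch: the /tmp HDFS paths are assumptions, so point them at wherever hive-site.xml and the Teradata jars are actually uploaded (or drop the jars into the workflow's lib/ directory, which Oozie adds to the classpath automatically). The jars are referenced with <file> elements here, per Abe's note, rather than <archive>.

    <action name="sqoop2hive">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- driver class and hive table flags properly space-separated -->
            <command>import --connect "jdbc:teradata://server.co/database=TS" --driver com.teradata.jdbc.TeraDriver --username sqoop --password sqoop --table test --hive-import --hive-table test</command>
            <!-- HDFS paths below are assumed; the #name suffix sets the symlink name in the container -->
            <file>/tmp/tdgssconfig.jar#tdgssconfig.jar</file>
            <file>/tmp/terajdbc4.jar#terajdbc4.jar</file>
            <file>/tmp/hive-site.xml#hive-site.xml</file>
        </sqoop>
        <ok to="end"/>
        <error to="kill"/>
    </action>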
