Hey Dan,

Until https://issues.apache.org/jira/browse/OOZIE-477 goes in, I don't
believe you can run Oozie off the local FS.

I usually solve this by bootstrapping some FS-API calls/commands
before re-running the Oozie workflow.

Something like the following one-liner from my bash history:

$ hadoop fs -rmr examples/apps/java-app; hadoop fs -put java-app \
    examples/apps; oozie job -config java-app/job.properties -run
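
If you end up doing this a lot, the same cycle can live in a tiny script so each local edit is one command away from a re-run. This is just a sketch reusing the paths from the one-liner; the RUN=echo prefix is an assumption I've added so the commands print instead of executing (clear it to actually run them against a cluster):

```shell
#!/bin/sh
# Re-upload a local Oozie app directory to HDFS, then re-run the job.
APP=java-app              # local app directory (same as the one-liner)
HDFS_PARENT=examples/apps # HDFS parent directory for the app
RUN=echo                  # assumption: print commands; set RUN= to execute

$RUN hadoop fs -rmr "$HDFS_PARENT/$APP"
$RUN hadoop fs -put "$APP" "$HDFS_PARENT"
$RUN oozie job -config "$APP/job.properties" -run
```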

On Tue, May 1, 2012 at 10:06 AM, Dan Feldman <[email protected]> wrote:
> I was wondering if there is something analogous to Pig's local mode for
> Oozie? For instance, I couldn't get Oozie working while my workflow.xml and
> job.properties files were sitting on the local filesystem - I was getting a
> "wrong FS" error, just like the OP here:
> https://groups.google.com/a/cloudera.org/group/scm-users/browse_thread/thread/c96a6a6149e1e86e/c399307616b920f0?#c399307616b920f0.
> But now that the files are on HDFS, it seems like I have to delete and
> re-upload them to HDFS every time I need to make modifications to any of
> the files.
>
> I'm hoping to use Oozie to automate analysis of Cassandra data using a
> bunch of Pig and Ruby scripts, and as of now, local mode is more than
> enough to handle that...
>
> Thanks!
> Dan



-- 
Harsh J
