Yeah, I see it. I found a way to test the workflow locally, but it's super complex. I have to start a local MR, a local HDFS, and a local Oozie, then mock the XML actions on the fly with my test actions and run the workflow. It's complex and fragile, unfortunately... I'll try to reach the dev group. I also found the LiteWorkflow thing; it seems to be the engine that executes nodes, and it's super lightweight.
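To illustrate the "mock XML actions on the fly" idea from above: a minimal sketch (not the actual harness, and the stub `fs` action is an assumption) that rewrites a workflow.xml so every real action body (distcp, spark, ...) is replaced with a no-op element while the ok/error transitions are preserved, so the flow can be exercised without running the real steps:

```python
# Illustrative sketch only: swap each Oozie action's real body for a
# trivial stub so the workflow graph can be walked without side effects.
# The namespace version and the <fs/> stub are assumptions.
import xml.etree.ElementTree as ET

NS = "uri:oozie:workflow:0.5"
ET.register_namespace("", NS)

def mock_actions(workflow_xml: str) -> str:
    """Replace every action's body with a no-op, keeping ok/error arcs."""
    root = ET.fromstring(workflow_xml)
    for action in root.findall(f"{{{NS}}}action"):
        # Remember the transitions we must keep
        kept = [c for c in action
                if c.tag in (f"{{{NS}}}ok", f"{{{NS}}}error")]
        # Drop the real action body (distcp, spark, ...) and the arcs
        for child in list(action):
            action.remove(child)
        # Insert a no-op stub, then restore the transitions after it
        ET.SubElement(action, f"{{{NS}}}fs")
        for c in kept:
            action.append(c)
    return ET.tostring(root, encoding="unicode")
```

The point of the sketch is only that the control-flow nodes (ok/error/kill) survive untouched, which is what you'd want to observe in a dry run.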
2016-12-05 14:11 GMT+01:00 Andras Piros <andras.pi...@cloudera.com>:

> Hi Serega,
>
> as per the *Oozie documentation
> <http://oozie.apache.org/docs/4.1.0/DG_CommandLineTool.html#Dryrun_of_Workflow_Job>*
> we can see that the -dryrun option neither creates nor runs a job.
>
> So for the killer feature request, I think it's not possible ATM.
>
> Regards,
>
> Andras
>
> --
> Andras PIROS
> Software Engineer
> <http://www.cloudera.com/>
>
> On Thu, Dec 1, 2016 at 8:33 PM, Serega Sheypak <serega.shey...@gmail.com>
> wrote:
>
> > Hi, did anyone make it work properly in their project?
> > I need to do a dry run of my workflows.
> > The use case is: a user writes a workflow and wants to:
> >
> > 1. check whether it is valid;
> > 2. do a dry run and see how it flows without executing the steps.
> >
> > Let's say I have a workflow with three steps:
> >
> > 1. distcp data from $A to $B
> > 2. run a Spark action with $B as input
> > 3. distcp $B to $C
> >
> > I want to do a dry run and check how my variables were interpolated in
> > the workflow. The killer feature is: I want to imitate a Spark action
> > failure and check what my kill node looks like.
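For reference, the -dryrun option discussed above is invoked through the Oozie CLI as documented; per the docs it validates the workflow and resolves variables without creating or running a job (the server URL and properties file name below are placeholders for your setup):

```shell
# Dry-run a workflow: validates the XML and variable resolution,
# but neither creates nor executes a job (per the Oozie CLI docs).
# http://localhost:11000/oozie and job.properties are placeholders.
oozie job -oozie http://localhost:11000/oozie -config job.properties -dryrun
```

This covers the "check if it is valid" half of the request; imitating an action failure is what the dry run, as Andras notes, cannot do.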