Hi Peter,

If you are using Oozie from Apache, you need to set up Oozie with a few things before Falcon can orchestrate workflows in Oozie.
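Concretely, the oozie-site.xml change registers Falcon's extended EL functions (such as now) with Oozie's ELService. An abbreviated sketch of what that looks like — the authoritative and complete function lists live in src/patches/oozie-site.patch in the Falcon source, so treat this as illustrative only:

```xml
<!-- Sketch only: maps Falcon's EL functions for the coordinator
     submit phase. The real patch adds similar properties for the
     other EL phases (action-create, action-start, sla, etc.). -->
<property>
    <name>oozie.service.ELService.ext.functions.coord-job-submit-instances</name>
    <value>
        now=org.apache.oozie.extensions.OozieELExtensions#ph1_now_echo,
        today=org.apache.oozie.extensions.OozieELExtensions#ph1_today_echo,
        yesterday=org.apache.oozie.extensions.OozieELExtensions#ph1_yesterday_echo
    </value>
</property>
```

Without these mappings (and the extension jar on Oozie's classpath), Oozie's EL evaluator has no binding for "now", which is exactly the E1004 error further down this thread.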
You need the following changes in Oozie:

* add the EL expressions in oozie-site.xml — apply the patch src/patches/oozie-site.patch to oozie-site.xml
* add the falcon-oozie-el-extension jar into Oozie — it is in the target folder of the falcon-oozie-el-extension module
* restart Oozie

On Wed, Jun 11, 2014 at 5:58 AM, Jean-Baptiste Onofré <[email protected]> wrote:
> Yes it is.
>
> Regards
> JB
>
> On 06/11/2014 02:57 PM, pmanolov wrote:
>> Hi Jean-Baptiste,
>> I am building against the master @
>> https://git-wip-us.apache.org/repos/asf/incubator-falcon . I assume
>> that's the latest Falcon code?
>>
>> Peter
>>
>> On 11.06.2014 15:44, Jean-Baptiste Onofré wrote:
>>> Hi Peter,
>>>
>>> Thanks for the update.
>>>
>>> Do you use the latest Falcon master (or a released incubating version)?
>>> I will try on my machine.
>>>
>>> I'll keep you posted.
>>>
>>> Regards
>>> JB
>>>
>>> On 06/11/2014 02:42 PM, pmanolov wrote:
>>>> Hi, I've read your blog post (great blog post, btw), but for some
>>>> reason the packaging doesn't work. Although the build states that the
>>>> distro tar.gz is created, none is present in the target directory.
>>>> I've seen https://issues.apache.org/jira/browse/FALCON-409 , but my
>>>> build isn't failing.
>>>> I've tried building both with ./src/bin/package.sh 2.4.0 4.0.1 (I am
>>>> using hadoop-2.4.0) and with ./src/bin/package.sh 1.1.2 4.0.0 (as the
>>>> blog post suggests).
>>>> Both builds were successful, but no oozie-distro file was generated.
>>>> Has anyone had a similar problem? I am building on 64-bit Ubuntu 14.04
>>>> with Oracle JDK 6 and Maven 3.0.3. I've even run the builds as root,
>>>> but the result was the same.
>>>>
>>>> Regards,
>>>> Peter
>>>>
>>>> On 10.06.2014 20:03, Jean-Baptiste Onofré wrote:
>>>>> Hi,
>>>>>
>>>>> You have to build and use the Oozie distribution provided by Falcon.
>>>>>
>>>>> I explained how to do that in a blog post:
>>>>>
>>>>> http://blog.nanthrax.net/2014/03/hadoop-cdc-and-processes-notification-with-apache-falcon-apache-activemq-and-apache-camel/
>>>>>
>>>>> "
>>>>> We can clone Falcon sources from git and call src/bin/package.sh
>>>>> with the Hadoop and Oozie target versions that we want:
>>>>>
>>>>> $ git clone https://git-wip-us.apache.org/repos/asf/incubator-falcon falcon
>>>>> $ cd falcon
>>>>> $ src/bin/package.sh 1.1.2 4.0.0
>>>>>
>>>>> The package.sh script creates target/oozie-4.0.0-distro.tar.gz in the
>>>>> Falcon sources folder.
>>>>>
>>>>> In the demo folder, I uncompress the oozie-4.0.0-distro.tar.gz tarball:
>>>>>
>>>>> $ cp ~/oozie-4.0.0-distro.tar.gz .
>>>>> $ tar zxvf oozie-4.0.0-distro.tar.gz
>>>>> "
>>>>>
>>>>> I created a couple of Jira issues to improve the packaging and
>>>>> publication of the Oozie distribution.
>>>>>
>>>>> Regards
>>>>> JB
>>>>>
>>>>> On 06/10/2014 06:54 PM, pmanolov wrote:
>>>>>> Hi guys,
>>>>>> I am not sure if this is the correct place to ask this question,
>>>>>> but here I go.
>>>>>> I've been trying to create a simple Oozie workflow job through
>>>>>> Falcon. The job does get created and I can see it in the Oozie
>>>>>> management console, but it always fails with the same exception:
>>>>>>
>>>>>> 2014-06-10 13:50:10,407 WARN CoordSubmitXCommand:542 - USER[-] GROUP[-]
>>>>>> TOKEN[-] APP[-] JOB[0000000-140610134923473-oozie-root-B] ACTION[-]
>>>>>> ERROR:
>>>>>> org.apache.oozie.coord.CoordinatorJobException: E1004: Expression
>>>>>> language evaluation error, Unable to evaluate :${now(0,-5)}:
>>>>>> ..................................
>>>>>>
>>>>>> Caused by: java.lang.Exception: Unable to evaluate :${now(0,-5)}:
>>>>>>     at org.apache.oozie.coord.CoordELFunctions.evalAndWrap(CoordELFunctions.java:691)
>>>>>>     at org.apache.oozie.command.coord.CoordSubmitXCommand.resolveTagContents(CoordSubmitXCommand.java:885)
>>>>>>     ... 11 more
>>>>>> Caused by: javax.servlet.jsp.el.ELException: No function is mapped to
>>>>>> the name "now"
>>>>>>     at org.apache.commons.el.Logger.logError(Logger.java:481)
>>>>>>     at org.apache.commons.el.Logger.logError(Logger.java:498)
>>>>>>     at org.apache.commons.el.Logger.logError(Logger.java:525)
>>>>>>     at org.apache.commons.el.FunctionInvocation.evaluate(FunctionInvocation.java:150)
>>>>>>     at org.apache.commons.el.ExpressionEvaluatorImpl.evaluate(ExpressionEvaluatorImpl.java:263)
>>>>>>     at org.apache.commons.el.ExpressionEvaluatorImpl.evaluate(ExpressionEvaluatorImpl.java:190)
>>>>>>     at org.apache.oozie.util.ELEvaluator.evaluate(ELEvaluator.java:203)
>>>>>>     at org.apache.oozie.coord.CoordELFunctions.evalAndWrap(CoordELFunctions.java:682)
>>>>>>     ... 12 more
>>>>>>
>>>>>> I've been following the tutorial from Hortonworks (
>>>>>> http://public-repo-1.hortonworks.com/HDP-LABS/Projects/Falcon/2.0.6.0-76/FalconHortonworksTechnicalPreview.pdf
>>>>>> ).
>>>>>> I've followed the instructions to the letter, especially those
>>>>>> regarding the falcon-el extensions. I've added the properties to
>>>>>> oozie-site.xml and uploaded the libs to HDFS using the Oozie
>>>>>> "oozie-setup.sh sharelib create" command. However, that didn't solve
>>>>>> the issue. I am really stuck, guys: I have no clue what else needs
>>>>>> to be done, and I am not sure why the error persists despite me
>>>>>> following the official instructions.
>>>>>>
>>>>>> Peter
>>>>>>
>>>>>> P.S.
>>>>>>
>>>>>> Here is my process.xml definition:
>>>>>>
>>>>>> <?xml version="1.0" encoding="UTF-8"?>
>>>>>> <process name="dev-process" xmlns="uri:falcon:process:0.1">
>>>>>>     <clusters>
>>>>>>         <cluster name="dev-cluster">
>>>>>>             <validity start="2013-11-15T00:05Z" end="2030-11-15T01:05Z"/>
>>>>>>         </cluster>
>>>>>>     </clusters>
>>>>>>
>>>>>>     <parallel>5</parallel>
>>>>>>     <order>FIFO</order>
>>>>>>     <frequency>hours(1)</frequency>
>>>>>>     <timezone>UTC</timezone>
>>>>>>
>>>>>>     <inputs>
>>>>>>         <!-- In the workflow, the input paths will be available in a variable 'inpaths' -->
>>>>>>         <input name="s1" feed="hadoop-s1" start="now(0,-5)" end="now(0,-1)"/>
>>>>>>         <input name="s2" feed="hadoop-s2" start="now(0,-5)" end="now(0,-1)"/>
>>>>>>         <input name="s3" feed="hadoop-s3" start="now(0,-5)" end="now(0,-1)"/>
>>>>>>     </inputs>
>>>>>>
>>>>>>     <outputs>
>>>>>>         <!-- In the workflow, the output path will be available in a variable 'outpath' -->
>>>>>>         <output name="outpath" feed="transformedData" instance="now(0,0)"/>
>>>>>>     </outputs>
>>>>>>
>>>>>>     <properties>
>>>>>>         <!-- In the workflow, these properties will be available with variable - key -->
>>>>>>         <property name="queueName" value="default"/>
>>>>>>         <!-- The schedule time available as a property in workflow -->
>>>>>>         <property name="time" value="${instanceTime()}"/>
>>>>>>         <property name="oozie.wf.workflow.notification.url"
>>>>>>                   value="http://hadoop-ui:8080/modataui/update/worflow/status/$jobId/$status"/>
>>>>>>         <property name="oozie.wf.action.notification.url"
>>>>>>                   value="http://hadoop-ui:8080/modataui//update/action/status/$jobId/$nodename/$status"/>
>>>>>>     </properties>
>>>>>>
>>>>>>     <workflow engine="oozie" path="/root/app/mr"/>
>>>>>>
>>>>>>     <!--
>>>>>>     <late-process policy="periodic" delay="minutes(1)">
>>>>>>         <late-input input="inpaths" workflow-path="/app/mr"/>
>>>>>>     </late-process>
>>>>>>     -->
>>>>>> </process>
>
> --
> Jean-Baptiste Onofré
> [email protected]
> http://blog.nanthrax.net
> Talend - http://www.talend.com

--
Regards,
Venkatesh

"Perfection (in design) is achieved not when there is nothing more to add, but rather when there is nothing more to take away." - Antoine de Saint-Exupéry
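A footnote for anyone landing on this thread: once Oozie is set up, a process definition like the one quoted above is typically submitted and scheduled through the Falcon CLI. A sketch under the entity names used in the thread — verify the exact option spelling with "falcon help" for your Falcon version:

```shell
# Check the entity definition against the schema (process.xml from the thread)
falcon entity -type process -file process.xml -validate

# Register the process with the Falcon server
falcon entity -type process -file process.xml -submit

# Schedule it, which is the step that creates the Oozie coordinator
falcon entity -type process -name dev-process -schedule
```

The -schedule step is where the now(...) expressions are handed to Oozie, so a missing EL extension only surfaces at that point, not at submit time.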
