Frank,

A simple hack would be this:
Write up a simple driver class as you would for regular, non-Oozie MR code,
configure your Avro job as usual, and then, instead of the submit API calls
at the end, try:

    jobConf.writeXml(outputStream);

This will give you a simple XML dump of the job configuration, including the
properties Avro has set, and you can then port them into your <map-reduce>
action as config elements/etc. Should be fairly trivial to port.

Will this work for you? Please do share a template Avro MR job workflow on
the list if you've got one in the end, in the spirit of
http://xkcd.com/979/ :)

(Hat tip to Alejandro from Oozie for the tip)

On Fri, Jan 27, 2012 at 1:06 AM, Frank Grimes <frankgrime...@gmail.com> wrote:
> Hi All,
>
> We're trying to evaluate using Oozie (http://incubator.apache.org/oozie/)
> to run Hadoop MapReduce jobs over Avro data.
>
> As far as I can tell, Oozie configures the JobConf it submits to Hadoop
> based on external config files.
> See e.g.
> http://mail-archives.apache.org/mod_mbox/avro-user/201110.mbox/<cacn4pvv8kxaartsgppe2fi1cpq443gd1gnratf-3szvebag...@mail.gmail.com>
>
> I'm wondering if anybody out there has an example of how to set up/run an
> Avro MapReduce job without relying on the AvroJob.set* helper methods.
> Or better yet, an Oozie example of the same.
>
> Thanks,
>
> Frank Grimes

--
Harsh J
Customer Ops. Engineer, Cloudera
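P.S. A minimal sketch of the driver-dump approach described above, assuming
the org.apache.avro.mapred API and Hadoop's old mapred API are on the
classpath; the class name and the placeholder schema are made up for
illustration, not from Frank's job:

```java
import org.apache.avro.Schema;
import org.apache.avro.mapred.AvroJob;
import org.apache.hadoop.mapred.JobConf;

// Hypothetical throwaway driver: configure the Avro job as normal, but
// dump the resulting config as XML instead of submitting it.
public class DumpAvroJobConf {
  public static void main(String[] args) throws Exception {
    JobConf jobConf = new JobConf();

    // Configure the job exactly as you would before submitting it.
    // A real job would use its actual record schemas here.
    Schema schema = Schema.create(Schema.Type.STRING); // placeholder
    AvroJob.setInputSchema(jobConf, schema);
    AvroJob.setOutputSchema(jobConf, schema);

    // Instead of JobClient.runJob(jobConf), write the config out; each
    // <property> in the dump can be copied into the Oozie <map-reduce>
    // action's <configuration> block.
    jobConf.writeXml(System.out);
  }
}
```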