hi,
I encountered a JA008 error when rerunning a coordinator action.
The already-executed actions of the workflow job are too expensive to be
re-executed, so I can't rerun the coordinator action; I rerun it as a workflow
instead.
In my job.properties, I set:
oozie.wf.rerun.failnodes=true
or
oozie
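For reference, a rerun job.properties could look like the following sketch; the application path is a placeholder, not a value from the original job:

```properties
# Rerun only the workflow nodes that failed
oozie.wf.application.path=${nameNode}/user/hadoop/examples/apps/myapp
oozie.wf.rerun.failnodes=true
# Alternative: skip specific nodes by name (mutually exclusive with failnodes)
#oozie.wf.rerun.skip.nodes=node-a,node-b
```

The rerun would then be submitted with `oozie job -rerun <workflow-id> -config job.properties`.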
hi,
I want to get a unique form of the output-data URI.
In my coordinator.xml, I write:
${nameNode}/user/hadoop/examples/output-data/formal/${YEAR}/${MONTH}/${DAY}/
In my job.properties, I write:
${replaceAll(${outputData},"[^0-9]",null)}
I want to g
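If I read the Oozie EL functions right, replaceAll takes a plain variable name (no nested ${}) and a string replacement rather than null, i.e. something closer to ${replaceAll(outputData, "[^0-9]", "")}. The intended effect can be checked locally; the resolved URI below is made up for illustration:

```shell
# Hypothetical resolved value of ${outputData} (illustrative)
uri="hdfs://namenode/user/hadoop/examples/output-data/formal/2013/11/11/"
# Keep only the digits, mirroring replaceAll(outputData, "[^0-9]", "")
stamp=$(printf '%s' "$uri" | tr -cd '0-9')
echo "$stamp"   # 20131111
```
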
owKillXCommand for jobId=001-13082227563-oozie-hado-W
2013-11-11 08:52:11,824 WARN ActionEndXCommand:542 - USER[root] GROUP[-] TOKEN[] APP['customtest'] JOB[001-1308222
Waiting for your help sincerely!
Best Wishes!
HENRY
From: Mohammad Islam
Date: 2013-11-11 11:52
Hi,
I am trying to use Custom Action Nodes.
The workflow can be submitted but hangs in "PREP" status.
I wonder which step is wrong.
I followed the steps in this article:
http://www.infoq.com/articles/ExtendingOozie
I put a jar containing EmailActionExecutor class and emailAction.xsd in the
ooz
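In case a registration step was missed: wiring a custom executor into Oozie normally involves two oozie-site.xml properties; the fully qualified class name below is a guess at the package, so substitute your own:

```xml
<property>
  <name>oozie.service.ActionService.executor.ext.classes</name>
  <!-- fully qualified name of your executor; package is illustrative -->
  <value>com.example.oozie.EmailActionExecutor</value>
</property>
<property>
  <name>oozie.service.SchemaService.wf.ext.schemas</name>
  <value>emailAction.xsd</value>
</property>
```

Oozie needs a restart to pick these up.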
Subject: Re: hi,how can I complete this DAG in a workflow using oozie?
The best parallelism you can get for this in a wf is:
fork (1, 2), join, then fork (3, 4, 5), join.
thx
Alejandro
(phone typing)
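The suggested shape, as a workflow sketch (node names are illustrative; each action's ok transition goes to the following join):

```xml
<start to="fork-a"/>
<fork name="fork-a">
  <path start="action-1"/>
  <path start="action-2"/>
</fork>
<join name="join-a" to="fork-b"/>
<fork name="fork-b">
  <path start="action-3"/>
  <path start="action-4"/>
  <path start="action-5"/>
</fork>
<join name="join-b" to="end"/>
<end name="end"/>
```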
On Nov 8, 2013, at 5:45, renguihe wrote:
> hi,
> Adding the error info I got here when I try to
>     <configuration>
>       <property>
>         <name>mapred.job.queue.name</name>
>         <value>${queueName}</value>
>       </property>
>     </configuration>
>     <exec>sleep</exec>
>     <argument>10</argument>
>
>     <job-tracker>${jobTracker}</job-tracker>
>     <name-node>${nameNode}</name-node>
>     <configuration>
>       <property>
>         <name>mapred.job.queue.name</name>
>         <value>${queueName}</value>
>       </property>
>     </configuration>
>     <exec>sleep</exec>
>     <argument>15</argument>
>
>     <message>Shell action failed, error
>     message[${wf:errorMessage(wf:lastErrorNode())}]</message>
>
> --
> renguihe
>
hi,
I guess the Oozie shell action runs in a separate sandbox and cannot read the
compute node's local environment variables.
Now I want to use the "sh" and "hive" commands in my shell action.
The separate sandbox's PATH is:
PATH=.:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin:
I appended this line to my workflow.
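Instead of editing PATH in the workflow by hand, the shell action lets you pass environment variables to the launched script via env-var elements; the script name and PATH value below are illustrative:

```xml
<shell xmlns="uri:oozie:shell-action:0.2">
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <exec>myscript.sh</exec>
  <!-- make sh/hive resolvable inside the launcher's environment -->
  <env-var>PATH=.:/usr/kerberos/bin:/usr/local/bin:/bin:/usr/bin</env-var>
  <file>myscript.sh</file>
</shell>
```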