ap multiple concurrent sqoop jobs
> to a total number of mappers.
>
> regards
> /Pelle
>
> On Fri, May 27, 2016 at 8:08 AM, Harsh J wrote:
>
> > Perhaps the feature of
> > https://issues.apache.org/jira/browse/MAPREDUCE-5583 is
> > what you are looking for
Perhaps the feature of https://issues.apache.org/jira/browse/MAPREDUCE-5583 is
what you are looking for.
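Per that JIRA (the limit shipped in Hadoop 2.7.0), the cap is plain job configuration; a hedged sketch of limiting each Sqoop job's concurrently running mappers (the value 10 is illustrative):

```xml
<property>
  <name>mapreduce.job.running.map.limit</name>
  <value>10</value>
</property>
```

On the Sqoop command line this could presumably be passed as a generic Hadoop option, e.g. `-D mapreduce.job.running.map.limit=10` before the tool arguments.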
On Fri, 27 May 2016 at 00:04 Per Ullberg wrote:
> The fair scheduler would solve this issue, but we need the capacity
> scheduler for other reasons. Would it be possible to run multiple schedu
. No of retries = 1. Exception = Could not authenticate,
>> GSSException: No valid credentials provided (Mechanism level: Failed to
>> find any Kerberos tgt)
>>
>>
>> I have verified all kerberos principals, all are working fine. Please let
>> me know how to resolve this issue.
>>
>> Thanks,
>> Shaik
>>
--
Harsh J
Could you elaborate on 'proper'? What's your situation, and what's your error?
On Wed, Jun 17, 2015 at 6:45 PM, Shouvanik Haldar
wrote:
> Does anyone have an idea or a working solution for this?
--
Harsh J
s a JavaAction or as a ShellAction. Problem is both
> JavaAction and ShellAction launchers invoke the custom yarn app as
> 'MAPREDUCE' instead of 'YARN'.
>
> Has any one tried something similar? Appreciate any pointers.
>
> Thanks,
> Som
--
Harsh J
tax of
> using this in
> action of oozie ?
--
Harsh J
ite some logs in a directory but by default it's taking the mapred
> user but the requirement is whoever submit the oozie job that user should
> reflect in the log files.
>
> Please help me!
>
--
Harsh J
log4j configuration for a workflow with Java
> actions
> I am trying to divert java actions to a socket appender.
> using Oozie 4.0.0.
>
> quick help will be greatly appreciated.
>
> Thanks,
> Vishal
--
Harsh J
and every year.
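A hedged sketch (assuming Oozie 4.1+, which accepts cron-like frequencies; the five fields are minute, hour, day-of-month, month, day-of-week) for running at midnight on Mondays and Fridays only:

```xml
<coordinator-app name="mon-fri-app" frequency="0 0 ? * MON,FRI"
                 start="${start}" end="${end}" timezone="UTC"
                 xmlns="uri:oozie:coordinator:0.4">
  ...
</coordinator-app>
```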
On Thu, Nov 27, 2014 at 8:30 PM, prabha k wrote:
> Hi Experts, Can you please let me know how to schedule a Job to run on
> Monday and Friday only ?
>
> Thanks in Advance.
>
> PK
--
Harsh J
>
> Ken Kavaliauskas
>
> Software Engineer
>
>
>
> Ericsson
>
> kenneth.kavaliaus...@ericsson.com
>
> www.ericsson.com
>
>
>
>
>
>
--
Harsh J
ieClient, besides this anything else we
> need to do ?
>
> Thanks a lot for your help!
>
> Best regards,
>
>
> sophie
--
Harsh J
a action(HBase MR) does not inherit the class.
>
> Looking for a workaround.
>
> Thanks,
> Vishal
--
Harsh J
educe job.
>
> job.setInputFormatClass(AvroKeyInputFormat.class);
> job.setMapOutputKeyClass(ImmutableBytesWritable.class);
> job.setMapOutputValueClass(Put.class);
>
> HFileOutputFormat.configureIncrementalLoad(job, table);
>
> How can I configure this in MR action?
>
> Thanks,
> Vishal Kapoor
--
Harsh J
if an input file, def.txt, is dropped into HDFS, a
> pig job should run.
>
>Can anyone suggest whether we can achieve this in HUE?
>
>
>
> Thanks
> sivakumar
--
Harsh J
sonate others.
>> On Feb 17, 2014, at 1:45 AM, Harsh J wrote:
>>
>> Oozie uses Hadoop's impersonation feature documented at
>> https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html.
>> The hadoop.proxyuser configs must reside on the
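For reference, Hadoop's proxyuser settings live in core-site.xml on the cluster side; a hedged sketch (the hostname value is illustrative):

```xml
<property>
  <name>hadoop.proxyuser.oozie.hosts</name>
  <value>oozie-server.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.oozie.groups</name>
  <value>*</value>
</property>
```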
imeout
> 120
>
>
> <property>
>   <name>oozie.credentials.credentialclasses</name>
>   <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
> </property>
>
> <property>
>   <name>oozie.service.CallableQueueService.queue.size</name>
>   <value>1000</value>
> </property>
>
> <property>
>   <name>oozie.service.JPAService.jdbc.password</name>
>   <value>admin</value>
> </property>
>
> <property>
>   <name>oozie.service.HadoopAccessorService.nameNode.whitelist</name>
>   <value></value>
> </property>
>
> <property>
>   <name>oozie.service.PurgeService.older.than</name>
>   <value>30</value>
> </property>
>
> <property>
>   <name>oozie.service.HadoopAccessorService.jobTracker.whitelist</name>
>   <value></value>
> </property>
>
> <property>
>   <name>oozie.systemmode</name>
>   <value>NORMAL</value>
> </property>
>
> <property>
>   <name>oozie.service.PurgeService.purge.interval</name>
>   <value>3600</value>
> </property>
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com
--
Harsh J
llowing works from command line:
> sqoop import --connect jdbc:mysql://10.0.2.15/test --username root
> --target-dir $OUTPUT_DIR --query "select *, DATE_FORMAT(created_time,
> '%Y%m%d%H') as period from $TABLE_NAME where DATE_FORMAT(created_time,
> '%Y%m%d%H') = $PERIOD and \$CONDITIONS" --split-by id
>
> What am I doing wrong? Please help. Thanks.
--
Harsh J
"normally" submit oozie jobs? As themselves? As
> user=oozie?
>
> 2) What about in a linux container environment : Then how would someone
> typically submit an oozie job?
>
> Thanks!
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com
--
Harsh J
ded partition to metastore
> checks:p_date=2013-04-29/p_merchant_id=98
> Repair: Added partition to metastore
> checks:p_date=2013-04-30/p_merchant_id=98
> Repair: Added partition to metastore
> checks:p_date=2013-05-01/p_merchant_id=98
> Repair: Added partition to metastore
> che
ive action is:
>
>
> <hive>
>   <job-tracker>${jobTracker}</job-tracker>
>   <name-node>${nameNode}</name-node>
>   <job-xml>${currentAutomation}/job-defaults.xml</job-xml>
>   <script>updatepartitions.q</script>
>   <param>scope=${scope}</param>
>   <param>datasetBase=${datasetBase}</param>
>   <param>avroSchemaFile=${avroSchemaFile}</param>
> </hive>
>
>
> Any thoughts or feedback are much appreciated. Thanks!
>
>
> Best,
> Andrew
--
Harsh J
n a java action with
your implemented class.
On Tue, Jan 28, 2014 at 9:47 PM, Vishnu Viswanath
wrote:
> Can oozie schedule a mongodb action?
>
> Regards
--
Harsh J
TOKEN[] APP[hlab-oozie-workflow]
>> JOB[012-131218165312793-oozie-oozi-W]
>> ACTION[012-131218165312793-oozie-oozi-W@convert-file-fail]
>> [***012-131218165312793-oozie-oozi-W@convert-file-fail***]Action
>> updated in DB!
>> 2013-12-19 00:41:29,440 WARN CoordActionUpdateXCommand:542 - USER[hdfs]
>> GROUP[-] TOKEN[] APP[hlab-oozie-workflow]
>> JOB[012-131218165312793-oozie-oozi-W] ACTION[-] E1100: Command
>> precondition does not hold before execution, [, coord action is null],
>> Error Code: E1100
>>
>> The FIleNotFoundException is the result of the property
>> com.sdl.sap.conf.directory not being set.
>>
>> Any help would be really appreciated!
>>
>>
>>
>>
--
Harsh J
>
>
>
>
> <java>
>   <job-tracker>${jobTracker}</job-tracker>
>   <name-node>${nameNode}</name-node>
>   <configuration>
>     <property>
>       <name>mapred.job.queue.name</name>
>       <value>${queueName}</value>
>     </property>
>   </configuration>
>   <main-class>org.ltc.command.LogMain</main-class>
>   <arg>Log From Ozzie</arg>
> </java>
>
>
>
> ...
>
--
Harsh J
> > >
>> > > > > > > >4) In oozie-site.xml, set (or add if it doesn't exist) a
>> > property
>> > > > like
>> > > > > > > >this:
>> > > > > > > >
>> > > > > > > >
>> oozie.service.ELService.ext.functions.workflow
>> > > > > > > >
>> > > > > > > >
>> > functionName=my.package.AwesomeELFunctions#functionName
>> > > > > > > >
>> > > > > > > >
>> > > > > > > >If you have more functions, that property takes a comma
>> > separated
>> > > > list
>> > > > > > of
>> > > > > > > >them. The value on the left side of the equals is the name
>> that
>> > > > you'd
>> > > > > > use
>> > > > > > > >in a workflow and the value on the right side is the name of
>> > your
>> > > > > > function
>> > > > > > > >in your Java code. For the value on the left side, you can
>> > > > optionally
>> > > > > > put
>> > > > > > > >a prefix (you may have seen many built-in functions that start
>> > > with
>> > > > > > "wf:"
>> > > > > > > >for example).
>> > > > > > > >
>> > > > > > > >5) Start up Oozie. You should now be able to use the
>> > functionName
>> > > > > > custom
>> > > > > > > >EL Function you created!
>> > > > > > > >
>> > > > > > > >I believe this may only make the function available in
>> > workflows,
>> > > > but
>> > > > > > I'm
>> > > > > > > >not sure; if not, there should be a similar property in
>> > oozie-site
>> > > > you
>> > > > > > can
>> > > > > > > >set for coordinators if you need that.
>> > > > > > > >
>> > > > > > > >Once my proper tutorial blog post is posted, I'll add a link
>> to
>> > > this
>> > > > > > > >thread, but it may be a while.
>> > > > > > > >
>> > > > > > > >- Robert
>> > > > > > > >
>> > > > > > > >
>> > > > > > > >
>> > > > > > > >
>> > > > > > > >On Fri, Jul 12, 2013 at 7:26 AM, Serega Sheypak
>> > > > > > > >wrote:
>> > > > > > > >
>> > > > > > > >> Hi, I need to create custom EL function.
>> > > > > > > >> It accepts:
>> > > > > > > >> String StringWithDatetime
>> > > > > > > >> String Pattern (to parse Date)
>> > > > > > > >> It returns:
>> > > > > > > >> time in seconds.
>> > > > > > > >>
>> > > > > > > >> Please tell me:
>> > > > > > > >> 1. Where can I find example?
>> > > > > > > >> 2. Where do I have to put implementation of this function?
>> > > > > > > >>
>> > > > > > >
>> > > > > > >
>> > > > > >
>> > > > >
>> > > >
>> > >
>> >
>>
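For the original datetime-to-seconds request, a minimal sketch of such an EL function class (all names are hypothetical; the static method would be registered through the oozie-site property described above, e.g. `toSeconds=my.package.DateELFunctions#toSeconds`):

```java
import java.text.SimpleDateFormat;
import java.util.TimeZone;

// Hypothetical EL function holder class; Oozie EL functions are plain
// public static methods on a class available on Oozie's classpath.
public class DateELFunctions {

    // Parses the datetime string with the given pattern (UTC assumed)
    // and returns the corresponding Unix time in seconds.
    public static long toSeconds(String datetime, String pattern) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat(pattern);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.parse(datetime).getTime() / 1000L;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toSeconds("1970-01-02 00:00:00", "yyyy-MM-dd HH:mm:ss"));
    }
}
```

In a workflow it would then be callable as `${toSeconds('2013-07-12 07:26:00', 'yyyy-MM-dd HH:mm:ss')}`, or with whatever prefix was chosen in the registration.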
--
Harsh J
ation makes Oozie more complex than needed for my specific
> task).
>
> LAT
--
Harsh J
l, I set the action's namenode
> and jobtracker properties to a remote ec2 cluster. So, even if oozie spawns
> a local map task for my action, the underlying map-reduce job itself gets
> executed on the remote ec2 cluster. Can we do such a thing with oozie?
>
> Thanks,
> Som
--
Harsh J
ERROR, reason: Main class
> [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
> ERROR is considered as FAILED for SLA
--
Harsh J
oba/final.txt' returned non-zero exit status
> 1
> Failing Oozie Launcher, Main class
> [org.apache.oozie.action.hadoop.ShellMain], exit code [1]
>
>
> It seems that I cannot run a shell command line within my python script when
> the python script is embedded within an oozie action since everything works
> fine when I run my python script within my interactive shell.
> The log also says that the main class org.apache.hadoop.fs.FsShell is
> missing whereas I copied hadoop-core-1.2.1.jar in a lib folder next to my
> workflow.xml and job.properties files.
>
> Is there any way I can bypass this limitation ?
--
Harsh J
t; state The task with
> original name is in "Accepted" state. All that I see in logs is:
>
> >>> Invoking Main class now >>>
> Heart beat
> Heart beat
> Heart beat
> Heart beat
> ...
>
>
> Thank you
--
Harsh J
>
> On 7/12/13 9:54 AM, "Alexander Taggart" wrote:
>
>>I have a custom action executor that
>>invokes
>>org.apache.oozie.action.ActionExecutor.Context.setExecutionData(String,
>>Properties) upon completion. I cannot find any way to access that data
>>through the oozie console.
>
--
Harsh J
c.LoggingConnectionDecorator.wrap(
> LoggingConnectionDecorator.java:265)
> at org.apache.openjpa.lib.jdbc.LoggingConnectionDecorator.
> access$700(LoggingConnectionDecorator.java:72)
> at org.apache.openjpa.lib.jdbc.LoggingConnectionDecorator$
> LoggingConnection$LoggingPreparedStatement.executeQuery(
> LoggingConnectionDecorator.java:1183)
> at org.apache.openjpa.lib.jdbc.DelegatingPreparedStatement.
> executeQuery(DelegatingPreparedStatement.java:284)
> at org.apache.openjpa.jdbc.kernel.JDBCStoreManager$
> CancelPreparedStatement.executeQuery(JDBCStoreManager.java:1785)
> at org.apache.openjpa.lib.jdbc.DelegatingPreparedStatement.
> executeQuery(DelegatingPreparedStatement.java:274)
> at org.apache.openjpa.jdbc.kernel.PreparedSQLStoreQuery$
> PreparedSQLExecutor.executeQuery(PreparedSQLStoreQuery.java:118)
> ... 39 more
>
> After restarting the service, the system runs fine.
>
> I have no idea where to debug or how to solve this.
>
>
> Best Regards,
> Christian.
--
Harsh J
o not create workflows
> whose Action Time is earlier than the current time? Thanks!
--
Harsh J
>> >> Thanks
>> >>
>> >>
>> >> On Thu, May 30, 2013 at 10:29 PM, Felix.徐 wrote:
>> >>
>> >> > Hi all,
>> >> >
>> >> > How can I coordinate multiple workflows into a single job? I do not
>> >> > really want to merge all the code into a single workflow.xml; is there
>> >> > a way to "invoke" more than one workflow within another job?
>> >> >
>> >>
>> >>
>> >>
>> >> --
>> >> Alejandro
>> >>
>>
>>
--
Harsh J
specific reason for the Launcher Job? Why couldn't Oozie kick off just
> the Action MR job like we do from the command line?
>
> Your answers are appreciated.
>
> Thanks
> Kishore
--
Harsh J
g the MR job. Looks like oozie disables
>> the success file creation using the configuration that you have mentioned
>> for FileOutputCommitter.
>>
>> I have enabled it by setting this property in conf.
>>
>> Rahul
>>
>>
>> On Mon, May 6, 2
wrapping it in a map-only job, or is it just to offload the Oozie
> server?
>
> Thanks,
> Rahul
--
Harsh J
ss remains in PREP state.
>>>
>>> While Checking the Oozie Job log I found:
>>>
>>> “JavaActionExecutor:542 - USER[user1] GROUP[-] TOKEN[]
>>> APP[java-oozie-wf] JOB[003-121206171016662-oozie-user-W]
>>> ACTION[003-121206171016662-oozie-user-W@java-node] credentials is
>>> null for the action”
>>>
>>> Can you please help me in moving ahead from this. :)
>>>
>>> --
>>> Thanks and Regards,
>>> Uddipan
>>
>>
> --
>
>
>
--
Harsh J
at am I doing wrong. I can assure you that the input
> path/file exists on s3 and the AWS key and secret key entered are correct.
>
> Thanking You,
>
>
> --
> Regards,
> Ouch Whisper
> 010101010101
--
Harsh J
gt;> 1598 [main] WARN org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has
>> not been set in the environment. Cannot check for additional configuration.
>> Intercepting System.exit(1)
>>
>> <<< Invocation of Main class completed <<<
>>
>> Fa
ace issues when building Oozie.
> Especially in our corporate environment we have separate Maven and proxy
> settings
>
> Is there a way or place we can get the pre built distribution of Oozie.
>
> Thanks
> Venkat
>
--
Harsh J
ame thing; if I run it
> through Oozie, it throws an error saying the path 'downloads/test/' is an
> invalid path.
>
> As I understand it, Oozie runs over HDFS, so it expects files in HDFS.
>
> I am new to Hadoop and Oozie. Is there any way we can achieve this through
> Oozie? Is there any better workflow scheduler we can use?
>
> Thanks
> Shreehari
--
Harsh J
ursively?
>
>
> <property>
>   <name>mapred.input.dir</name>
>   <value>/foo/bar</value>
> </property>
>
>
> I see that I need to wildcard the path if "bar" contains sub-directories.
> Any way around that with mr action?
>
> Thanks,
> Prashant
>
--
Harsh J
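One hedged possibility for the recursive-input question above (property name per Hadoop 2.x; on MR1 the deprecated equivalent is `mapred.input.dir.recursive`): set the recursive flag in the mr action's `<configuration>` block alongside the input dir:

```xml
<property>
  <name>mapreduce.input.fileinputformat.input.dir.recursive</name>
  <value>true</value>
</property>
```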
ase find answers to your
> questions below(in order) :
>
> 1- Yes, it works.
> 2- Yes, jar contains all the classes.
> 3- I am trying to use oozie-3.2.0-incubating + hadoop-1.0.4. Is this
> combination OK?
>
> Regards,
> Mohammad Tariq
>
>
>
> On Sun, Dec 9, 20
lassLoader.loadClass(ClassLoader.java:356)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:264)
> at
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
> at
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:865)
> ... 17 more
>
>
> Here is the command which I am using to submit the workflow -
> bin/oozie job -oozie http://localhost:11000/oozie/ -config
> ~/mr/job.properties -run
>
> Need some help. Many thanks.
> (Please pardon my ignorance)
>
> Regards,
> Mohammad Tariq
--
Harsh J
ial
> distros that are built on top of Bigtop.
>
> As for the ExtJS -- all you need to do is unpack it down in /var/lib/oozie/
> and you're done!
>
> Thanks,
> Roman.
>
--
Harsh J
<property>
>   <name>hadoop.proxyuser.oozie.groups</name>
>   <value>*</value>
> </property>
>
>
> Anything else need to be configured in order to be able to run oozie jobs?
>
> Thanks.
--
Harsh J
at
> org.apache.oozie.service.Services.setServiceInternal(Services.java:358)
> at org.apache.oozie.service.Services.setService(Services.java:344)
> at org.apache.oozie.service.Services.loadServices(Services.java:278)
> ... 26 more
> -
--
Harsh J
nso Ferreira <
> eafon...@yahoo.com> wrote:
>
>> Hey, quick question:
>>
>> Can we use a variable for the coordinator-app name? I tried it the other
>> day but it apparently does not work.
>> I'm talking about using something like this:
>>
>>
>>
>> Thank you.
>> Eduardo.
--
Harsh J
tions or workflow apps in one
> coordinator app? If not, what is the best practice for this scenario?
>
> Thanks!
>
> Yongcheng
--
Harsh J
ie source code?
> I did not find documentation about how to implement custom EL Functions.
> Would someone throw some links my way about where I can find information on
> this?
>
> Thank you.
> Eduardo.
--
Harsh J
ctory structure pushed to HDFS under /user/awesome/ and you're good
> to go.
>
>
> Eduardo.
>
>
>
>
> From: Harsh J
> To: user@oozie.apache.org
> Sent: Wednesday, November 7, 2012 2:52 PM
> Subject: Re: Pig action, REGISTER and additiona
Nov 8, 2012 at 1:12 AM, Grant Ingersoll wrote:
>
> On Nov 7, 2012, at 12:51 PM, Harsh J wrote:
>
>> Hi Grant,
>>
>> You can leverage the feature of the Pig action, in tandem
>> with the distributed-cache-using element to do this I think
>> (over pig action schema 0.2).
>>
>> If you add after your
Hi Grant,
You can leverage the feature of the Pig action, in tandem
with the distributed-cache-using element to do this I think
(over pig action schema 0.2).
If you add after your
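A hedged sketch of what such a Pig action might look like (paths and names are illustrative) — the jar is shipped via the distributed cache under the `#udfs` symlink, which the script can then REGISTER:

```xml
<action name="pig-node">
  <pig>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <script>myscript.pig</script>
    <archive>/user/me/lib/my-udfs.jar#udfs</archive>
  </pig>
  <ok to="end"/>
  <error to="fail"/>
</action>
```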
ls in the fair scheduler.
>
> Does anyone have any experience with this or know if this will work? What
> is the practical differentiation of specifying a queue for Oozie when I am
> being directed to a pool already?
>
> --
> Matt
--
Harsh J
with
other systems.
As far as "inbuilt" is concerned, I guess the answer is no currently,
but contributions are welcome.
On Wed, Nov 7, 2012 at 9:29 PM, Ramasubramanian Narayanan
wrote:
> Hi,
>
> Can we use Pentaho Jobs in oozie?
>
> regards,
> Rams
--
Harsh J
s not supported by
> Oozie.
> How do we include third-party libraries in a Java action?
> One option I am thinking is include my hadoop command in a shell script
> and use shell script option instead of Java action.
>
> Regards,
> Bhasker
--
Harsh J
e any
>> objections. Please respond before EOD TUE if you have a concern with
>> using user@ instead of users@.
>>
>> Thanks.
>>
>> --
>> Alejandro
>>
--
Harsh J