[ https://issues.apache.org/jira/browse/AMBARI-13881?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15004117#comment-15004117 ]

Hudson commented on AMBARI-13881:
---------------------------------

FAILURE: Integrated in Ambari-trunk-Commit #3829 (See [https://builds.apache.org/job/Ambari-trunk-Commit/3829/])
AMBARI-13881. Devdeploy: YARN,Mahout service checks fail on all OSes (aonishuk: [http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=a05e5b6b55f400cbda163ceec385f07a4d670328])
* ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/params_linux.py
* ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/yarn.py
* ambari-server/src/test/python/stacks/2.0.6/YARN/test_historyserver.py
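
For context on what the changed yarn.py most likely addresses: the traceback quoted below ultimately fails because the ATS v1.5 active directory (/tmp/entity-file-history/active) does not exist in HDFS. The snippet that follows is only an illustrative sketch of how an Ambari service script normally ensures such an HDFS directory via the HdfsResource resource; it is not the actual diff, and the parameter name entity_file_history_directory is hypothetical -- see the linked commit for the real change.

    # Sketch only, not the AMBARI-13881 diff. Meant to run inside an Ambari
    # agent command context, where params.HdfsResource is a pre-bound partial.
    def setup_ats_active_dir(params):
        # "entity_file_history_directory" is a hypothetical parameter name,
        # e.g. /tmp/entity-file-history/active; owner/group/mode are assumptions.
        params.HdfsResource(params.entity_file_history_directory,
                            type="directory",
                            action="create_on_execute",
                            owner=params.yarn_user,
                            group=params.user_group,
                            mode=0777)
        # Flush the queued HDFS operations in a single pass.
        params.HdfsResource(None, action="execute")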


> Devdeploy: YARN,Mahout service checks fail on all OSes
> ------------------------------------------------------
>
>                 Key: AMBARI-13881
>                 URL: https://issues.apache.org/jira/browse/AMBARI-13881
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 2.1.3
>
>
>     Traceback (most recent call last):
>       File "/var/lib/ambari-agent/cache/common-services/MAHOUT/1.0.0.2.3/package/scripts/service_check.py", line 79, in <module>
>         MahoutServiceCheck().execute()
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/common-services/MAHOUT/1.0.0.2.3/package/scripts/service_check.py", line 66, in service_check
>         user = params.smokeuser
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 156, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 119, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
>         tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>         result = function(command, **kwargs)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>         tries=tries, try_sleep=try_sleep)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>         result = _call(command, **kwargs_copy)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
>         raise Fail(err_msg)
>     resource_management.core.exceptions.Fail: Execution of 'mahout seqdirectory --input /user/ambari-qa/mahoutsmokeinput/sample-mahout-test.txt --output /user/ambari-qa/mahoutsmokeoutput/ --charset utf-8' returned 1. MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
>     Running on hadoop, using /usr/hdp/current/hadoop-client/bin/hadoop and HADOOP_CONF_DIR=/usr/hdp/current/hadoop-client/conf
>     MAHOUT-JOB: /usr/hdp/2.3.4.0-3235/mahout/mahout-examples-0.9.0.2.3.4.0-3235-job.jar
>     WARNING: Use "yarn jar" to launch YARN applications.
>     15/11/13 12:17:46 WARN driver.MahoutDriver: No seqdirectory.props found on classpath, will use command-line arguments only
>     15/11/13 12:17:47 INFO common.AbstractJob: Command line arguments: {--charset=[utf-8], --chunkSize=[64], --endPhase=[2147483647], --fileFilterClass=[org.apache.mahout.text.PrefixAdditionFilter], --input=[/user/ambari-qa/mahoutsmokeinput/sample-mahout-test.txt], --keyPrefix=[], --method=[mapreduce], --output=[/user/ambari-qa/mahoutsmokeoutput/], --startPhase=[0], --tempDir=[temp]}
>     15/11/13 12:17:49 INFO impl.TimelineClientImpl: Timeline service address: http://os-u14-fxhqus-devdeploy-1.novalocal:8188/ws/v1/timeline/
>     15/11/13 12:17:49 INFO client.RMProxy: Connecting to ResourceManager at os-u14-fxhqus-devdeploy-2.novalocal/172.22.82.23:8050
>     15/11/13 12:17:49 INFO service.AbstractService: Service org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl failed in state STARTED; cause: java.io.IOException: /tmp/entity-file-history/active does not exist
>     java.io.IOException: /tmp/entity-file-history/active does not exist
>       at org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.serviceStart(TimelineClientImpl.java:366)
>       at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>       at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:194)
>       at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>       at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceStart(ResourceMgrDelegate.java:109)
>       at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>       at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:98)
>       at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:112)
>       at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
>       at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
>       at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
>       at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
>       at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260)
>       at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>       at org.apache.hadoop.mapreduce.Job.connect(Job.java:1255)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284)
>       at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
>       at org.apache.mahout.text.SequenceFilesFromDirectory.runMapReduce(SequenceFilesFromDirectory.java:183)
>       at org.apache.mahout.text.SequenceFilesFromDirectory.run(SequenceFilesFromDirectory.java:91)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>       at org.apache.mahout.text.SequenceFilesFromDirectory.main(SequenceFilesFromDirectory.java:65)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
>       at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
>       at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:152)
>       at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>     
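
For anyone who hits this before picking up the fix: the java.io.IOException above is raised by the YARN timeline client when the ATS v1.5 active directory (by default /tmp/entity-file-history/active, presumably the value of yarn.timeline-service.entity-group-fs-store.active-dir here) has not been created in HDFS. Below is a minimal manual check/workaround sketch, assuming the hadoop CLI is on PATH and the caller is allowed to create the directory; the owner and mode used are assumptions, not values taken from the fix.

    #!/usr/bin/env python
    # Manual check/workaround sketch, not part of the AMBARI-13881 fix.
    # Assumes the 'hadoop' CLI is on PATH and sufficient HDFS privileges.
    import subprocess

    ACTIVE_DIR = "/tmp/entity-file-history/active"  # path from the error above

    def hdfs_dir_exists(path):
        # 'hadoop fs -test -d' exits 0 when the HDFS directory exists.
        return subprocess.call(["hadoop", "fs", "-test", "-d", path]) == 0

    if not hdfs_dir_exists(ACTIVE_DIR):
        subprocess.check_call(["hadoop", "fs", "-mkdir", "-p", ACTIVE_DIR])
        # Owner and permissions below are illustrative assumptions only.
        subprocess.check_call(["hadoop", "fs", "-chown", "yarn:hadoop", ACTIVE_DIR])
        subprocess.check_call(["hadoop", "fs", "-chmod", "1777", ACTIVE_DIR])
        print("created %s" % ACTIVE_DIR)
    else:
        print("%s already exists" % ACTIVE_DIR)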



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
