[ https://issues.apache.org/jira/browse/AMBARI-9654?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Onischuk resolved AMBARI-9654.
-------------------------------------
    Resolution: Fixed

Committed to trunk

> Hive and Oozie could not start after upgrade from 1.5.1 to 2.0.0
> ----------------------------------------------------------------
>
>                 Key: AMBARI-9654
>                 URL: https://issues.apache.org/jira/browse/AMBARI-9654
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 2.0.0
>
>
> Cluster: <http://172.18.145.176:8080/#/main/services/OOZIE/summary>  
> EXTERNAL HOSTNAMES ****************  
> 172.18.145.176  
> 172.18.145.139  
> 172.18.145.92  
> 172.18.145.137  
> 172.18.145.136  
> INTERNAL HOSTNAMES *******************  
> amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal  
> amb-upg151-rhel6postgres1423903811-5.cs1cloud.internal  
> amb-upg151-rhel6postgres1423903811-1.cs1cloud.internal  
> amb-upg151-rhel6postgres1423903811-4.cs1cloud.internal  
> amb-upg151-rhel6postgres1423903811-3.cs1cloud.internal
> **STR:**  
> 1) Deploy the old version with all services (HIVE_DB = MySQL (new MySQL),  
> OOZIE_DB = MySQL (new MySQL))  
> 2) Perform an Ambari-only upgrade to 2.0.0
> **Actual result:**  
> Hive (Hive Metastore) and Oozie (Oozie Server) could not start. Both failures  
> are the same HTTP 404 raised while downloading the MySQL JDBC driver from the  
> Ambari server's /resources endpoint; see the sketch after the Hive Metastore log below.
> Hive Metastore:
>     
>     
>     
>     stderr:   /var/lib/ambari-agent/data/errors-428.txt
>     
>     Traceback (most recent call last):
>       File 
> "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py",
>  line 133, in <module>
>         HiveMetastore().execute()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
>  line 208, in execute
>         method(env)
>       File 
> "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py",
>  line 53, in start
>         self.configure(env)  # FOR SECURITY
>       File 
> "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_metastore.py",
>  line 46, in configure
>         hive(name = 'metastore')
>       File 
> "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py",
>  line 76, in hive
>         jdbc_connector()
>       File 
> "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py",
>  line 201, in jdbc_connector
>         content = DownloadSource(params.driver_curl_source),
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 
> 148, in __init__
>         self.env.run()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 152, in run
>         self.run_action(resource, action)
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 118, in run_action
>         provider_action()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 110, in action_create
>         content = self._get_content()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 152, in _get_content
>         return content()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 
> 49, in __call__
>         return self.get_content()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 
> 177, in get_content
>         web_file = opener.open(req)
>       File "/usr/lib64/python2.6/urllib2.py", line 397, in open
>         response = meth(req, response)
>       File "/usr/lib64/python2.6/urllib2.py", line 510, in http_response
>         'http', request, response, code, msg, hdrs)
>       File "/usr/lib64/python2.6/urllib2.py", line 435, in error
>         return self._call_chain(*args)
>       File "/usr/lib64/python2.6/urllib2.py", line 369, in _call_chain
>         result = func(*args)
>       File "/usr/lib64/python2.6/urllib2.py", line 518, in http_error_default
>         raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
>     urllib2.HTTPError: HTTP Error 404: Not Found
>     stdout:   /var/lib/ambari-agent/data/output-428.txt
>     
>     2015-02-14 11:44:28,969 - u"Group['hadoop']" {'ignore_failures': False}
>     2015-02-14 11:44:28,970 - Modifying group hadoop
>     2015-02-14 11:44:29,111 - u"Group['nobody']" {'ignore_failures': False}
>     2015-02-14 11:44:29,112 - Modifying group nobody
>     2015-02-14 11:44:29,210 - u"Group['users']" {'ignore_failures': False}
>     2015-02-14 11:44:29,211 - Modifying group users
>     2015-02-14 11:44:29,327 - u"User['hive']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,328 - Modifying user hive
>     2015-02-14 11:44:29,352 - u"User['oozie']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2015-02-14 11:44:29,352 - Modifying user oozie
>     2015-02-14 11:44:29,370 - u"User['nobody']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'nobody']}
>     2015-02-14 11:44:29,371 - Modifying user nobody
>     2015-02-14 11:44:29,390 - u"User['ambari-qa']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2015-02-14 11:44:29,391 - Modifying user ambari-qa
>     2015-02-14 11:44:29,411 - u"User['hdfs']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,412 - Modifying user hdfs
>     2015-02-14 11:44:29,437 - u"User['storm']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,437 - Modifying user storm
>     2015-02-14 11:44:29,461 - u"User['mapred']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,462 - Modifying user mapred
>     2015-02-14 11:44:29,484 - u"User['hbase']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,484 - Modifying user hbase
>     2015-02-14 11:44:29,502 - u"User['tez']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2015-02-14 11:44:29,503 - Modifying user tez
>     2015-02-14 11:44:29,521 - u"User['zookeeper']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,521 - Modifying user zookeeper
>     2015-02-14 11:44:29,536 - u"User['falcon']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,536 - Modifying user falcon
>     2015-02-14 11:44:29,555 - u"User['sqoop']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,556 - Modifying user sqoop
>     2015-02-14 11:44:29,578 - u"User['yarn']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,578 - Modifying user yarn
>     2015-02-14 11:44:29,594 - u"User['hcat']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:44:29,595 - Modifying user hcat
>     2015-02-14 11:44:29,612 - 
> u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': 
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2015-02-14 11:44:29,630 - 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']"
>  {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>     2015-02-14 11:44:29,637 - Skipping 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']"
>  due to not_if
>     2015-02-14 11:44:29,637 - u"Directory['/grid/0/hadoop/hbase']" {'owner': 
> 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
>     2015-02-14 11:44:29,698 - 
> u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': 
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2015-02-14 11:44:29,716 - 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase 
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']" 
> {'not_if': 'test $(id -u hbase) -gt 1000'}
>     2015-02-14 11:44:29,724 - Skipping 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase 
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']" 
> due to not_if
>     2015-02-14 11:44:29,725 - u"Group['hdfs']" {'ignore_failures': False}
>     2015-02-14 11:44:29,726 - Modifying group hdfs
>     2015-02-14 11:44:29,844 - u"User['hdfs']" {'groups': [u'hadoop', 'hdfs', 
> 'hadoop', u'hdfs']}
>     2015-02-14 11:44:29,844 - Modifying user hdfs
>     2015-02-14 11:44:29,870 - u"Directory['/etc/hadoop']" {'mode': 0755}
>     2015-02-14 11:44:29,871 - u"Directory['/etc/hadoop/conf.empty']" 
> {'owner': 'hdfs', 'group': 'hadoop', 'recursive': True}
>     2015-02-14 11:44:29,872 - u"Link['/etc/hadoop/conf']" {'not_if': 'ls 
> /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
>     2015-02-14 11:44:29,879 - Skipping u"Link['/etc/hadoop/conf']" due to 
> not_if
>     2015-02-14 11:44:29,901 - u"File['/etc/hadoop/conf/hadoop-env.sh']" 
> {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
>     2015-02-14 11:44:29,933 - u"Execute['('setenforce', '0')']" {'sudo': 
> True, 'only_if': 'test -f /selinux/enforce'}
>     2015-02-14 11:44:29,959 - Skipping u"Execute['('setenforce', '0')']" due 
> to only_if
>     2015-02-14 11:44:29,959 - u"Directory['/grid/0/log/hadoop']" {'owner': 
> 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
>     2015-02-14 11:44:30,022 - u"Directory['/var/run/hadoop']" {'owner': 
> 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
>     2015-02-14 11:44:30,059 - u"Directory['/tmp/hadoop-hdfs']" {'owner': 
> 'hdfs', 'recursive': True, 'cd_access': 'a'}
>     2015-02-14 11:44:30,097 - 
> u"File['/etc/hadoop/conf/commons-logging.properties']" {'content': 
> Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
>     2015-02-14 11:44:30,119 - u"File['/etc/hadoop/conf/health_check']" 
> {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
>     2015-02-14 11:44:30,133 - u"File['/etc/hadoop/conf/log4j.properties']" 
> {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:44:30,160 - 
> u"File['/etc/hadoop/conf/hadoop-metrics2.properties']" {'content': 
> Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
>     2015-02-14 11:44:30,171 - 
> u"File['/etc/hadoop/conf/task-log4j.properties']" {'content': 
> StaticFile('task-log4j.properties'), 'mode': 0755}
>     2015-02-14 11:44:30,188 - u"File['/etc/hadoop/conf/configuration.xsl']" 
> {'owner': 'hdfs', 'group': 'hadoop'}
>     2015-02-14 11:44:30,386 - u"Directory['/etc/hive']" {'mode': 0755}
>     2015-02-14 11:44:30,387 - u"Directory['/etc/hive/conf']" {'owner': 
> 'hive', 'group': 'hadoop', 'recursive': True}
>     2015-02-14 11:44:30,387 - u"XmlConfig['mapred-site.xml']" {'group': 
> 'hadoop', 'conf_dir': '/etc/hive/conf', 'mode': 0644, 
> 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
>     2015-02-14 11:44:30,404 - Generating config: 
> /etc/hive/conf/mapred-site.xml
>     2015-02-14 11:44:30,405 - u"File['/etc/hive/conf/mapred-site.xml']" 
> {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 
> 0644, 'encoding': 'UTF-8'}
>     2015-02-14 11:44:30,480 - Writing 
> u"File['/etc/hive/conf/mapred-site.xml']" because contents don't match
>     2015-02-14 11:44:30,510 - 
> u"File['/etc/hive/conf/hive-default.xml.template']" {'owner': 'hive', 
> 'group': 'hadoop'}
>     2015-02-14 11:44:30,510 - u"File['/etc/hive/conf/hive-env.sh.template']" 
> {'owner': 'hive', 'group': 'hadoop'}
>     2015-02-14 11:44:30,511 - 
> u"File['/etc/hive/conf/hive-exec-log4j.properties']" {'content': '...', 
> 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:44:30,529 - u"File['/etc/hive/conf/hive-log4j.properties']" 
> {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:44:30,539 - u"Directory['/etc/hive/conf.server']" {'owner': 
> 'hive', 'group': 'hadoop', 'recursive': True}
>     2015-02-14 11:44:30,539 - u"XmlConfig['mapred-site.xml']" {'group': 
> 'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 
> 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
>     2015-02-14 11:44:30,553 - Generating config: 
> /etc/hive/conf.server/mapred-site.xml
>     2015-02-14 11:44:30,554 - 
> u"File['/etc/hive/conf.server/mapred-site.xml']" {'owner': 'hive', 'content': 
> InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
>     2015-02-14 11:44:30,610 - Writing 
> u"File['/etc/hive/conf.server/mapred-site.xml']" because contents don't match
>     2015-02-14 11:44:30,641 - 
> u"File['/etc/hive/conf.server/hive-default.xml.template']" {'owner': 'hive', 
> 'group': 'hadoop'}
>     2015-02-14 11:44:30,642 - 
> u"File['/etc/hive/conf.server/hive-env.sh.template']" {'owner': 'hive', 
> 'group': 'hadoop'}
>     2015-02-14 11:44:30,643 - 
> u"File['/etc/hive/conf.server/hive-exec-log4j.properties']" {'content': 
> '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:44:30,660 - 
> u"File['/etc/hive/conf.server/hive-log4j.properties']" {'content': '...', 
> 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:44:30,673 - u"XmlConfig['hive-site.xml']" {'group': 
> 'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 
> 'configuration_attributes': {}, 'owner': 'hive', 'configurations': ...}
>     2015-02-14 11:44:30,687 - Generating config: 
> /etc/hive/conf.server/hive-site.xml
>     2015-02-14 11:44:30,687 - u"File['/etc/hive/conf.server/hive-site.xml']" 
> {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 
> 0644, 'encoding': 'UTF-8'}
>     2015-02-14 11:44:30,783 - Writing 
> u"File['/etc/hive/conf.server/hive-site.xml']" because contents don't match
>     2015-02-14 11:44:30,825 - u"File['/etc/hive/conf.server/hive-env.sh']" 
> {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
>     2015-02-14 11:44:30,848 - u"Execute['('rm', '-f', 
> '/usr/lib/hive/lib//ojdbc6.jar')']" {'path': ['/bin', '/usr/bin/'], 'sudo': 
> True}
>     2015-02-14 11:44:30,869 - 
> u"File['/var/lib/ambari-agent/data/tmp/mysql-connector-java.jar']" 
> {'content': 
> DownloadSource('http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080/resources//mysql-jdbc-driver.jar')}
>     2015-02-14 11:44:30,869 - Downloading the file from 
> http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080/resources//mysql-jdbc-driver.jar
>     
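> The 404 above comes from the plain urllib2 request that resource_management's
> DownloadSource ends up issuing (see the traceback). The failing step can be
> reproduced outside the agent with a minimal sketch; the hostname and path are
> copied from the log line above, and this only mirrors the single HTTP request,
> not the agent code itself:
>     
>     # Python 2, matching the agent environment in the traceback above.
>     import urllib2
>     
>     DRIVER_URL = ("http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080"
>                   "/resources//mysql-jdbc-driver.jar")
>     
>     try:
>         response = urllib2.urlopen(DRIVER_URL)  # same request the agent makes
>         print "HTTP %s, %s bytes" % (response.getcode(), len(response.read()))
>     except urllib2.HTTPError as e:
>         # The jar is missing from the server's /resources directory after the
>         # Ambari-only upgrade, so this prints: HTTP Error 404: Not Found
>         print "Download failed: %s" % e
>     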
> Oozie Server:
>     
>     
>     
>     stderr:   /var/lib/ambari-agent/data/errors-419.txt
>     
>     Traceback (most recent call last):
>       File 
> "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_server.py",
>  line 170, in <module>
>         OozieServer().execute()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py",
>  line 208, in execute
>         method(env)
>       File 
> "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_server.py",
>  line 60, in start
>         self.configure(env)
>       File 
> "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie_server.py",
>  line 53, in configure
>         oozie(is_server=True)
>       File 
> "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py",
>  line 91, in oozie
>         oozie_server_specific()
>       File 
> "/var/lib/ambari-agent/cache/common-services/OOZIE/4.0.0.2.0/package/scripts/oozie.py",
>  line 153, in oozie_server_specific
>         content = DownloadSource(params.driver_curl_source),
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 
> 148, in __init__
>         self.env.run()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 152, in run
>         self.run_action(resource, action)
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", 
> line 118, in run_action
>         provider_action()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 110, in action_create
>         content = self._get_content()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py",
>  line 152, in _get_content
>         return content()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 
> 49, in __call__
>         return self.get_content()
>       File 
> "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 
> 177, in get_content
>         web_file = opener.open(req)
>       File "/usr/lib64/python2.6/urllib2.py", line 397, in open
>         response = meth(req, response)
>       File "/usr/lib64/python2.6/urllib2.py", line 510, in http_response
>         'http', request, response, code, msg, hdrs)
>       File "/usr/lib64/python2.6/urllib2.py", line 435, in error
>         return self._call_chain(*args)
>       File "/usr/lib64/python2.6/urllib2.py", line 369, in _call_chain
>         result = func(*args)
>       File "/usr/lib64/python2.6/urllib2.py", line 518, in http_error_default
>         raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
>     urllib2.HTTPError: HTTP Error 404: Not Found
>     stdout:   /var/lib/ambari-agent/data/output-419.txt
>     
>     2015-02-14 11:42:17,726 - u"Group['hadoop']" {'ignore_failures': False}
>     2015-02-14 11:42:17,727 - Modifying group hadoop
>     2015-02-14 11:42:17,850 - u"Group['nobody']" {'ignore_failures': False}
>     2015-02-14 11:42:17,851 - Modifying group nobody
>     2015-02-14 11:42:18,001 - u"Group['users']" {'ignore_failures': False}
>     2015-02-14 11:42:18,002 - Modifying group users
>     2015-02-14 11:42:18,127 - u"User['hive']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,128 - Modifying user hive
>     2015-02-14 11:42:18,149 - u"User['oozie']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2015-02-14 11:42:18,149 - Modifying user oozie
>     2015-02-14 11:42:18,170 - u"User['nobody']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'nobody']}
>     2015-02-14 11:42:18,170 - Modifying user nobody
>     2015-02-14 11:42:18,191 - u"User['ambari-qa']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2015-02-14 11:42:18,191 - Modifying user ambari-qa
>     2015-02-14 11:42:18,209 - u"User['hdfs']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,209 - Modifying user hdfs
>     2015-02-14 11:42:18,229 - u"User['storm']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,229 - Modifying user storm
>     2015-02-14 11:42:18,250 - u"User['mapred']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,250 - Modifying user mapred
>     2015-02-14 11:42:18,270 - u"User['hbase']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,270 - Modifying user hbase
>     2015-02-14 11:42:18,290 - u"User['tez']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'users']}
>     2015-02-14 11:42:18,291 - Modifying user tez
>     2015-02-14 11:42:18,311 - u"User['zookeeper']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,311 - Modifying user zookeeper
>     2015-02-14 11:42:18,330 - u"User['falcon']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,331 - Modifying user falcon
>     2015-02-14 11:42:18,351 - u"User['sqoop']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,351 - Modifying user sqoop
>     2015-02-14 11:42:18,364 - u"User['yarn']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,365 - Modifying user yarn
>     2015-02-14 11:42:18,385 - u"User['hcat']" {'gid': 'hadoop', 
> 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-02-14 11:42:18,385 - Modifying user hcat
>     2015-02-14 11:42:18,403 - 
> u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': 
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2015-02-14 11:42:18,421 - 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']"
>  {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>     2015-02-14 11:42:18,427 - Skipping 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa 
> /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']"
>  due to not_if
>     2015-02-14 11:42:18,428 - u"Directory['/grid/0/hadoop/hbase']" {'owner': 
> 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
>     2015-02-14 11:42:18,490 - 
> u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': 
> StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2015-02-14 11:42:18,506 - 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase 
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']" 
> {'not_if': 'test $(id -u hbase) -gt 1000'}
>     2015-02-14 11:42:18,513 - Skipping 
> u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase 
> /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']" 
> due to not_if
>     2015-02-14 11:42:18,514 - u"Group['hdfs']" {'ignore_failures': False}
>     2015-02-14 11:42:18,515 - Modifying group hdfs
>     2015-02-14 11:42:18,618 - u"User['hdfs']" {'groups': [u'hadoop', 'hdfs', 
> 'hadoop', u'hdfs']}
>     2015-02-14 11:42:18,618 - Modifying user hdfs
>     2015-02-14 11:42:18,635 - u"Directory['/etc/hadoop']" {'mode': 0755}
>     2015-02-14 11:42:18,635 - u"Directory['/etc/hadoop/conf.empty']" 
> {'owner': 'hdfs', 'group': 'hadoop', 'recursive': True}
>     2015-02-14 11:42:18,636 - u"Link['/etc/hadoop/conf']" {'not_if': 'ls 
> /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
>     2015-02-14 11:42:18,643 - Skipping u"Link['/etc/hadoop/conf']" due to 
> not_if
>     2015-02-14 11:42:18,668 - u"File['/etc/hadoop/conf/hadoop-env.sh']" 
> {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
>     2015-02-14 11:42:18,695 - u"Execute['('setenforce', '0')']" {'sudo': 
> True, 'only_if': 'test -f /selinux/enforce'}
>     2015-02-14 11:42:18,719 - Skipping u"Execute['('setenforce', '0')']" due 
> to only_if
>     2015-02-14 11:42:18,720 - u"Directory['/grid/0/log/hadoop']" {'owner': 
> 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
>     2015-02-14 11:42:18,781 - u"Directory['/var/run/hadoop']" {'owner': 
> 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
>     2015-02-14 11:42:18,830 - u"Directory['/tmp/hadoop-hdfs']" {'owner': 
> 'hdfs', 'recursive': True, 'cd_access': 'a'}
>     2015-02-14 11:42:18,861 - 
> u"File['/etc/hadoop/conf/commons-logging.properties']" {'content': 
> Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
>     2015-02-14 11:42:18,873 - u"File['/etc/hadoop/conf/health_check']" 
> {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
>     2015-02-14 11:42:18,889 - u"File['/etc/hadoop/conf/log4j.properties']" 
> {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:42:18,920 - 
> u"File['/etc/hadoop/conf/hadoop-metrics2.properties']" {'content': 
> Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
>     2015-02-14 11:42:18,936 - 
> u"File['/etc/hadoop/conf/task-log4j.properties']" {'content': 
> StaticFile('task-log4j.properties'), 'mode': 0755}
>     2015-02-14 11:42:18,951 - u"File['/etc/hadoop/conf/configuration.xsl']" 
> {'owner': 'hdfs', 'group': 'hadoop'}
>     2015-02-14 11:42:19,125 - u"HdfsDirectory['/user/oozie']" 
> {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': 
> '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'mode': 
> 0775, 'owner': 'oozie', 'bin_dir': '/usr/bin', 'action': ['create']}
>     2015-02-14 11:42:19,127 - u"Execute['hadoop --config /etc/hadoop/conf fs 
> -mkdir -p /user/oozie && hadoop --config /etc/hadoop/conf fs -chmod  775 
> /user/oozie && hadoop --config /etc/hadoop/conf fs -chown  oozie 
> /user/oozie']" {'not_if': "/usr/bin/sudo su hdfs -l -s /bin/bash -c 'hadoop 
> --config /etc/hadoop/conf fs -ls /user/oozie'", 'user': 'hdfs', 'path': 
> ['/usr/bin']}
>     2015-02-14 11:42:21,829 - Skipping u"Execute['hadoop --config 
> /etc/hadoop/conf fs -mkdir -p /user/oozie && hadoop --config /etc/hadoop/conf 
> fs -chmod  775 /user/oozie && hadoop --config /etc/hadoop/conf fs -chown  
> oozie /user/oozie']" due to not_if
>     2015-02-14 11:42:21,830 - u"Directory['/etc/oozie/conf']" {'owner': 
> 'oozie', 'group': 'hadoop', 'recursive': True}
>     2015-02-14 11:42:21,832 - u"XmlConfig['oozie-site.xml']" {'group': 
> 'hadoop', 'conf_dir': '/etc/oozie/conf', 'mode': 0664, 
> 'configuration_attributes': {}, 'owner': 'oozie', 'configurations': ...}
>     2015-02-14 11:42:21,851 - Generating config: 
> /etc/oozie/conf/oozie-site.xml
>     2015-02-14 11:42:21,851 - u"File['/etc/oozie/conf/oozie-site.xml']" 
> {'owner': 'oozie', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 
> 0664, 'encoding': 'UTF-8'}
>     2015-02-14 11:42:21,909 - Writing 
> u"File['/etc/oozie/conf/oozie-site.xml']" because contents don't match
>     2015-02-14 11:42:21,937 - Changing permission for 
> /etc/oozie/conf/oozie-site.xml from 644 to 664
>     2015-02-14 11:42:21,962 - u"File['/etc/oozie/conf/oozie-env.sh']" 
> {'content': InlineTemplate(...), 'owner': 'oozie'}
>     2015-02-14 11:42:21,980 - 
> u"File['/etc/oozie/conf/oozie-log4j.properties']" {'content': '...', 'owner': 
> 'oozie', 'group': 'hadoop', 'mode': 0644}
>     2015-02-14 11:42:22,000 - u"File['/etc/oozie/conf/adminusers.txt']" 
> {'owner': 'oozie', 'group': 'hadoop'}
>     2015-02-14 11:42:22,001 - 
> u"File['/usr/lib/ambari-agent/DBConnectionVerification.jar']" {'content': 
> DownloadSource('http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080/resources/DBConnectionVerification.jar')}
>     2015-02-14 11:42:22,002 - Not downloading the file from 
> http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080/resources/DBConnectionVerification.jar,
>  because /var/lib/ambari-agent/data/tmp/DBConnectionVerification.jar already 
> exists
>     2015-02-14 11:42:22,014 - u"File['/etc/oozie/conf/hadoop-config.xml']" 
> {'owner': 'oozie', 'group': 'hadoop'}
>     2015-02-14 11:42:22,015 - u"File['/etc/oozie/conf/oozie-default.xml']" 
> {'owner': 'oozie', 'group': 'hadoop'}
>     2015-02-14 11:42:22,015 - u"Directory['/etc/oozie/conf/action-conf']" 
> {'owner': 'oozie', 'group': 'hadoop'}
>     2015-02-14 11:42:22,016 - u"File['/etc/oozie/conf/action-conf/hive.xml']" 
> {'owner': 'oozie', 'group': 'hadoop'}
>     2015-02-14 11:42:22,017 - u"File['/var/run/oozie/oozie.pid']" {'action': 
> ['delete'], 'not_if': 'ls {pid_file} >/dev/null 2>&1 && !(ps `cat {pid_file}` 
> >/dev/null 2>&1)'}
>     2015-02-14 11:42:22,022 - u"Directory['/usr/lib/oozie//var/tmp/oozie']" 
> {'owner': 'oozie', 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 
> 'mode': 0755}
>     2015-02-14 11:42:22,160 - u"Directory['/var/run/oozie']" {'owner': 
> 'oozie', 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
>     2015-02-14 11:42:22,200 - u"Directory['/grid/0/log/oozie']" {'owner': 
> 'oozie', 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
>     2015-02-14 11:42:22,270 - u"Directory['/var/tmp/oozie']" {'owner': 
> 'oozie', 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
>     2015-02-14 11:42:22,355 - u"Directory['/grid/0/hadoop/oozie/data']" 
> {'owner': 'oozie', 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 
> 'mode': 0755}
>     2015-02-14 11:42:22,469 - u"Directory['/var/lib/oozie/']" {'owner': 
> 'oozie', 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
>     2015-02-14 11:42:22,511 - 
> u"Directory['/var/lib/oozie/oozie-server/webapps/']" {'owner': 'oozie', 
> 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
>     2015-02-14 11:42:22,577 - 
> u"Directory['/var/lib/oozie/oozie-server/conf']" {'owner': 'oozie', 
> 'cd_access': 'a', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
>     2015-02-14 11:42:22,638 - u"Directory['/var/lib/oozie/oozie-server']" 
> {'owner': 'oozie', 'recursive': True, 'group': 'hadoop', 'mode': 0755, 
> 'cd_access': 'a'}
>     2015-02-14 11:42:22,680 - u"Directory['/usr/lib/oozie/libext']" 
> {'recursive': True}
>     2015-02-14 11:42:22,682 - u"Execute['('tar', '-xvf', 
> '/usr/lib/oozie/oozie-sharelib.tar.gz', '-C', '/usr/lib/oozie')']" {'not_if': 
> 'ls /var/run/oozie/oozie.pid >/dev/null 2>&1 && ps -p `cat 
> /var/run/oozie/oozie.pid` >/dev/null 2>&1', 'sudo': True}
>     2015-02-14 11:42:25,221 - u"Execute['('cp', 
> '/usr/share/HDP-oozie/ext-2.2.zip', '/usr/lib/oozie/libext')']" {'not_if': 
> 'ls /var/run/oozie/oozie.pid >/dev/null 2>&1 && ps -p `cat 
> /var/run/oozie/oozie.pid` >/dev/null 2>&1', 'sudo': True}
>     2015-02-14 11:42:25,244 - u"Execute['('chown', u'oozie:hadoop', 
> '/usr/lib/oozie/libext/ext-2.2.zip')']" {'not_if': 'ls 
> /var/run/oozie/oozie.pid >/dev/null 2>&1 && ps -p `cat 
> /var/run/oozie/oozie.pid` >/dev/null 2>&1', 'sudo': True}
>     2015-02-14 11:42:25,256 - u"Execute['('chown', '-RL', u'oozie:hadoop', 
> '/var/lib/oozie/oozie-server/conf')']" {'not_if': 'ls 
> /var/run/oozie/oozie.pid >/dev/null 2>&1 && ps -p `cat 
> /var/run/oozie/oozie.pid` >/dev/null 2>&1', 'sudo': True}
>     2015-02-14 11:42:25,268 - 
> u"File['/var/lib/ambari-agent/data/tmp/mysql-connector-java.jar']" 
> {'content': 
> DownloadSource('http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080/resources//mysql-jdbc-driver.jar')}
>     2015-02-14 11:42:25,268 - Downloading the file from 
> http://amb-upg151-rhel6postgres1423903811-10.cs1cloud.internal:8080/resources//mysql-jdbc-driver.jar
>     
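> The Oozie Server start fails at exactly the same point: the same
> /resources//mysql-jdbc-driver.jar URL returns 404. The /resources endpoint is
> served from the Ambari server's resources directory, so a quick check on the
> server host looks like the sketch below (this assumes the default location
> /var/lib/ambari-server/resources; adjust if the server is configured
> differently):
>     
>     # Sketch only: checks whether the MySQL JDBC driver jar that the agents
>     # try to download is actually present on the Ambari server host.
>     import os
>     
>     RESOURCES_DIR = "/var/lib/ambari-server/resources"  # assumed default
>     JAR_NAME = "mysql-jdbc-driver.jar"
>     
>     path = os.path.join(RESOURCES_DIR, JAR_NAME)
>     if os.path.isfile(path):
>         print "%s is present (%d bytes)" % (path, os.path.getsize(path))
>     else:
>         # Matches the HTTP 404 both start commands hit above.
>         print "%s is missing, so the /resources download returns 404" % path
>     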
> **Expected result:**   
> All services are started.


