Hadi Sinaee created AMBARI-14106:
------------------------------------
Summary: Ambari 1.7 - Datanode failing due to yum problem
Key: AMBARI-14106
URL: https://issues.apache.org/jira/browse/AMBARI-14106
Project: Ambari
Issue Type: Bug
Reporter: Hadi Sinaee
During the Install, Start and Test phase I get this error. Everything in the
previous steps goes well. The error log is as follows:
2015-11-28 19:08:12,983 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/PHD/2.0.6/services/HDFS/package/scripts/datanode.py", line 29, in install
    self.install_packages(env, params.exclude_packages)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 188, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
    self.install_package(package_name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
    shell.checked_call(cmd)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 36, in checked_call
    return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in _call
    raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_3_0_*' returned 1.
Error: Package: zookeeper_3_0_0_0_249-3.4.6.3.0.0.0-249.noarch (PHD-3.0)
Requires: update-alternatives
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: insserv
Error: Package: hadoop_3_0_0_0_249-hdfs-fuse-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: libfuse2
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: netcat-openbsd
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
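For reference, the failure comes down to the four dependencies listed above
(update-alternatives, insserv, libfuse2, netcat-openbsd) not being resolvable
from the configured repositories. A quick way to check this on the affected
DataNode host (a diagnostic sketch, not part of the Ambari output; plain yum
commands plus the repo files the agent writes, which appear in the stdout log
below) is:

# show which packages, if any, provide the dependencies yum reports as missing
yum provides update-alternatives insserv libfuse2 netcat-openbsd

# inspect the repositories the agent configured for the stack
yum repolist enabled
cat /etc/yum.repos.d/PHD.repo /etc/yum.repos.d/PHD-UTILS.repo /etc/yum.repos.d/PADS-1.3.0.0.repo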
stdout: /var/lib/ambari-agent/data/output-61.txt
2015-11-28 19:08:10,085 - Could not verify stack version by calling
'/usr/bin/distro-select versions > /tmp/tmp0W3_aO'. Return Code: 1, Output: .
2015-11-28 19:08:10,090 - Execute['mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10
http://ambari.localdomain:8080/resources//UnlimitedJCEPolicyJDK7.zip -o
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
{'environment': ..., 'not_if': 'test -e
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip',
'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-11-28 19:08:10,097 - Skipping Execute['mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10
http://ambari.localdomain:8080/resources//UnlimitedJCEPolicyJDK7.zip -o
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
due to not_if
2015-11-28 19:08:10,097 - Group['hadoop'] {'ignore_failures': False}
2015-11-28 19:08:10,098 - Modifying group hadoop
2015-11-28 19:08:10,144 - Group['users'] {'ignore_failures': False}
2015-11-28 19:08:10,144 - Modifying group users
2015-11-28 19:08:10,194 - User['ambari-qa'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'users']}
2015-11-28 19:08:10,194 - Modifying user ambari-qa
2015-11-28 19:08:10,203 - User['zookeeper'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-11-28 19:08:10,204 - Modifying user zookeeper
2015-11-28 19:08:10,212 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures':
False, 'groups': [u'hadoop']}
2015-11-28 19:08:10,212 - Modifying user hdfs
2015-11-28 19:08:10,227 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh']
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-11-28 19:08:10,228 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-11-28 19:08:10,237 - Skipping
Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
2>/dev/null'] due to not_if
2015-11-28 19:08:10,237 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root',
'group': 'root', 'recursive': True}
2015-11-28 19:08:10,238 - Link['/etc/hadoop/conf'] {'not_if': 'ls
/etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-11-28 19:08:10,245 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-11-28 19:08:10,260 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content':
InlineTemplate(...), 'owner': 'hdfs'}
2015-11-28 19:08:10,274 - Repository['PHD-3.0'] {'base_url':
'http://ambari.localdomain/PHD-3.0.0.0', 'action': ['create'], 'components':
[u'PHD', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name':
'PHD', 'mirror_list': None}
2015-11-28 19:08:10,280 - File['/etc/yum.repos.d/PHD.repo'] {'content':
Template('repo_suse_rhel.j2')}
2015-11-28 19:08:10,281 - Repository['PHD-UTILS-1.1.0.20'] {'base_url':
'http://ambari.localdomain/PHD-UTILS-1.1.0.20', 'action': ['create'],
'components': [u'PHD-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2',
'repo_file_name': 'PHD-UTILS', 'mirror_list': None}
2015-11-28 19:08:10,284 - File['/etc/yum.repos.d/PHD-UTILS.repo'] {'content':
Template('repo_suse_rhel.j2')}
2015-11-28 19:08:10,285 - Repository['PADS-1.3.0.0'] {'base_url':
'http://ambari.localdomain/PADS-1.3.0.0', 'action': ['create'], 'components':
[u'PADS-1.3.0.0', 'main'], 'repo_template': 'repo_suse_rhel.j2',
'repo_file_name': 'PADS-1.3.0.0', 'mirror_list': None}
2015-11-28 19:08:10,289 - File['/etc/yum.repos.d/PADS-1.3.0.0.repo']
{'content': Template('repo_suse_rhel.j2')}
2015-11-28 19:08:10,289 - Package['unzip'] {}
2015-11-28 19:08:10,491 - Skipping installing existent package unzip
2015-11-28 19:08:10,491 - Package['curl'] {}
2015-11-28 19:08:10,699 - Skipping installing existent package curl
2015-11-28 19:08:10,699 - Package['distro-select'] {}
2015-11-28 19:08:10,916 - Skipping installing existent package distro-select
2015-11-28 19:08:10,917 - Execute['mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry
10 http://ambari.localdomain:8080/resources//jdk-7u67-linux-x64.tar.gz -o
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz']
{'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java',
'path': ['/bin', '/usr/bin/']}
2015-11-28 19:08:10,925 - Skipping Execute['mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry
10 http://ambari.localdomain:8080/resources//jdk-7u67-linux-x64.tar.gz -o
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz']
due to not_if
2015-11-28 19:08:10,926 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar
-xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz
> /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java',
'path': ['/bin', '/usr/bin/']}
2015-11-28 19:08:10,933 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64
; tar -xf
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz >
/dev/null 2>&1'] due to not_if
2015-11-28 19:08:11,100 - Could not verify stack version by calling
'/usr/bin/distro-select versions > /tmp/tmpozoMLn'. Return Code: 1, Output: .
2015-11-28 19:08:11,107 - Package['hadoop_3_0_*'] {}
2015-11-28 19:08:11,321 - Installing package hadoop_3_0_* ('/usr/bin/yum -d 0
-e 0 -y install hadoop_3_0_*')
2015-11-28 19:08:12,983 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/PHD/2.0.6/services/HDFS/package/scripts/datanode.py", line 29, in install
    self.install_packages(env, params.exclude_packages)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 188, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
    self.install_package(package_name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
    shell.checked_call(cmd)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 36, in checked_call
    return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in _call
    raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_3_0_*' returned 1.
Error: Package: zookeeper_3_0_0_0_249-3.4.6.3.0.0.0-249.noarch (PHD-3.0)
Requires: update-alternatives
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: insserv
Error: Package: hadoop_3_0_0_0_249-hdfs-fuse-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: libfuse2
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: netcat-openbsd
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
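If it helps with reproducing: once those dependencies are installable (for
example after adding a repository that provides them, or installing them by
hand), the exact command the agent runs can be retried manually before
re-running the install step from Ambari (the package pattern below is copied
from the failure above; it is quoted here so the shell does not expand the
glob):

/usr/bin/yum -d 0 -e 0 -y install 'hadoop_3_0_*'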
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)