Hi Bernd,

We got the warning "Discovering versions from the identity service failed when 
creating the password plugin. Attempting to determine version from URL."

When we set up OpenStack manually, without DevStack, we got the same warning 
and fixed it by setting the environment variables.

I am not sure whether the errors in the log below are caused by the above 
warning. When we set up Ocata with DevStack, stack.sh exports the environment 
variables, but after running stack.sh we cannot echo them, so we suspect that 
the environment variables are not set properly.
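
For reference, this is roughly what we run to check (assuming the usual 
DevStack directory layout; the path and the account/project names may differ):

source /opt/stack/devstack/openrc admin admin
env | grep '^OS_'
echo $OS_AUTH_URL

We wonder whether the variables are only exported inside the shell that runs 
stack.sh, and therefore have to be re-sourced (for example via openrc) in any 
new login shell before they become visible.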

Could you please help us resolve these issues?

The detailed log is below:

2017-12-01 02:50:04.646 DEBUG nova.scheduler.client.report 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] 
/opt/stack/nova/nova/scheduler/client/report.py --> update_resource_stat from 
(pid=4833) update_resource_stats 
/opt/stack/nova/nova/scheduler/client/report.py:633
2017-12-01 02:50:04.647 DEBUG oslo_messaging._drivers.amqpdriver 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] CALL msg_id: 
272c4a91012245b08b5f95a6a557ef63 exchange 'nova' topic 'conductor' from 
(pid=4833) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562
2017-12-01 02:50:04.669 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 272c4a91012245b08b5f95a6a557ef63 from (pid=4833) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419
2017-12-01 02:50:04.670 DEBUG nova.scheduler.client.report 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] _ensure_resource_provider 
--> uuid ::  6deda2bf-68ff-470f-b23f-392ca1e2b9b5 from (pid=4833) 
_ensure_resource_provider /opt/stack/nova/nova/scheduler/client/report.py:393
2017-12-01 02:50:04.670 ERROR keystoneauth1.session 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] url ---> 
/resource_providers/6deda2bf-68ff-470f-b23f-392ca1e2b9b5
2017-12-01 02:50:04.670 DEBUG keystoneauth1.session 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] url ---> 
/resource_providers/6deda2bf-68ff-470f-b23f-392ca1e2b9b5 from (pid=4833) get 
/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py:759
2017-12-01 02:50:04.670 ERROR keystoneauth1.session 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] url ---> 
http://10.105.166.213:35357/v3
2017-12-01 02:50:04.671 DEBUG keystoneauth1.session 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] url ---> 
http://10.105.166.213:35357/v3 from (pid=4833) get 
/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py:759
2017-12-01 02:51:19.669 WARNING keystoneauth.identity.generic.base 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] Discovering versions from 
the identity service failed when creating the password plugin. Attempting to 
determine version from URL.
2017-12-01 02:51:19.670 DEBUG keystoneauth1.identity.v3.base 
[req-2d4bcbcb-f0b3-44dc-9659-dd781f97c7cc None None] 
http://10.105.166.213:35357/v3/auth/tokens from (pid=4833) get_auth_ref 
/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/v3/base.py:164



From: Ramu, MohanX
Sent: Thursday, November 30, 2017 9:03 PM
To: Bernd Bausch <berndbau...@gmail.com>; openstack@lists.openstack.org
Subject: RE: [Openstack] Devstack -Ocata - Failed to launch instance on Host




Hi Bernd,

When we added logging to find the URL, we got the logs below; the issue is 
still not resolved.



2017-11-30 07:14:56.942 ERROR keystoneauth1.session 
[req-3d55cc14-8ff1-4c26-b58e-64cd7e128bb2 None None]

url ---> /resource_providers/62041017-580c-449c-937d-edd634b40d23

2017-11-30 07:14:56.942 DEBUG keystoneauth1.session 
[req-3d55cc14-8ff1-4c26-b58e-64cd7e128bb2 None None]

url ---> /resource_providers/62041017-580c-449c-937d-edd634b40d23



---> http://10.105.166.213:35357/v3/auth/tokens from (pid=16436) get_auth_ref 
/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/v3/base.py:164





We found the error below in the n-cpu logs.





' from (pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:00.158 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 2d9d2c01cec0498b9d580b7abab4621e from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:05.825 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: e29f30b9952c4318874f768d03b4d8d8 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:05.846 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: e29f30b9952c4318874f768d03b4d8d8 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:07.511 DEBUG oslo_concurrency.lockutils 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Lock "compute_resources" 
released by "nova.compute.resource_tracker._update_available_resource" :: held 
75.056s from (pid=24675) inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:282

2017-11-29 22:48:07.511 ERROR nova.compute.manager 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Error updating resources 
for node ubuntu43.

2017-11-29 22:48:07.511 TRACE nova.compute.manager Traceback (most recent call 
last):

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/compute/manager.py", line 6582, in 
update_available_resource_for_node

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
rt.update_available_resource(context, nodename)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/compute/resource_tracker.py", line 558, in 
update_available_resource

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
self._update_available_resource(context, resources)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py", line 
271, in inner

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return f(*args, **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/compute/resource_tracker.py", line 582, in 
_update_available_resource

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
self._init_compute_node(context, resources)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/compute/resource_tracker.py", line 445, in 
_init_compute_node

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
self.scheduler_client.update_resource_stats(cn)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/__init__.py", line 60, in 
update_resource_stats

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
self.reportclient.update_resource_stats(compute_node)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/__init__.py", line 37, in __run_method

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return 
getattr(self.instance, __name)(*args, **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/report.py", line 631, in 
update_resource_stats

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
compute_node.hypervisor_hostname)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/report.py", line 407, in 
_ensure_resource_provider

2017-11-29 22:48:07.511 TRACE nova.compute.manager     rp = 
self._get_resource_provider(uuid)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/report.py", line 55, in wrapper

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return f(self, *a, **k)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/report.py", line 309, in 
_get_resource_provider

2017-11-29 22:48:07.511 TRACE nova.compute.manager     resp = 
self.get("/resource_providers/%s" % uuid)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/opt/stack/nova/nova/scheduler/client/report.py", line 209, in get

2017-11-29 22:48:07.511 TRACE nova.compute.manager     
endpoint_filter=self.ks_filter, raise_exc=False, **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py", line 758, in 
get

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return self.request(url, 
'GET', **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/positional/__init__.py", line 101, in 
inner

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return wrapped(*args, 
**kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py", line 491, in 
request

2017-11-29 22:48:07.511 TRACE nova.compute.manager     auth_headers = 
self.get_auth_headers(auth)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py", line 818, in 
get_auth_headers

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return 
auth.get_headers(self, **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/plugin.py", line 90, in 
get_headers

2017-11-29 22:48:07.511 TRACE nova.compute.manager     token = 
self.get_token(session)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/base.py", line 
90, in get_token

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return 
self.get_access(session).auth_token

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/base.py", line 
136, in get_access

2017-11-29 22:48:07.511 TRACE nova.compute.manager     self.auth_ref = 
self.get_auth_ref(session)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/generic/base.py",
 line 198, in get_auth_ref

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return 
self._plugin.get_auth_ref(session, **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/identity/v3/base.py", 
line 167, in get_auth_ref

2017-11-29 22:48:07.511 TRACE nova.compute.manager     authenticated=False, 
log=False, **rkwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py", line 766, in 
post

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return self.request(url, 
'POST', **kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/positional/__init__.py", line 101, in 
inner

2017-11-29 22:48:07.511 TRACE nova.compute.manager     return wrapped(*args, 
**kwargs)

2017-11-29 22:48:07.511 TRACE nova.compute.manager   File 
"/usr/local/lib/python2.7/dist-packages/keystoneauth1/session.py", line 655, in 
request

2017-11-29 22:48:07.511 TRACE nova.compute.manager     raise 
exceptions.from_response(resp, method, url)

2017-11-29 22:48:07.511 TRACE nova.compute.manager ServiceUnavailable: Service 
Unavailable (HTTP 503)

2017-11-29 22:48:07.511 TRACE nova.compute.manager

2017-11-29 22:48:07.512 DEBUG oslo_service.periodic_task 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes from (pid=24675) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:07.512 DEBUG oslo_service.periodic_task 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Running periodic task 
ComputeManager._sync_scheduler_instance_info from (pid=24675) 
run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:07.513 DEBUG oslo_messaging._drivers.amqpdriver 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] CALL msg_id: 
7d8545666cfc4dd99de33d4268820f81 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:07.534 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 7d8545666cfc4dd99de33d4268820f81 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:07.535 DEBUG oslo_messaging._drivers.amqpdriver 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] CAST unique_id: 
0c0b0fe866a241498a7953ffdb4ab0b2 FANOUT topic 'scheduler' from (pid=24675) 
_send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:551

2017-11-29 22:48:07.536 DEBUG oslo_service.periodic_task 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Running periodic task 
ComputeManager._poll_rescued_instances from (pid=24675) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:07.536 DEBUG oslo_service.periodic_task 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Running periodic task 
ComputeManager._check_instance_build_time from (pid=24675) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:07.537 DEBUG oslo_service.periodic_task 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Running periodic task 
ComputeManager._run_image_cache_manager_pass from (pid=24675) 
run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:07.537 DEBUG oslo_concurrency.lockutils 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Lock 
"storage-registry-lock" acquired by 
"nova.virt.storage_users.do_register_storage_use" :: waited 0.000s from 
(pid=24675) inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:270

2017-11-29 22:48:07.537 DEBUG oslo_concurrency.lockutils 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Lock 
"storage-registry-lock" released by 
"nova.virt.storage_users.do_register_storage_use" :: held 0.000s from 
(pid=24675) inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:282

2017-11-29 22:48:07.538 DEBUG oslo_concurrency.lockutils 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Lock 
"storage-registry-lock" acquired by 
"nova.virt.storage_users.do_get_storage_users" :: waited 0.000s from 
(pid=24675) inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:270

2017-11-29 22:48:07.538 DEBUG oslo_concurrency.lockutils 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Lock 
"storage-registry-lock" released by 
"nova.virt.storage_users.do_get_storage_users" :: held 0.000s from (pid=24675) 
inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:282

2017-11-29 22:48:07.538 DEBUG oslo_messaging._drivers.amqpdriver 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] CALL msg_id: 
d3eaf9f5de2a44deb2dc91a7747df8e9 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:07.554 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: d3eaf9f5de2a44deb2dc91a7747df8e9 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:07.554 DEBUG nova.virt.libvirt.imagecache 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Skipping verification, no 
base directory at /opt/stack/data/nova/instances/_base from (pid=24675) 
_get_base /opt/stack/nova/nova/virt/libvirt/imagecache.py:448

2017-11-29 22:48:10.141 DEBUG oslo_messaging._drivers.amqpdriver [-] 
CALL msg_id: 1940cfbd516441feb9461c9c7fa09c28 exchange 'nova' topic 'conductor' 
from (pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:10.165 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 1940cfbd516441feb9461c9c7fa09c28 from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:07.555 DEBUG oslo_service.periodic_task 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Running periodic task 
ComputeManager._heal_instance_info_cache from (pid=24675) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:07.555 DEBUG nova.compute.manager 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Starting heal instance 
info cache from (pid=24675) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:5876

2017-11-29 22:48:07.555 DEBUG nova.compute.manager 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Rebuilding the list of 
instances to heal from (pid=24675) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:5880

2017-11-29 22:48:07.555 DEBUG oslo_messaging._drivers.amqpdriver 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] CALL msg_id: 
6887192edec0421888c8e73ddb0bd9be exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:07.575 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 6887192edec0421888c8e73ddb0bd9be from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:07.575 DEBUG nova.compute.manager 
[req-6f2e55e0-6b80-41ad-9c23-e333d75c1ec6 None None] Didn't find any instances 
for network info cache update. from (pid=24675) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:5952

2017-11-29 22:48:15.827 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: e2a0e95b62ca41a4a81cb82dd0621955 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:15.843 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: e2a0e95b62ca41a4a81cb82dd0621955 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:20.138 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: a33f9a32f7424e8c99a0b04ae8a3fb5a exchange 'nova' topic 'conductor' from 
(pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:20.165 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: a33f9a32f7424e8c99a0b04ae8a3fb5a from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:25.827 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: a50027a40aca4513bdbb5af5b96745f8 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:25.849 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: a50027a40aca4513bdbb5af5b96745f8 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:30.138 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: 422c7ca5c85e4b5eb5a1931113424ae5 exchange 'nova' topic 'conductor' from 
(pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:30.159 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 422c7ca5c85e4b5eb5a1931113424ae5 from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:35.830 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: 636d7ce67353413d98e9ade8874bfc05 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:35.850 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 636d7ce67353413d98e9ade8874bfc05 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:40.141 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: 3ed1de2deede4b3fa178682def287710 exchange 'nova' topic 'conductor' from 
(pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:40.167 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 3ed1de2deede4b3fa178682def287710 from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:44.404 DEBUG oslo_service.periodic_task 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Running periodic task 
ComputeManager._instance_usage_audit from (pid=28861) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:44.405 DEBUG oslo_service.periodic_task 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Running periodic task 
ComputeManager._poll_rebooting_instances from (pid=28861) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:44.405 DEBUG oslo_service.periodic_task 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Running periodic task 
ComputeManager._poll_volume_usage from (pid=28861) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:44.406 DEBUG oslo_service.periodic_task 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Running periodic task 
ComputeManager._reclaim_queued_deletes from (pid=28861) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:44.406 DEBUG nova.compute.manager 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] 
CONF.reclaim_instance_interval <= 0, skipping... from (pid=28861) 
_reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:6548

2017-11-29 22:48:44.407 DEBUG oslo_service.periodic_task 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Running periodic task 
ComputeManager.update_available_resource from (pid=28861) run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

2017-11-29 22:48:44.407 DEBUG oslo_messaging._drivers.amqpdriver 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] CALL msg_id: 
5445875ac88b43b29f492dd2ce20dc48 exchange 'nova' topic 'conductor' from 
(pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:44.431 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 5445875ac88b43b29f492dd2ce20dc48 from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:44.433 DEBUG nova.compute.resource_tracker 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Auditing locally available 
compute resources for ubuntu43 (node: ubuntu43) from (pid=28861) 
update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:541

2017-11-29 22:48:44.463 DEBUG nova.compute.resource_tracker 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Hypervisor: free VCPUs: 8 
from (pid=28861) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:662

2017-11-29 22:48:44.464 DEBUG nova.compute.resource_tracker 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Hypervisor/Node resource 
view: name=ubuntu43 free_ram=30714MB free_disk=184GB free_vcpus=8 
pci_devices=[{"dev_id": "pci_0000_00_00_0", "product_id": "1618", "dev_type": 
"type-PCI", "numa_node": null, "vendor_id": "8086", "label": "label_8086_1618", 
"address": "0000:00:00.0"}, {"dev_id": "pci_0000_00_14_0", "product_id": 
"8c31", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", 
"label": "label_8086_8c31", "address": "0000:00:14.0"}, {"dev_id": 
"pci_0000_00_16_0", "product_id": "8c3a", "dev_type": "type-PCI", "numa_node": 
null, "vendor_id": "8086", "label": "label_8086_8c3a", "address": 
"0000:00:16.0"}, {"dev_id": "pci_0000_00_16_1", "product_id": "8c3b", 
"dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": 
"label_8086_8c3b", "address": "0000:00:16.1"}, {"dev_id": "pci_0000_00_1a_0", 
"product_id": 2017-11-29 22:48:45.831 DEBUG oslo_messaging._drivers.amqpdriver 
[-] CALL msg_id: 5af0290dcc3940359dc6b131609685a2 exchange 'nova' topic 
'conductor' from (pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:45.850 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 5af0290dcc3940359dc6b131609685a2 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

"8c2d", "dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", 
"label": "label_8086_8c2d", "address": "0000:00:1a.0"}, {"dev_id": 
"pci_0000_00_1c_0", "product_id": "8c10", "dev_type": "type-PCI", "numa_node": 
null, "vendor_id": "8086", "label": "label_8086_8c10", "address": 
"0000:00:1c.0"}, {"dev_id": "pci_0000_01_00_0", "product_id": "1150", 
"dev_type": "type-PCI", "numa_node": null, "vendor_id": "1a03", "label": 
"label_1a03_1150", "address": "0000:01:00.0"}, {"dev_id": "pci_0000_02_00_0", 
"product_id": "2000", "dev_type": "type-PCI", "numa_node": null, "vendor_id": 
"1a03", "label": "label_1a03_2000", "address": "0000:02:00.0"}, {"dev_id": 
"pci_0000_00_1c_2", "product_id": "8c14", "dev_type": "type-PCI", "numa_node": 
null, "vendor_id": "8086", "label": "label_8086_8c14", "address": 
"0000:00:1c.2"}, {"dev_id": "pci_0000_03_00_0", "product_id": "1533", 
"dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": 
"label_8086_1533", "address": "0000:03:00.0"}, {"dev_id": "pci_0000_00_1c_3", 
"product_id": "8c16", "dev_type": "type-PCI", "numa_node": null, "vendor_id": 
"8086", "label": "label_8086_8c16", "address": "0000:00:1c.3"}, {"dev_id": 
"pci_0000_04_00_0", "product_id": "1533", "dev_type": "type-PCI", "numa_node": 
null, "vendor_id": "8086", "label": "label_8086_1533", "address": 
"0000:04:00.0"}, {"dev_id": "pci_0000_00_1d_0", "product_id": "8c26", 
"dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": 
"label_8086_8c26", "address": "0000:00:1d.0"}, {"dev_id": "pci_0000_00_1f_0", 
"product_id": "8c56", "dev_type": "type-PCI", "numa_node": null, "vendor_id": 
"8086", "label": "label_8086_8c56", "address": "0000:00:1f.0"}, {"dev_id": 
"pci_0000_00_1f_2", "product_id": "8c02", "dev_type": "type-PCI", "numa_node": 
null, "vendor_id": "8086", "label": "label_8086_8c02", "address": 
"0000:00:1f.2"}, {"dev_id": "pci_0000_00_1f_3", "product_id": "8c22", 
"dev_type": "type-PCI", "numa_node": null, "vendor_id": "8086", "label": 
"label_8086_8c22", "address": "0000:00:1f.3"}, {"dev_id": "pci_0000_00_1f_6", 
"product_id": "8c24", "dev_type": "type-PCI", "numa_node": null, "vendor_id": 
"8086", "label": "label_8086_8c24", "address": "0000:00:1f.6"}] from 
(pid=28861) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:679

2017-11-29 22:48:44.464 DEBUG oslo_concurrency.lockutils 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker._update_available_resource" :: 
waited 0.000s from (pid=28861) inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:270

2017-11-29 22:48:44.465 DEBUG oslo_messaging._drivers.amqpdriver 
[req-2dcfd462-169e-4da6-b83d-e4893e0c566b None None] CALL msg_id: 
6c5aedb48e92409ba86e18bd9585790e exchange 'nova' topic 'conductor' from 
(pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:44.491 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 6c5aedb48e92409ba86e18bd9585790e from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:45.831 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: 5af0290dcc3940359dc6b131609685a2 exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:45.850 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 5af0290dcc3940359dc6b131609685a2 from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:50.140 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: 9dc559af0607442dac94a10a1e883576 exchange 'nova' topic 'conductor' from 
(pid=28861) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:50.162 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 9dc559af0607442dac94a10a1e883576 from (pid=28861) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419

2017-11-29 22:48:55.831 DEBUG oslo_messaging._drivers.amqpdriver [-] CALL 
msg_id: 2e97d1a11dfa4dacbf251d7e1bd2e4ef exchange 'nova' topic 'conductor' from 
(pid=24675) _send 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:562

2017-11-29 22:48:55.850 DEBUG oslo_messaging._drivers.amqpdriver [-] received 
reply msg_id: 2e97d1a11dfa4dacbf251d7e1bd2e4ef from (pid=24675) __call__ 
/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:419





Thanks

Mohan Ramu





-----Original Message-----
From: Bernd Bausch [mailto:berndbau...@gmail.com]
Sent: Wednesday, November 29, 2017 3:03 PM
To: Ramu, MohanX <mohanx.r...@intel.com>; openstack@lists.openstack.org
Subject: RE: [Openstack] Devstack -Ocata - Failed to launch instance on Host



Things you could check:



- On the compute node, in the nova-compute log (DevStack names it n-cpu), are 
there messages indicating that the node can't contact the placement service?



- If so, find out why.

The error message might contain clues. Likely causes are a misconfiguration in 
nova.conf, some networking misconfiguration, a firewall, etc. Ultimately, I 
would guess that the root of the problem is in local.conf.



- If it can contact the placement service, something is wrong at a different 
level. Again, I would hope that the compute log contains useful information.
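
For reference, both points can be checked from the compute node with something 
like the following (the endpoint URL is a placeholder; use whatever the first 
command reports, and note that paths and ports depend on the local setup):

# Find the placement endpoint registered in Keystone
openstack endpoint list --service placement

# Check that the compute node can actually reach that endpoint
curl -i http://CONTROLLER_IP/placement

# Inspect the [placement] settings that nova-compute is using
grep -A6 "^\[placement\]" /etc/nova/nova.conf

If the curl also returns HTTP 503, the problem is most likely on the controller 
side (the placement service or the web server in front of it) rather than on 
the compute node.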



-----Original Message-----

From: Ramu, MohanX [mailto:mohanx.r...@intel.com]

Sent: Tuesday, November 28, 2017 4:47 PM

To: Brian Haley <haleyb....@gmail.com>; openstack@lists.openstack.org

Subject: Re: [Openstack] Devstack -Ocata - Failed to launch instance on Host



Even after running the discover_hosts command, the compute node data is not 
populated into the resource_providers table.



The cell mapping table has 'cello' and NULL, not cell0; even after updating 
cell1 as well, nothing was resolved.





su -s /bin/sh -c "nova-manage cell_v2 discover_hosts --verbose" nova

nova-manage cell_v2 discover_hosts
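
To double-check the mappings and the placement data, something like the 
following can be used (assuming the default DevStack database names; in Ocata 
the placement tables normally live in the nova_api database, and the MySQL 
credentials depend on local.conf):

nova-manage cell_v2 list_cells
mysql -u root -p nova_api -e "SELECT uuid, name, transport_url FROM cell_mappings;"
mysql -u root -p nova_api -e "SELECT uuid, name FROM resource_providers;"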









-----Original Message-----

From: Brian Haley [mailto:haleyb....@gmail.com]

Sent: Monday, November 27, 2017 10:21 PM

To: Ramu, MohanX <mohanx.r...@intel.com>; openstack@lists.openstack.org

Subject: Re: [Openstack] Devstack -Ocata - Failed to launch instance on Host



On 11/27/2017 10:24 AM, Ramu, MohanX wrote:

> Hi All,

>

> Not able to launch an instance on the compute node; instead, the

> instance is trying to launch on the same controller.

>

> The added compute node is not there in resource provider table.

>

> Could you please help us figure out what we missed?

>

> Is the placement API not configured on the compute node side, and does the

> placement API configuration come by default with a basic DevStack setup?



There is a section on multinode setup in the devstack repo, 
doc/source/guides/multinode-lab.rst, that I think covers the issue you're 
seeing.  Assuming your compute node shows up in  'nova service-list' output, 
you need to run ./tools/discover_hosts.sh to have a cell mapping added for the 
node.
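
For example, the sequence on the controller would be roughly (run from the 
devstack checkout; command names as above):

nova service-list            # the new compute node should appear here
./tools/discover_hosts.sh    # adds the cell mapping for newly discovered hosts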



-Brian



_______________________________________________
Mailing list: http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack
Post to     : openstack@lists.openstack.org
Unsubscribe : http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack
