This bug was fixed in the package python-oslo.messaging - 5.10.0-0ubuntu2~cloud0

---------------
python-oslo.messaging (5.10.0-0ubuntu2~cloud0) xenial-newton; urgency=medium

  * New update for the Ubuntu Cloud Archive.

python-oslo.messaging (5.10.0-0ubuntu2) yakkety; urgency=medium

  * d/p/rabbit-avoid-busy-loop.patch: Cherry pick patch from upstream to
    avoid rabbit driver busy loop on epoll_wait with heartbeat+eventlet
    (LP: #1518430).

** Changed in: cloud-archive/newton
       Status: Fix Committed => Fix Released

--
You received this bug notification because you are a member of the Nepali
Language Coordinators Group, which is subscribed to Xenial.
Matching subscriptions: Ubuntu 16.04 Bugs
https://bugs.launchpad.net/bugs/1518430

Title:
  liberty: ~busy loop on epoll_wait being called with zero timeout

Status in Ubuntu Cloud Archive:
  Fix Committed
Status in Ubuntu Cloud Archive kilo series:
  Fix Committed
Status in Ubuntu Cloud Archive liberty series:
  Fix Released
Status in Ubuntu Cloud Archive mitaka series:
  Fix Committed
Status in Ubuntu Cloud Archive newton series:
  Fix Released
Status in oslo.messaging:
  Fix Released
Status in python-oslo.messaging package in Ubuntu:
  Fix Released
Status in python-oslo.messaging source package in Xenial:
  Fix Released
Status in python-oslo.messaging source package in Yakkety:
  Fix Released
Status in python-oslo.messaging source package in Zesty:
  Fix Released

Bug description:
  Context: openstack juju/maas deploy using the 1510 charms release on
  trusty, with:
    openstack-origin: "cloud:trusty-liberty"
    source: "cloud:trusty-updates/liberty"

  * Several openstack nova- and neutron- services, at least:
    nova-compute, neutron-server, nova-conductor,
    neutron-openvswitch-agent, neutron-vpn-agent show near-busy looping
    on epoll_wait() calls, most frequently with a zero timeout.

  - nova-compute (chosen because it is single-process) strace and
    ltrace captures: http://paste.ubuntu.com/13371248/ (ltrace, strace)

  As a comparison, this is how it looks on a kilo deploy:
  - http://paste.ubuntu.com/13371635/

  * 'top' sample from a nova-cloud-controller unit on this completely
    idle stack: http://paste.ubuntu.com/13371809/

  FYI, this behavior is *not* seen on keystone, glance, cinder, or
  ceilometer-api.

  As this issue is present on several components, it likely comes from a
  common library (oslo concurrency?); FYI, the bug was filed against
  nova itself as a starting point for debugging.

  Note: The description in the following bug gives a good overview of
  the issue and points to a possible fix for oslo.messaging:
  https://bugs.launchpad.net/mos/+bug/1380220

To manage notifications about this bug go to:
https://bugs.launchpad.net/cloud-archive/+bug/1518430/+subscriptions

_______________________________________________
Mailing list: https://launchpad.net/~group.of.nepali.translators
Post to     : group.of.nepali.translators@lists.launchpad.net
Unsubscribe : https://launchpad.net/~group.of.nepali.translators
More help   : https://help.launchpad.net/ListHelp
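[Editor's note] For readers unfamiliar with the failure mode, here is a minimal standalone Python sketch (illustrative only, not oslo.messaging code) of why calling epoll_wait with a zero timeout spins the CPU while a positive timeout blocks in the kernel. The `poll_duration` helper is hypothetical:

```python
import select
import socket
import time

def poll_duration(timeout):
    """Time one epoll_wait(2) call (via select.epoll) with the given timeout.

    An idle socket pair is registered, so no events ever arrive: the call
    can only return because its timeout expires.
    """
    a, b = socket.socketpair()            # idle pair; no data is sent
    ep = select.epoll()
    ep.register(a.fileno(), select.EPOLLIN)
    start = time.monotonic()
    ep.poll(timeout)                      # epoll_wait under the hood
    elapsed = time.monotonic() - start
    ep.close()
    a.close()
    b.close()
    return elapsed

# timeout=0 returns immediately; a loop around such a call never sleeps,
# which is exactly the busy loop seen in the strace captures above.
spin = poll_duration(0)
# A positive timeout makes the process sleep in the kernel instead.
block = poll_duration(0.2)
print(f"zero timeout: {spin:.4f}s, 0.2s timeout: {block:.4f}s")
```

The upstream fix takes the same direction: ensuring the heartbeat/eventlet interaction does not keep re-arming the poll with a zero timeout.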