Public bug reported:

We are seeing recurring gate failures on CentOS and OpenSUSE across both
Devstack and Packstack jobs on the master branches.
As far as we know, this has been happening only in the OVH cloud regions.

Example failures:
- CentOS Devstack: http://logs.openstack.org/46/523646/1/check/legacy-tempest-dsvm-neutron-full-centos-7/5bf092c/job-output.txt#_2017-11-29_03_02_38_031560
- OpenSUSE Devstack: http://logs.openstack.org/23/522423/7/check/legacy-tempest-dsvm-neutron-full-opensuse-423/b6768d7/job-output.txt#_2017-11-27_21_53_13_340319
- CentOS Packstack: http://logs.openstack.org/14/516714/1/check/packstack-integration-scenario002-tempest/7ba8d06/job-output.txt.gz#_2017-10-31_17_55_39_845816

They all fail with the same stack trace:
=====
setUpClass (tempest.api.compute.servers.test_create_server.ServersTestJSON)
---------------------------------------------------------------------------

Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "tempest/test.py", line 172, in setUpClass
        six.reraise(etype, value, trace)
      File "tempest/test.py", line 165, in setUpClass
        cls.resource_setup()
      File "tempest/api/compute/servers/test_create_server.py", line 64, in resource_setup
        volume_backed=cls.volume_backed)
      File "tempest/api/compute/base.py", line 190, in create_test_server
        **kwargs)
      File "tempest/common/compute.py", line 258, in create_test_server
        server['id'])
      File "/opt/stack/new/tempest/.tox/tempest/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
        self.force_reraise()
      File "/opt/stack/new/tempest/.tox/tempest/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
        six.reraise(self.type_, self.value, self.tb)
      File "tempest/common/compute.py", line 229, in create_test_server
        clients.servers_client, server['id'], wait_until)
      File "tempest/common/waiters.py", line 96, in wait_for_server_status
        raise lib_exc.TimeoutException(message)
    tempest.lib.exceptions.TimeoutException: Request timed out
    Details: (ServersTestJSON:setUpClass) Server 2f8de011-b218-4e73-b9e3-e7fcf9e9278b failed to reach ACTIVE status and task state "None" within the required time (196 s). Current status: BUILD. Current task state: spawning.
=====
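For context, the waiter raising the exception above is essentially a
status-polling loop: tempest repeatedly fetches the server's status and gives
up after the configured build timeout (196 s here) if the server never leaves
BUILD/spawning. A minimal, simplified sketch of that pattern follows; it is
not the real tempest code, and `get_status` is a hypothetical callable
standing in for the `servers_client.show_server()` round-trip (the real
waiter also tracks task state and raises a build-error exception on ERROR):

```python
import time


class TimeoutException(Exception):
    """Stand-in for tempest.lib.exceptions.TimeoutException."""


def wait_for_server_status(get_status, target="ACTIVE",
                           timeout=196, interval=1):
    # Poll until the server reaches the target status or the overall
    # timeout elapses -- roughly what waiters.wait_for_server_status()
    # was doing when it timed out in the logs above.
    start = time.time()
    while time.time() - start < timeout:
        status = get_status()
        if status == target:
            return status
        if status == "ERROR":
            # The real waiter raises a dedicated build-error exception here.
            raise RuntimeError("server went to ERROR while waiting")
        time.sleep(interval)
    raise TimeoutException(
        "Server failed to reach %s status within the required time (%s s)"
        % (target, timeout))
```

In the failing jobs the guest stays in BUILD with task state "spawning" for
the whole window, so the loop never sees ACTIVE and the timeout branch fires.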

** Affects: nova
     Importance: Undecided
         Status: New

** Summary changed:

- Tempest makes Nova hang when creating a VM with file injection
+ Nova can hang when creating a VM with file injection

https://bugs.launchpad.net/bugs/1735823

Title:
  Nova can hang when creating a VM with file injection

Status in OpenStack Compute (nova):
  New

