[Yahoo-eng-team] [Bug 1891973] [NEW] Getting HTTP 500 exception frequently in Openstack Ussuri

2020-08-18 Thread Karthik
Public bug reported:

I am using OpenStack Ussuri on CentOS 8, installed using RDO Packstack.
I frequently get an HTTP 500 exception error whenever I perform some 
operation, and I need to restart the httpd service in order to recover from this issue.

Unexpected API Error. Please report this at http://bugs.launchpad.net/nova/ and 
attach the Nova API log if possible.
 (HTTP 500) (Request-ID: 
req-8868aed7-265d-4e8e-aa37-d427b1b561cb)

: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve 
allocations for resource provider 3a80ea71-91a6-4da7-a05b-682b501fe6c9: 

2020-08-17 22:53:46.574 63567 ERROR nova.scheduler.client.report 
[req-fb01576e-e540-4930-a4f0-5109a0df28c4 - - - - -] [None] Failed to retrieve 
resource provider tree from placement API for UUID 
3a80ea71-91a6-4da7-a05b-682b501fe6c9. Got 500: 

500 Internal Server Error

Internal Server Error
The server encountered an internal error or
misconfiguration and was unable to complete
your request.
Please contact the server administrator at
 [no address given] to inform them of the time this error occurred,
 and the actions you performed just before this error.
More information about this error may be available
in the server error log.

.
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager 
[req-fb01576e-e540-4930-a4f0-5109a0df28c4 - - - - -] Error updating resources 
for node openstack-Karty.: nova.exception.ResourceProviderRetrievalFailed: 
Failed to get resource provider with UUID 3a80ea71-91a6-4da7-a05b-682b501fe6c9
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager Traceback (most recent 
call last):
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/compute/manager.py", line 9685, in 
_update_available_resource_for_node
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager startup=startup)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/compute/resource_tracker.py", line 842, 
in update_available_resource
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager 
self._update_available_resource(context, resources, startup=startup)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/oslo_concurrency/lockutils.py", line 359, in 
inner
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager return f(*args, 
**kwargs)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/compute/resource_tracker.py", line 927, 
in _update_available_resource
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager 
self._update(context, cn, startup=startup)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/compute/resource_tracker.py", line 1176, 
in _update
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager 
self._update_to_placement(context, compute_node, startup)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/retrying.py", line 68, in wrapped_f
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager return 
Retrying(*dargs, **dkw).call(f, *args, **kw)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/retrying.py", line 223, in call
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager return 
attempt.get(self._wrap_exception)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/retrying.py", line 261, in get
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager 
six.reraise(self.value[0], self.value[1], self.value[2])
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/six.py", line 703, in reraise
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager raise value
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/retrying.py", line 217, in call
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager attempt = 
Attempt(fn(*args, **kwargs), attempt_number, False)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/compute/resource_tracker.py", line 1110, 
in _update_to_placement
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager context, 
compute_node.uuid, name=compute_node.hypervisor_hostname)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/scheduler/client/report.py", line 857, 
in get_provider_tree_and_ensure_root
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager 
parent_provider_uuid=parent_provider_uuid)
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager   File 
"/usr/lib/python3.6/site-packages/nova/scheduler/client/report.py", line 640, 
in _ensure_resource_provider
2020-08-17 22:53:46.575 63567 ERROR nova.compute.manager rps_to_refresh = 
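The traceback above is cut off, but the provider UUID in the error text is the main datum for triage. As an illustrative helper (hypothetical, not part of nova), the failing resource-provider UUIDs can be pulled out of a nova-compute log like this:

```python
import re

# Matches the two error shapes seen in the log above:
#   "Failed to retrieve resource provider tree from placement API for UUID <uuid>"
#   "Failed to get resource provider with UUID <uuid>"
UUID_RE = re.compile(
    r"resource provider (?:tree from placement API for UUID|with UUID)\s+"
    r"([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})"
)

def failing_providers(log_text):
    """Return the unique resource-provider UUIDs named in placement errors."""
    return sorted(set(UUID_RE.findall(log_text)))
```

Each extracted UUID can then be checked directly against placement (e.g. `openstack resource provider show <uuid>`) to confirm whether placement itself, rather than nova, is returning the 500.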

[Yahoo-eng-team] [Bug 1829881] [NEW] Swap of volumes with multiattach fails

2019-05-21 Thread Rajini Karthik
Public bug reported:

The swap test case for a volume with multiattach fails:

tempest.api.compute.admin.test_volume_swap.TestMultiAttachVolumeSwap.test_volume_swap_with_multiattach
[337.211730s] ... FAILED

Error log from nova:
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [None req-858576cb-bdf3-49c1-b0af-34cb538712b0 
tempest-AttachVolumeMultiAttachTest-948792761 
tempest-AttachVolumeMultiAttachTest-948792761] [instance: 
2d08d9a4-6eab-4869-ba48-802c546c7eb7] Failed to detach volume 
f33ba2e1-b649-478c-af4c-dba062423ad2 from /dev/vdb: 
oslo_concurrency.processutils.ProcessExecutionError: Unexpected error while 
running command.
May 07 01:56:28.117664 dellscfc nova-compute[1036]: Command: /lib/udev/scsi_id 
--page 0x83 --whitelisted 
/dev/disk/by-path/pci-:04:00.0-fc-0x5000d3100101d530-lun-3
May 07 01:56:28.117664 dellscfc nova-compute[1036]: Exit code: 1
May 07 01:56:28.117664 dellscfc nova-compute[1036]: Stdout: ''
May 07 01:56:28.117664 dellscfc nova-compute[1036]: Stderr: ''
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
Traceback (most recent call last):
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/opt/stack/new/nova/nova/virt/block_device.py", line 326, in driver_detach
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
encryption=encryption)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 1717, in detach_volume
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
encryption=encryption)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/opt/stack/new/nova/nova/virt/libvirt/driver.py", line 1345, in 
_disconnect_volume
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
vol_driver.disconnect_volume(connection_info, instance)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/opt/stack/new/nova/nova/virt/libvirt/volume/fibrechannel.py", line 72, in 
disconnect_volume
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
connection_info['data'])
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/usr/local/lib/python3.6/dist-packages/os_brick/utils.py", line 150, in 
trace_logging_wrapper
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
result = f(*args, **kwargs)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/usr/local/lib/python3.6/dist-packages/oslo_concurrency/lockutils.py", line 
328, in inner
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
return f(*args, **kwargs)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/usr/local/lib/python3.6/dist-packages/os_brick/initiator/connectors/fibre_channel.py",
 line 336, in disconnect_volume
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] wwn 
= self._linuxscsi.get_scsi_wwn(path)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/usr/local/lib/python3.6/dist-packages/os_brick/initiator/linuxscsi.py", line 
163, in get_scsi_wwn
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
root_helper=self._root_helper)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
"/usr/local/lib/python3.6/dist-packages/os_brick/executor.py", line 52, in 
_execute
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7] 
result = self.__execute(*args, **kwargs)
May 07 01:56:28.117664 dellscfc nova-compute[1036]: ERROR 
nova.virt.block_device [instance: 2d08d9a4-6eab-4869-ba48-802c546c7eb7]   File 
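The traceback is truncated, but the root failure is already visible above it: /lib/udev/scsi_id exits 1 with empty stdout and stderr. One plausible reading (an assumption, not os-brick's confirmed behaviour) is that an earlier detach of the multiattached volume already removed the by-path device node, so the WWN lookup has nothing to interrogate. A minimal sketch of that heuristic:

```python
import os

def device_already_gone(dev_path, exit_code, stdout, stderr):
    """Heuristic: scsi_id exiting 1 with no output for a path that no longer
    exists suggests the device node was already torn down (e.g. by a prior
    detach of a multiattached volume) rather than a genuine lookup failure."""
    return (exit_code == 1 and not stdout and not stderr
            and not os.path.exists(dev_path))
```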

[Yahoo-eng-team] [Bug 1533181] [NEW] Typo in network security group comment

2016-01-12 Thread Karthik
Public bug reported:

There is a typo in the comment for fetching a list of security groups under
the rest/api/network folder: the comment says it fetches an image list.

** Affects: horizon
 Importance: Undecided
 Assignee: Karthik (kali-karthik)
 Status: New

** Changed in: horizon
 Assignee: (unassigned) => Karthik (kali-karthik)

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Dashboard (Horizon).
https://bugs.launchpad.net/bugs/1533181

Title:
  Typo in network security group comment

Status in OpenStack Dashboard (Horizon):
  New

Bug description:
  There is a typo in the comment for fetching a list of security groups under
  the rest/api/network folder: the comment says it fetches an image list.

To manage notifications about this bug go to:
https://bugs.launchpad.net/horizon/+bug/1533181/+subscriptions

-- 
Mailing list: https://launchpad.net/~yahoo-eng-team
Post to : yahoo-eng-team@lists.launchpad.net
Unsubscribe : https://launchpad.net/~yahoo-eng-team
More help   : https://help.launchpad.net/ListHelp


[Yahoo-eng-team] [Bug 1523863] [NEW] Tutorial for customising horizon

2015-12-08 Thread Karthik
Public bug reported:

The "Step 1" present in the title of Branding Horizon may look better if
we rename the title to "foundation step".

** Affects: horizon
 Importance: Undecided
 Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Dashboard (Horizon).
https://bugs.launchpad.net/bugs/1523863

Title:
  Tutorial for customising horizon

Status in OpenStack Dashboard (Horizon):
  New

Bug description:
  The "Step 1" present in the title of Branding Horizon may look better if
  we rename the title to "foundation step".

To manage notifications about this bug go to:
https://bugs.launchpad.net/horizon/+bug/1523863/+subscriptions



[Yahoo-eng-team] [Bug 1488369] [NEW] Default panel not registered shown when unregister is called from customisation module

2015-08-25 Thread Karthik
Public bug reported:

Steps to reproduce the issue.

1) Create overrides.py (customisation module).
2) Call unregister for a default panel of any dashboard.
3) You get an exception saying the default panel is not registered, although 
the default panel was registered.

** Affects: horizon
 Importance: Undecided
 Assignee: Karthik (kali-karthik)
 Status: New

** Changed in: horizon
 Assignee: (unassigned) => Karthik (kali-karthik)

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Dashboard (Horizon).
https://bugs.launchpad.net/bugs/1488369

Title:
  Default panel not registered shown when unregister is called from
  customisation module

Status in OpenStack Dashboard (Horizon):
  New

Bug description:
  Steps to reproduce the issue.

  1) Create overrides.py (customisation module).
  2) Call unregister for a default panel of any dashboard.
  3) You get an exception saying the default panel is not registered, although 
the default panel was registered.

To manage notifications about this bug go to:
https://bugs.launchpad.net/horizon/+bug/1488369/+subscriptions



[Yahoo-eng-team] [Bug 1484076] [NEW] Confusing comment possible typo in network_create function

2015-08-12 Thread Karthik
Public bug reported:

In network_create(request, **kwargs) in
horizon/openstack_dashboard/api/neutron.py, the docstring reads:

 Create a subnet on a specified network.
:param request: request context
:param tenant_id: (optional) tenant id of the network created
:param name: (optional) name of the network created
:returns: Subnet object


This actually returns a network object, so the comment body needs to be
changed.
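A sketch of the suggested correction as a stub (the real function body lives in openstack_dashboard/api/neutron.py; only the docstring wording is at issue here):

```python
def network_create(request, **kwargs):
    """Create a network.

    :param request: request context
    :param tenant_id: (optional) tenant id of the network created
    :param name: (optional) name of the network created
    :returns: Network object
    """
    raise NotImplementedError("stub for illustration only")
```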

** Affects: horizon
 Importance: Undecided
 Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Dashboard (Horizon).
https://bugs.launchpad.net/bugs/1484076

Title:
  Confusing comment possible typo in network_create function

Status in OpenStack Dashboard (Horizon):
  New

Bug description:
  In network_create(request, **kwargs) in
  horizon/openstack_dashboard/api/neutron.py, the docstring reads:

   Create a subnet on a specified network.
  :param request: request context
  :param tenant_id: (optional) tenant id of the network created
  :param name: (optional) name of the network created
  :returns: Subnet object

  This actually returns a network object, so the comment body needs to be
  changed.

To manage notifications about this bug go to:
https://bugs.launchpad.net/horizon/+bug/1484076/+subscriptions



[Yahoo-eng-team] [Bug 1470738] [NEW] PyCss conflicting combinators error.

2015-07-02 Thread Karthik
Public bug reported:

1) I cloned the horizon stable/kilo branch.
2) I followed the steps at 
http://docs.openstack.org/developer/horizon/quickstart.html
3) The tests ran successfully.
4) When I do a runserver and open the URL, I repeatedly get 
http://paste.openstack.org/show/334735/
5) I noticed the change 
https://review.openstack.org/#/c/178504/10/requirements.txt and upgraded the 
highlighted packages, but I still face the same issue.
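When upgrading the packages flagged in that review, it is worth confirming that the running environment actually picked up the new versions; a stale virtualenv is a common cause of "upgraded but still failing". A small generic helper (the package names in the comment are examples, not a confirmed fix):

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version string for *package*, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# e.g. check the SCSS toolchain that Horizon's compressor uses:
#   print(installed_version("pyScss"), installed_version("django-pyscss"))
```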

** Affects: horizon
 Importance: Undecided
 Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Dashboard (Horizon).
https://bugs.launchpad.net/bugs/1470738

Title:
  PyCss conflicting combinators error.

Status in OpenStack Dashboard (Horizon):
  New

Bug description:
  1) I cloned the horizon stable/kilo branch.
  2) I followed the steps at 
http://docs.openstack.org/developer/horizon/quickstart.html
  3) The tests ran successfully.
  4) When I do a runserver and open the URL, I repeatedly get 
http://paste.openstack.org/show/334735/
  5) I noticed the change 
https://review.openstack.org/#/c/178504/10/requirements.txt and upgraded the 
highlighted packages, but I still face the same issue.

To manage notifications about this bug go to:
https://bugs.launchpad.net/horizon/+bug/1470738/+subscriptions



[Yahoo-eng-team] [Bug 1339884] [NEW] Variable initialization missing in db_base_plugin_v2 class

2014-07-09 Thread Karthik Natarajan
Public bug reported:

In the update_port method of class neutron/db/db_base_plugin_v2.py, the
variable changed_device_id is not initialized, but it is
tested in an if condition, which causes a Python error.
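A minimal self-contained illustration of the failure mode and the usual fix (simplified stand-ins, not the actual neutron code): a name bound only inside a conditional branch raises NameError when read later, so it should be initialized before the branch.

```python
def update_port_buggy(new_device_id=None):
    if new_device_id is not None:
        changed_device_id = True
    # NameError (UnboundLocalError) when new_device_id is None:
    # the name was never bound on that code path.
    return changed_device_id

def update_port_fixed(new_device_id=None):
    changed_device_id = False  # initialize before any conditional assignment
    if new_device_id is not None:
        changed_device_id = True
    return changed_device_id
```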

** Affects: neutron
 Importance: Undecided
 Assignee: Karthik Natarajan (natarajk)
 Status: In Progress

** Changed in: neutron
 Assignee: (unassigned) => Karthik Natarajan (natarajk)

** Changed in: neutron
   Status: New => In Progress

** Description changed:

- In the update_port method of neutron/db/db_base_plugin_v2.py, the
+ In the update_port method of class neutron/db/db_base_plugin_v2.py, the
  variable changed_device_id is not initialized, but it is getting
  tested in the if condition which causes python error.

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to neutron.
https://bugs.launchpad.net/bugs/1339884

Title:
  Variable initialization missing in db_base_plugin_v2 class

Status in OpenStack Neutron (virtual network service):
  In Progress

Bug description:
  In the update_port method of class neutron/db/db_base_plugin_v2.py,
  the variable changed_device_id is not initialized, but it is getting
  tested in the if condition which causes python error.

To manage notifications about this bug go to:
https://bugs.launchpad.net/neutron/+bug/1339884/+subscriptions
