Hi list,

In one datacenter I'm facing problems with my export storage domain. The
datacenter is of type "single host with local storage". On the host I can
see that the NFS export domain is still mounted, but the engine does not
show it as attached, so it can neither be used for exports nor detached.
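
(By "still mounted" I mean that the NFS mount for the export domain is
still present under vdsm's mount root on the host. A quick way to check
this, with the mount point taken from the vdsm log further below:

  grep ovirt-export-fra /proc/mounts
  # or, equivalently:
  mount | grep '/rhev/data-center/mnt'
)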

Trying to attach the export domain again fails. The following is logged
in vdsm (the complete vdsm log of the attach attempt is included at the
end of this mail):

Thread-1902159::ERROR::2013-01-24
17:11:45,474::task::853::TaskManager.Task::(_setError)
Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::Unexpected error
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/task.py", line 861, in _run
    return fn(*args, **kargs)
  File "/usr/share/vdsm/logUtils.py", line 38, in wrapper
    res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 960, in attachStorageDomain
    pool.attachSD(sdUUID)
  File "/usr/share/vdsm/storage/securable.py", line 63, in wrapper
    return f(self, *args, **kwargs)
  File "/usr/share/vdsm/storage/sp.py", line 924, in attachSD
    dom.attach(self.spUUID)
  File "/usr/share/vdsm/storage/sd.py", line 442, in attach
    raise se.StorageDomainAlreadyAttached(pools[0], self.sdUUID)
StorageDomainAlreadyAttached: Storage domain already attached to pool:
'domain=cd23808b-136a-4b33-a80c-f2581eab022d,
pool=d95c53ca-9cef-4db2-8858-bf4937bd8c14'

It won't let me attach the export domain, claiming it is already attached
to a pool. Manually unmounting the export share on the host and retrying
the attach results in the same error.
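
What puzzles me is that the pool UUID in the error
(d95c53ca-9cef-4db2-8858-bf4937bd8c14) is not the pool I am trying to
attach to (5874cb6d-080b-4573-97d2-c926dae1fbc9, see the
attachStorageDomain call in the full log below). So the export domain's
metadata apparently still references the pool it was attached to earlier.
That also seems to be what is stored on the NFS share itself (the dom_md
path is taken from the leases path in the log; that the file is called
"metadata" is my assumption):

  grep POOL_UUID /rhev/data-center/mnt/xxx.xxx.xxx.191:_data_ovirt-export-fra/cd23808b-136a-4b33-a80c-f2581eab022d/dom_md/metadata
  # -> POOL_UUID=d95c53ca-9cef-4db2-8858-bf4937bd8c14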

This is on CentOS 6.3 using Dreyou's RPMs. Installed versions on the host:

vdsm.x86_64                                 4.10.0-0.44.14.el6
vdsm-cli.noarch                             4.10.0-0.44.14.el6
vdsm-python.x86_64                          4.10.0-0.44.14.el6
vdsm-xmlrpc.noarch                          4.10.0-0.44.14.el6

Engine:

ovirt-engine.noarch                         3.1.0-3.19.el6
ovirt-engine-backend.noarch                 3.1.0-3.19.el6
ovirt-engine-cli.noarch                     3.1.0.7-1.el6
ovirt-engine-config.noarch                  3.1.0-3.19.el6
ovirt-engine-dbscripts.noarch               3.1.0-3.19.el6
ovirt-engine-genericapi.noarch              3.1.0-3.19.el6
ovirt-engine-jbossas711.x86_64              1-0
ovirt-engine-notification-service.noarch    3.1.0-3.19.el6
ovirt-engine-restapi.noarch                 3.1.0-3.19.el6
ovirt-engine-sdk.noarch                     3.1.0.5-1.el6
ovirt-engine-setup.noarch                   3.1.0-3.19.el6
ovirt-engine-tools-common.noarch            3.1.0-3.19.el6
ovirt-engine-userportal.noarch              3.1.0-3.19.el6
ovirt-engine-webadmin-portal.noarch         3.1.0-3.19.el6
ovirt-image-uploader.noarch                 3.1.0-16.el6
ovirt-iso-uploader.noarch                   3.1.0-16.el6
ovirt-log-collector.noarch                  3.1.0-16.el6

How can this be recovered to a sane state? If more information is
needed, please do not hesitate to request it.
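
(For what it's worth, my own untested guess would be that the stale pool
reference has to be cleared from the export domain's metadata while it is
not attached anywhere, along the lines of the sketch below - but I'd
rather not hand-edit vdsm's metadata without confirmation, in particular
because I don't know how the _SHA_CKSUM line is supposed to be handled:

  # UNTESTED sketch - please correct me if this is the wrong approach
  cd /rhev/data-center/mnt/xxx.xxx.xxx.191:_data_ovirt-export-fra/cd23808b-136a-4b33-a80c-f2581eab022d/dom_md
  cp metadata metadata.bak                    # keep a backup first
  sed -i -e 's/^POOL_UUID=.*/POOL_UUID=/' \
         -e '/^_SHA_CKSUM=/d' metadata        # blank the pool reference, drop the checksum
)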

Thanks and regards
Patrick

-- 
Lobster LOGsuite GmbH, Münchner Straße 15a, D-82319 Starnberg

HRB 178831, Amtsgericht München
Geschäftsführer: Dr. Martin Fischer, Rolf Henrich

Complete vdsm log of the failed attach attempt:

Thread-1902157::DEBUG::2013-01-24 17:11:36,039::BindingXMLRPC::156::vds::(wrapper) [xxx.xxx.xxx.190]
Thread-1902157::DEBUG::2013-01-24 17:11:36,039::task::588::TaskManager.Task::(_updateState) Task=`a686738c-ff6f-43ad-8966-8ec158dc7c2e`::moving from state init -> state preparing
Thread-1902157::INFO::2013-01-24 17:11:36,039::logUtils::37::dispatcher::(wrapper) Run and protect: validateStorageServerConnection(domType=1, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': 'xxx.xxx.xxx.191:/data/ovirt-export-fra', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '5207d7da-2655-4843-b126-3252e38beafa', 'port': ''}], options=None)
Thread-1902157::INFO::2013-01-24 17:11:36,039::logUtils::39::dispatcher::(wrapper) Run and protect: validateStorageServerConnection, Return response: {'statuslist': [{'status': 0, 'id': '5207d7da-2655-4843-b126-3252e38beafa'}]}
Thread-1902157::DEBUG::2013-01-24 17:11:36,039::task::1172::TaskManager.Task::(prepare) Task=`a686738c-ff6f-43ad-8966-8ec158dc7c2e`::finished: {'statuslist': [{'status': 0, 'id': '5207d7da-2655-4843-b126-3252e38beafa'}]}
Thread-1902157::DEBUG::2013-01-24 17:11:36,040::task::588::TaskManager.Task::(_updateState) Task=`a686738c-ff6f-43ad-8966-8ec158dc7c2e`::moving from state preparing -> state finished
Thread-1902157::DEBUG::2013-01-24 17:11:36,040::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-1902157::DEBUG::2013-01-24 17:11:36,040::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-1902157::DEBUG::2013-01-24 17:11:36,040::task::978::TaskManager.Task::(_decref) Task=`a686738c-ff6f-43ad-8966-8ec158dc7c2e`::ref 0 aborting False
Thread-1902158::DEBUG::2013-01-24 17:11:36,057::BindingXMLRPC::156::vds::(wrapper) [xxx.xxx.xxx.190]
Thread-1902158::DEBUG::2013-01-24 17:11:36,057::task::588::TaskManager.Task::(_updateState) Task=`f1627a6f-3d00-4f55-8baa-8da1f8e56c63`::moving from state init -> state preparing
Thread-1902158::INFO::2013-01-24 17:11:36,057::logUtils::37::dispatcher::(wrapper) Run and protect: connectStorageServer(domType=1, spUUID='00000000-0000-0000-0000-000000000000', conList=[{'connection': 'xxx.xxx.xxx.191:/data/ovirt-export-fra', 'iqn': '', 'portal': '', 'user': '', 'password': '******', 'id': '5207d7da-2655-4843-b126-3252e38beafa', 'port': ''}], options=None)
Thread-1902158::DEBUG::2013-01-24 17:11:36,060::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n /bin/mount -t nfs -o soft,nosharecache,timeo=600,retrans=6 xxx.xxx.xxx.191:/data/ovirt-export-fra /rhev/data-center/mnt/xxx.xxx.xxx.191:_data_ovirt-export-fra' (cwd None)
Thread-1902158::DEBUG::2013-01-24 17:11:36,124::lvm::457::OperationMutex::(_invalidateAllPvs) Operation 'lvm invalidate operation' got the operation mutex
Thread-1902158::DEBUG::2013-01-24 17:11:36,124::lvm::459::OperationMutex::(_invalidateAllPvs) Operation 'lvm invalidate operation' released the operation mutex
Thread-1902158::DEBUG::2013-01-24 17:11:36,124::lvm::469::OperationMutex::(_invalidateAllVgs) Operation 'lvm invalidate operation' got the operation mutex
Thread-1902158::DEBUG::2013-01-24 17:11:36,124::lvm::471::OperationMutex::(_invalidateAllVgs) Operation 'lvm invalidate operation' released the operation mutex
Thread-1902158::DEBUG::2013-01-24 17:11:36,124::lvm::490::OperationMutex::(_invalidateAllLvs) Operation 'lvm invalidate operation' got the operation mutex
Thread-1902158::DEBUG::2013-01-24 17:11:36,125::lvm::492::OperationMutex::(_invalidateAllLvs) Operation 'lvm invalidate operation' released the operation mutex
Thread-1902158::INFO::2013-01-24 17:11:36,125::logUtils::39::dispatcher::(wrapper) Run and protect: connectStorageServer, Return response: {'statuslist': [{'status': 0, 'id': '5207d7da-2655-4843-b126-3252e38beafa'}]}
Thread-1902158::DEBUG::2013-01-24 17:11:36,125::task::1172::TaskManager.Task::(prepare) Task=`f1627a6f-3d00-4f55-8baa-8da1f8e56c63`::finished: {'statuslist': [{'status': 0, 'id': '5207d7da-2655-4843-b126-3252e38beafa'}]}
Thread-1902158::DEBUG::2013-01-24 17:11:36,125::task::588::TaskManager.Task::(_updateState) Task=`f1627a6f-3d00-4f55-8baa-8da1f8e56c63`::moving from state preparing -> state finished
Thread-1902158::DEBUG::2013-01-24 17:11:36,125::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-1902158::DEBUG::2013-01-24 17:11:36,125::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-1902158::DEBUG::2013-01-24 17:11:36,125::task::978::TaskManager.Task::(_decref) Task=`f1627a6f-3d00-4f55-8baa-8da1f8e56c63`::ref 0 aborting False
Thread-1902159::DEBUG::2013-01-24 17:11:36,136::BindingXMLRPC::156::vds::(wrapper) [xxx.xxx.xxx.190]
Thread-1902159::DEBUG::2013-01-24 17:11:36,136::task::588::TaskManager.Task::(_updateState) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::moving from state init -> state preparing
Thread-1902159::INFO::2013-01-24 17:11:36,136::logUtils::37::dispatcher::(wrapper) Run and protect: attachStorageDomain(sdUUID='cd23808b-136a-4b33-a80c-f2581eab022d', spUUID='5874cb6d-080b-4573-97d2-c926dae1fbc9', options=None)
Thread-1902159::DEBUG::2013-01-24 17:11:36,136::resourceManager::175::ResourceManager.Request::(__init__) ResName=`Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9`ReqID=`c734d609-aa60-452c-b361-908ec5ab4e8b`::Request was made in '/usr/share/vdsm/storage/resourceManager.py' line '485' at 'registerResource'
Thread-1902159::DEBUG::2013-01-24 17:11:36,136::resourceManager::486::ResourceManager::(registerResource) Trying to register resource 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9' for lock type 'exclusive'
Thread-1902159::DEBUG::2013-01-24 17:11:36,137::resourceManager::528::ResourceManager::(registerResource) Resource 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9' is free. Now locking as 'exclusive' (1 active user)
Thread-1902159::DEBUG::2013-01-24 17:11:36,137::resourceManager::212::ResourceManager.Request::(grant) ResName=`Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9`ReqID=`c734d609-aa60-452c-b361-908ec5ab4e8b`::Granted request
Thread-1902159::DEBUG::2013-01-24 17:11:36,137::task::817::TaskManager.Task::(resourceAcquired) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::_resourcesAcquired: Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9 (exclusive)
Thread-1902159::DEBUG::2013-01-24 17:11:36,137::task::978::TaskManager.Task::(_decref) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::ref 1 aborting False
Thread-1902159::DEBUG::2013-01-24 17:11:36,137::resourceManager::175::ResourceManager.Request::(__init__) ResName=`Storage.cd23808b-136a-4b33-a80c-f2581eab022d`ReqID=`7fb3bd56-84b9-49ff-a97b-e4f77ced85b8`::Request was made in '/usr/share/vdsm/storage/resourceManager.py' line '485' at 'registerResource'
Thread-1902159::DEBUG::2013-01-24 17:11:36,137::resourceManager::486::ResourceManager::(registerResource) Trying to register resource 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d' for lock type 'exclusive'
Thread-1902159::DEBUG::2013-01-24 17:11:36,138::resourceManager::528::ResourceManager::(registerResource) Resource 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d' is free. Now locking as 'exclusive' (1 active user)
Thread-1902159::DEBUG::2013-01-24 17:11:36,138::resourceManager::212::ResourceManager.Request::(grant) ResName=`Storage.cd23808b-136a-4b33-a80c-f2581eab022d`ReqID=`7fb3bd56-84b9-49ff-a97b-e4f77ced85b8`::Granted request
Thread-1902159::DEBUG::2013-01-24 17:11:36,138::task::817::TaskManager.Task::(resourceAcquired) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::_resourcesAcquired: Storage.cd23808b-136a-4b33-a80c-f2581eab022d (exclusive)
Thread-1902159::DEBUG::2013-01-24 17:11:36,138::task::978::TaskManager.Task::(_decref) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::ref 1 aborting False
Thread-1902159::INFO::2013-01-24 17:11:36,138::sp::909::Storage.StoragePool::(attachSD) sdUUID=cd23808b-136a-4b33-a80c-f2581eab022d spUUID=5874cb6d-080b-4573-97d2-c926dae1fbc9
Thread-1902159::DEBUG::2013-01-24 17:11:36,138::misc::1053::SamplingMethod::(__call__) Trying to enter sampling method (storage.sdc.refreshStorage)
Thread-1902159::DEBUG::2013-01-24 17:11:36,138::misc::1055::SamplingMethod::(__call__) Got in to sampling method
Thread-1902159::DEBUG::2013-01-24 17:11:36,139::misc::1053::SamplingMethod::(__call__) Trying to enter sampling method (storage.iscsi.rescan)
Thread-1902159::DEBUG::2013-01-24 17:11:36,139::misc::1055::SamplingMethod::(__call__) Got in to sampling method
Thread-1902159::DEBUG::2013-01-24 17:11:36,139::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n /sbin/iscsiadm -m session -R' (cwd None)
Thread-1902159::DEBUG::2013-01-24 17:11:36,151::__init__::1164::Storage.Misc.excCmd::(_log) FAILED: <err> = 'iscsiadm: No session found.\n'; <rc> = 21
Thread-1902159::DEBUG::2013-01-24 17:11:36,151::misc::1063::SamplingMethod::(__call__) Returning last result
Thread-1902159::DEBUG::2013-01-24 17:11:42,204::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n /sbin/multipath' (cwd None)
Thread-1902159::DEBUG::2013-01-24 17:11:42,252::__init__::1164::Storage.Misc.excCmd::(_log) SUCCESS: <err> = ''; <rc> = 0
Thread-1902159::DEBUG::2013-01-24 17:11:42,252::lvm::457::OperationMutex::(_invalidateAllPvs) Operation 'lvm invalidate operation' got the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::lvm::459::OperationMutex::(_invalidateAllPvs) Operation 'lvm invalidate operation' released the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::lvm::469::OperationMutex::(_invalidateAllVgs) Operation 'lvm invalidate operation' got the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::lvm::471::OperationMutex::(_invalidateAllVgs) Operation 'lvm invalidate operation' released the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::lvm::490::OperationMutex::(_invalidateAllLvs) Operation 'lvm invalidate operation' got the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::lvm::492::OperationMutex::(_invalidateAllLvs) Operation 'lvm invalidate operation' released the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::misc::1063::SamplingMethod::(__call__) Returning last result
Thread-1902159::DEBUG::2013-01-24 17:11:42,253::lvm::349::OperationMutex::(_reloadvgs) Operation 'lvm reload operation' got the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,255::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n /sbin/lvm vgs --config " devices { preferred_names = [\\"^/dev/mapper/\\"] ignore_suspended_devices=1 write_cache_state=0 disable_after_error_count=3 filter = [ \\"r%.*%\\" ] }  global {  locking_type=1  prioritise_write_locks=1  wait_for_locks=1 }  backup {  retain_min = 50  retain_days = 0 } " --noheadings --units b --nosuffix --separator | -o uuid,name,attr,size,free,extent_size,extent_count,free_count,tags,vg_mda_size,vg_mda_free cd23808b-136a-4b33-a80c-f2581eab022d' (cwd None)
Thread-1902159::DEBUG::2013-01-24 17:11:42,398::__init__::1164::Storage.Misc.excCmd::(_log) FAILED: <err> = '  Volume group "cd23808b-136a-4b33-a80c-f2581eab022d" not found\n'; <rc> = 5
Thread-1902159::WARNING::2013-01-24 17:11:42,399::lvm::353::Storage.LVM::(_reloadvgs) lvm vgs failed: 5 [] ['  Volume group "cd23808b-136a-4b33-a80c-f2581eab022d" not found']
Thread-1902159::DEBUG::2013-01-24 17:11:42,399::lvm::376::OperationMutex::(_reloadvgs) Operation 'lvm reload operation' released the operation mutex
Thread-1902159::DEBUG::2013-01-24 17:11:42,404::fileSD::107::Storage.StorageDomain::(__init__) Reading domain in path /rhev/data-center/mnt/xxx.xxx.xxx.191:_data_ovirt-export-fra/cd23808b-136a-4b33-a80c-f2581eab022d
Thread-1902159::DEBUG::2013-01-24 17:11:42,405::persistentDict::185::Storage.PersistentDict::(__init__) Created a persistent dict with FileMetadataRW backend
Thread-1902159::DEBUG::2013-01-24 17:11:42,409::persistentDict::226::Storage.PersistentDict::(refresh) read lines (FileMetadataRW)=['CLASS=Backup', 'DESCRIPTION=export-fra', 'IOOPTIMEOUTSEC=1', 'LEASERETRIES=3', 'LEASETIMESEC=5', 'LOCKPOLICY=', 'LOCKRENEWALINTERVALSEC=5', 'MASTER_VERSION=0', 'POOL_UUID=d95c53ca-9cef-4db2-8858-bf4937bd8c14', 'REMOTE_PATH=xxx.xxx.xxx.191:/data/ovirt-export-fra', 'ROLE=Regular', 'SDUUID=cd23808b-136a-4b33-a80c-f2581eab022d', 'TYPE=NFS', 'VERSION=0', '_SHA_CKSUM=dcc7846e2b8ba182cc07536ddc70e8234674faaf']
Thread-1902159::DEBUG::2013-01-24 17:11:42,411::fileSD::420::Storage.StorageDomain::(imageGarbageCollector) Removing remnants of deleted images []
Thread-1902159::WARNING::2013-01-24 17:11:42,411::sd::312::Storage.StorageDomain::(_registerResourceNamespaces) Resource namespace cd23808b-136a-4b33-a80c-f2581eab022d_imageNS already registered
Thread-1902159::WARNING::2013-01-24 17:11:42,411::sd::318::Storage.StorageDomain::(_registerResourceNamespaces) Resource namespace cd23808b-136a-4b33-a80c-f2581eab022d_volumeNS already registered
Thread-1902159::DEBUG::2013-01-24 17:11:42,411::safelease::85::ClusterLock::(acquire) Acquiring cluster lock for domain cd23808b-136a-4b33-a80c-f2581eab022d
Thread-1902159::DEBUG::2013-01-24 17:11:42,412::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/bin/sudo -n /usr/bin/setsid /usr/bin/ionice -c1 -n0 /bin/su vdsm -s /bin/sh -c "/usr/libexec/vdsm/spmprotect.sh start cd23808b-136a-4b33-a80c-f2581eab022d 1 5 /rhev/data-center/mnt/xxx.xxx.xxx.191:_data_ovirt-export-fra/cd23808b-136a-4b33-a80c-f2581eab022d/dom_md/leases 5000 1000 3"' (cwd /usr/libexec/vdsm)
Thread-1902159::DEBUG::2013-01-24 17:11:44,443::__init__::1164::Storage.Misc.excCmd::(_log) SUCCESS: <err> = ''; <rc> = 0
Thread-1902159::DEBUG::2013-01-24 17:11:44,444::safelease::100::ClusterLock::(acquire) Clustered lock acquired successfully
Thread-1902159::DEBUG::2013-01-24 17:11:44,446::persistentDict::226::Storage.PersistentDict::(refresh) read lines (FileMetadataRW)=['CLASS=Backup', 'DESCRIPTION=export-fra', 'IOOPTIMEOUTSEC=1', 'LEASERETRIES=3', 'LEASETIMESEC=5', 'LOCKPOLICY=', 'LOCKRENEWALINTERVALSEC=5', 'MASTER_VERSION=0', 'POOL_UUID=d95c53ca-9cef-4db2-8858-bf4937bd8c14', 'REMOTE_PATH=xxx.xxx.xxx.191:/data/ovirt-export-fra', 'ROLE=Regular', 'SDUUID=cd23808b-136a-4b33-a80c-f2581eab022d', 'TYPE=NFS', 'VERSION=0', '_SHA_CKSUM=dcc7846e2b8ba182cc07536ddc70e8234674faaf']
Thread-1902159::INFO::2013-01-24 17:11:44,447::safelease::110::ClusterLock::(release) Releasing cluster lock for domain cd23808b-136a-4b33-a80c-f2581eab022d
Thread-1902159::DEBUG::2013-01-24 17:11:44,447::__init__::1164::Storage.Misc.excCmd::(_log) '/usr/libexec/vdsm/spmstop.sh cd23808b-136a-4b33-a80c-f2581eab022d' (cwd /usr/libexec/vdsm)
Thread-1902159::DEBUG::2013-01-24 17:11:45,473::__init__::1164::Storage.Misc.excCmd::(_log) SUCCESS: <err> = ''; <rc> = 0
Thread-1902159::DEBUG::2013-01-24 17:11:45,474::safelease::117::ClusterLock::(release) Cluster lock released successfully
Thread-1902159::ERROR::2013-01-24 17:11:45,474::task::853::TaskManager.Task::(_setError) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::Unexpected error
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/task.py", line 861, in _run
    return fn(*args, **kargs)
  File "/usr/share/vdsm/logUtils.py", line 38, in wrapper
    res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 960, in attachStorageDomain
    pool.attachSD(sdUUID)
  File "/usr/share/vdsm/storage/securable.py", line 63, in wrapper
    return f(self, *args, **kwargs)
  File "/usr/share/vdsm/storage/sp.py", line 924, in attachSD
    dom.attach(self.spUUID)
  File "/usr/share/vdsm/storage/sd.py", line 442, in attach
    raise se.StorageDomainAlreadyAttached(pools[0], self.sdUUID)
StorageDomainAlreadyAttached: Storage domain already attached to pool: 'domain=cd23808b-136a-4b33-a80c-f2581eab022d, pool=d95c53ca-9cef-4db2-8858-bf4937bd8c14'
Thread-1902159::DEBUG::2013-01-24 17:11:45,474::task::872::TaskManager.Task::(_run) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::Task._run: 4bc15024-7917-4599-988f-2784ce43fbe7 ('cd23808b-136a-4b33-a80c-f2581eab022d', '5874cb6d-080b-4573-97d2-c926dae1fbc9') {} failed - stopping task
Thread-1902159::DEBUG::2013-01-24 17:11:45,474::task::1199::TaskManager.Task::(stop) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::stopping in state preparing (force False)
Thread-1902159::DEBUG::2013-01-24 17:11:45,474::task::978::TaskManager.Task::(_decref) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::ref 1 aborting True
Thread-1902159::INFO::2013-01-24 17:11:45,475::task::1157::TaskManager.Task::(prepare) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::aborting: Task is aborted: 'Storage domain already attached to pool' - code 380
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::task::1162::TaskManager.Task::(prepare) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::Prepare: aborted: Storage domain already attached to pool
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::task::978::TaskManager.Task::(_decref) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::ref 0 aborting True
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::task::913::TaskManager.Task::(_doAbort) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::Task._doAbort: force False
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::task::588::TaskManager.Task::(_updateState) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::moving from state preparing -> state aborting
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::task::537::TaskManager.Task::(__state_aborting) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::_aborting: recover policy none
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::task::588::TaskManager.Task::(_updateState) Task=`4bc15024-7917-4599-988f-2784ce43fbe7`::moving from state aborting -> state failed
Thread-1902159::DEBUG::2013-01-24 17:11:45,475::resourceManager::809::ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {'Storage.cd23808b-136a-4b33-a80c-f2581eab022d': < ResourceRef 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d', isValid: 'True' obj: 'None'>, 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9': < ResourceRef 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9', isValid: 'True' obj: 'None'>}
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::844::ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::538::ResourceManager::(releaseResource) Trying to release resource 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d'
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::553::ResourceManager::(releaseResource) Released resource 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d' (0 active users)
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::558::ResourceManager::(releaseResource) Resource 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d' is free, finding out if anyone is waiting for it.
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::565::ResourceManager::(releaseResource) No one is waiting for resource 'Storage.cd23808b-136a-4b33-a80c-f2581eab022d', Clearing records.
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::538::ResourceManager::(releaseResource) Trying to release resource 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9'
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::553::ResourceManager::(releaseResource) Released resource 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9' (0 active users)
Thread-1902159::DEBUG::2013-01-24 17:11:45,476::resourceManager::558::ResourceManager::(releaseResource) Resource 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9' is free, finding out if anyone is waiting for it.
Thread-1902159::DEBUG::2013-01-24 17:11:45,477::resourceManager::565::ResourceManager::(releaseResource) No one is waiting for resource 'Storage.5874cb6d-080b-4573-97d2-c926dae1fbc9', Clearing records.
Thread-1902159::ERROR::2013-01-24 17:11:45,477::dispatcher::66::Storage.Dispatcher.Protect::(run) {'status': {'message': "Storage domain already attached to pool: 'domain=cd23808b-136a-4b33-a80c-f2581eab022d, pool=d95c53ca-9cef-4db2-8858-bf4937bd8c14'", 'code': 380}}
Thread-1902167::DEBUG::2013-01-24 17:11:45,487::BindingXMLRPC::156::vds::(wrapper) [xxx.xxx.xxx.190]