Status: New
Owner: ----

New issue 635 by [email protected]: gnt-backup export fails
http://code.google.com/p/ganeti/issues/detail?id=635

What software version are you running? Please provide the output of
"gnt-cluster --version", "gnt-cluster version", and "hspace --version".

What distribution are you using?
root@node2 ~ # cat /etc/debian_version
7.1
root@node2 ~ # aptitude show ganeti
[...]
Version: 2.7.1-3~bpo70+1
[...]
root@node2 ~ # gnt-cluster version
Software version: 2.7.1
Internode protocol: 2070000
Configuration format: 2070000
OS api version: 20
Export interface: 0
root@node2 ~ # gnt-cluster --version
gnt-cluster (ganeti v2.7.1) 2.7.1


What steps will reproduce the problem?
A. root@node2 ~ # gnt-backup export -n node2 web01
Mon Dec  9 21:59:37 2013 Shutting down instance web01.test.example.de
Mon Dec  9 21:59:43 2013 Creating a snapshot of disk/0 on node node2.test.example.de
Mon Dec  9 21:59:45 2013 Starting instance web01.test.example.de
Mon Dec  9 21:59:47 2013 Exporting snapshot/0 from node2.test.example.de to node2.test.example.de
Mon Dec  9 21:59:51 2013 snapshot/0 is now listening, starting export
Mon Dec  9 21:59:54 2013 snapshot/0 finished receiving data
Mon Dec  9 21:59:55 2013 - WARNING: export 'export-disk0-2013-12-09_21_59_53-IEzlwz' on node2.test.example.de failed: Exited with status 1
Mon Dec  9 21:59:55 2013 snapshot/0 failed to send data: Exited with status 1 (recent output: )
Mon Dec  9 21:59:55 2013 Removing snapshot of disk/0 on node node2.test.example.de
Mon Dec  9 21:59:59 2013 Finalizing export on node2.test.example.de
Failure: command execution error:
Export failed, errors in disk export: disk(s) 0

OR

B. root@node2 ~ # gnt-backup export -n node2 freebsd10
Mon Dec  9 21:57:48 2013 Shutting down instance freebsd10.test.example.de
Mon Dec  9 21:57:59 2013 Creating a snapshot of disk/0 on node node2.test.example.de
Mon Dec  9 21:58:01 2013 Starting instance freebsd10.test.example.de
Mon Dec  9 21:58:03 2013 Exporting snapshot/0 from node2.test.example.de to node2.test.example.de
Mon Dec  9 21:58:06 2013 snapshot/0 is now listening, starting export
Mon Dec  9 21:58:10 2013 snapshot/0 finished receiving data
Mon Dec  9 21:58:10 2013 - WARNING: export 'export-disk0-2013-12-09_21_58_09-ZPKeUB' on node2.test.example.de failed: Exited with status 1
Mon Dec  9 21:58:11 2013 snapshot/0 failed to send data: Exited with status 1 (recent output: device-mapper: remove ioctl on raidLVM-7f4fcc09--54a8--4a1d--8bd9--9b41057a7e38.disk0_data.snap-3 failed: Device or resource busy\ndevice-mapper: remove ioctl on raidLVM-7f4fcc09--54a8--4a1d--8bd9--9b41057a7e38.disk0_data.snap-2 failed: Device or resource busy)
Mon Dec  9 21:58:11 2013 Removing snapshot of disk/0 on node node2.test.example.de
Mon Dec  9 21:58:12 2013 - WARNING: Could not remove snapshot for disk/0 from node node2.test.example.de: Can't lvremove: exited with exit code 5 - Logical volume raidLVM/7f4fcc09-54a8-4a1d-8bd9-9b41057a7e38.disk0_data.snap is used by another device.
Mon Dec  9 21:58:12 2013 Finalizing export on node2.test.example.de
Mon Dec  9 21:58:12 2013 Removing snapshot of disk/0 on node node2.test.example.de
Mon Dec  9 21:58:13 2013 - WARNING: Could not remove snapshot for disk/0 from node node2.test.example.de: Can't lvremove: exited with exit code 5 - Logical volume raidLVM/7f4fcc09-54a8-4a1d-8bd9-9b41057a7e38.disk0_data.snap is used by another device.
Failure: command execution error:
Export failed, errors in disk export: disk(s) 0
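In both cases the export helper exits with status 1; in case B the "Device or resource busy" messages show that the device-mapper nodes belonging to the snapshot could not be torn down. A possible way to see what still holds the snapshot device open (hypothetical session; device names copied from the error output above and specific to this cluster) would be:

root@node2 ~ # dmsetup info -c | grep disk0_data.snap
root@node2 ~ # fuser -v /dev/mapper/raidLVM-7f4fcc09--54a8--4a1d--8bd9--9b41057a7e38.disk0_data.snap
root@node2 ~ # lsof /dev/raidLVM/7f4fcc09-54a8-4a1d-8bd9-9b41057a7e38.disk0_data.snap

These commands only inspect the mappings and the processes using them; they do not change anything on the system.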


What is the expected output? What do you see instead?
Expected: a completed export (backup) of the instance under /var/lib/ganeti/export.
Instead: the export aborts with "errors in disk export: disk(s) 0" and no usable
backup is created - the exported disk image is a 0-byte file (see below).


Please provide any additional information below.
root@node2 ~ # df -h
[...]
/dev/mapper/raidLVM-backup 99G 188M 94G 1% /var/lib/ganeti/export

root@node2 ~ # l /var/lib/ganeti/export/web01.test.example.de
total 4
-rw------- 1 root root    0 Dec  9 21:59 992efdea-b3f5-42b2-a204-363bd08ecf90.disk0_data.snap
-rw------- 1 root root 1769 Dec  9 21:59 config.ini

root@node2 ~ # lvs
[...]
7f4fcc09-54a8-4a1d-8bd9-9b41057a7e38.disk0_data.snap raidLVM swi-aos- 20,00g 7f4fcc09-54a8-4a1d-8bd9-9b41057a7e38.disk0_data 0,01
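The "o" in the lvs attribute string "swi-aos-" means the snapshot volume is still open, which matches the lvremove failure ("is used by another device"). Assuming nothing legitimate still uses the device, a manual cleanup of the leftover snapshot might look like the following (hypothetical session; mapping names taken from the error messages above - removing mappings that are actually in use can corrupt data):

root@node2 ~ # dmsetup remove raidLVM-7f4fcc09--54a8--4a1d--8bd9--9b41057a7e38.disk0_data.snap-3
root@node2 ~ # dmsetup remove raidLVM-7f4fcc09--54a8--4a1d--8bd9--9b41057a7e38.disk0_data.snap-2
root@node2 ~ # lvremove raidLVM/7f4fcc09-54a8-4a1d-8bd9-9b41057a7e38.disk0_data.snap

The "-2"/"-3" suffixed nodes look like partition mappings on top of the snapshot, so if they were created by kpartx, "kpartx -d" on the snapshot device would be the cleaner way to drop them before the lvremove.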

