OK, so as I said I was going to do, I've now gone through the logs.

I've placed the log files in Dropbox 
(https://www.dropbox.com/sh/eymwdy8hzn3sa7z/AACscSP2eaFfoiN-QzyeEVfaa?dl=0)

There was only one significant part of the logs (at least, that's how it 
appears to me), and I've included that extract below:

ovirt-hosted-engine-setup-ansible-bootstrap_local_vm-...log Extract

~~~
2022-11-01 21:34:57,395+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Notify the user about a 
failure'}
2022-11-01 21:34:57,395+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Notify the user about a failure  kwargs 
is_conditional:False 
2022-11-01 21:34:57,396+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Notify the user about a failure  kwargs 
2022-11-01 21:34:57,875+1100 INFO ansible skipped {'status': 'SKIPPED', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'Notify the user about a failure', 'ansible_host': 'localhost'}
2022-11-01 21:34:57,876+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b21370ee0>  kwargs 
2022-11-01 21:34:58,359+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Set host_id'}
2022-11-01 21:34:58,359+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Set host_id  kwargs is_conditional:False 
2022-11-01 21:34:58,360+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Set host_id  kwargs 
2022-11-01 21:34:58,844+1100 DEBUG var changed: host "localhost" var "host_id" 
type "<class 'ansible.utils.unsafe_proxy.AnsibleUnsafeText'>" value: 
""eb33e62a-2929-499f-80de-b7ac38a075f5""
2022-11-01 21:34:58,844+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': 'Set host_id', 'task_duration': 0}
2022-11-01 21:34:58,844+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b214c4a00>  kwargs 
2022-11-01 21:34:59,288+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Collect error events from 
the Engine'}
2022-11-01 21:34:59,289+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Collect error events from the Engine  kwargs 
is_conditional:False 
2022-11-01 21:34:59,290+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Collect error events from the Engine  kwargs 
2022-11-01 21:35:00,157+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': 'Collect error events from the 
Engine', 'task_duration': 1}
2022-11-01 21:35:00,157+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b214ae760>  kwargs 
2022-11-01 21:35:00,625+1100 DEBUG var changed: host "localhost" var 
"error_events" type "<class 'dict'>" value: "{
    "changed": false,
    "failed": false,
    "ovirt_events": [
        {
            "cluster": {
                "href": 
"/ovirt-engine/api/clusters/c44e2594-989d-4f1e-8308-feec46918d67",
                "id": "c44e2594-989d-4f1e-8308-feec46918d67",
                "name": "my_cluster_1"
            },
            "code": 532,
            "custom_id": -1,
            "description": "Used memory of host ovirt_node_1.mynet.local in 
cluster my_cluster_1 [100%] exceeded defined threshold [95%].",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/142",
            "id": "142",
            "index": 142,
            "origin": "oVirt",
            "severity": "warning",
            "time": "2022-11-01 21:34:57.640000+11:00"
        },
        {
            "cluster": {
                "href": 
"/ovirt-engine/api/clusters/c44e2594-989d-4f1e-8308-feec46918d67",
                "id": "c44e2594-989d-4f1e-8308-feec46918d67",
                "name": "my_cluster_1"
            },
            "code": 519,
            "correlation_id": "65a04e79",
            "custom_id": -1,
            "description": "Host ovirt_node_1.mynet.local does not comply with 
the cluster my_cluster_1 networks, the following networks are missing on host: 
'ovirtmgmt'",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/140",
            "id": "140",
            "index": 140,
            "origin": "oVirt",
            "severity": "warning",
            "time": "2022-11-01 21:34:57.404000+11:00",
            "user": {
                "name": "SYSTEM"
            }
        },
        {
            "cluster": {
                "href": 
"/ovirt-engine/api/clusters/c44e2594-989d-4f1e-8308-feec46918d67",
                "id": "c44e2594-989d-4f1e-8308-feec46918d67",
                "name": "my_cluster_1"
            },
            "code": 505,
            "correlation_id": "8a77e33b-64f8-409b-abe1-78a37ae6df4c",
            "custom_id": -1,
            "description": "Host ovirt_node_1.mynet.local installation failed. 
Failed to configure management network on the host.",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/137",
            "id": "137",
            "index": 137,
            "origin": "oVirt",
            "severity": "error",
            "time": "2022-11-01 21:34:51.506000+11:00",
            "user": {
                "href": 
"/ovirt-engine/api/users/66aaaec1-29f1-4e4f-a8c6-2a9c7f3319c2",
                "id": "66aaaec1-29f1-4e4f-a8c6-2a9c7f3319c2",
                "name": "admin@ovirt@internalkeycloak-authz"
            }
        },
        {
            "code": 1120,
            "custom_id": -1,
            "description": "Failed to configure management network on host 
ovirt_node_1.mynet.local due to setup networks failure.",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/136",
            "id": "136",
            "index": 136,
            "origin": "oVirt",
            "severity": "error",
            "time": "2022-11-01 21:34:51.413000+11:00"
        },
        {
            "code": 10802,
            "custom_id": -1,
            "description": "VDSM ovirt_node_1.mynet.local command 
HostSetupNetworksVDS failed: Internal JSON-RPC error: {'reason': 
\"\\ndesired\\n=======\\n---\\nname: bond_1\\ntype: bond\\nstate: up\\nipv4:\\n 
 enabled: false\\nipv6:\\n  enabled: false\\nlink-aggregation:\\n  mode: 
802.3ad\\n  options:\\n    downdelay: 1000\\n    miimon: 1000\\n    updelay: 
1000\\n  port:\\n  - eno1\\nmac-address: 3C:EC:EF:83:77:4C\\nmtu: 
1500\\n\\ncurrent\\n=======\\n---\\nname: bond_1\\ntype: bond\\nstate: 
up\\naccept-all-mac-addresses: false\\nethtool:\\n  feature:\\n    
esp-hw-offload: false\\n    esp-tx-csum-hw-offload: false\\n    highdma: 
true\\n    rx-gro: true\\n    rx-gro-list: false\\n    rx-lro: false\\n    
rx-udp-gro-forwarding: false\\n    rx-vlan-filter: true\\n    rx-vlan-hw-parse: 
true\\n    tx-checksum-ip-generic: true\\n    tx-esp-segmentation: false\\n    
tx-generic-segmentation: true\\n    tx-gre-csum-segmentation: true\\n    
tx-gre-segmentation: true\\n    tx-gso-list: false\\n
     tx-ipxip4-segmentation: true\\n    tx-ipxip6-segmentation: true\\n    
tx-nocache-copy: false\\n    tx-scatter-gather-fraglist: false\\n    
tx-sctp-segmentation: false\\n    tx-tcp-ecn-segmentation: true\\n    
tx-tcp-mangleid-segmentation: true\\n    tx-tcp-segmentation: true\\n    
tx-tcp6-segmentation: true\\n    tx-udp-segmentation: true\\n    
tx-udp_tnl-csum-segmentation: true\\n    tx-udp_tnl-segmentation: 
true\\nipv4:\\n  enabled: false\\nipv6:\\n  enabled: 
false\\nlink-aggregation:\\n  mode: 802.3ad\\n  options:\\n    
ad_actor_sys_prio: 65535\\n    ad_actor_system: 00:00:00:00:00:00\\n    
ad_select: stable\\n    ad_user_port_key: 0\\n    all_slaves_active: dropped\\n 
   arp_all_targets: any\\n    arp_interval: 0\\n    arp_ip_target: ''\\n    
arp_validate: none\\n    downdelay: 1000\\n    lacp_rate: slow\\n    miimon: 
1000\\n    min_links: 0\\n    updelay: 1000\\n    use_carrier: true\\n    
xmit_hash_policy: layer2\\n  port:\\n  - eno1\\n  - eno2\\nlldp:\\n  enabled: 
false\\nmac-address: 3C:EC:EF:83:77:4C\\nmtu: 1500\\n\\ndifference\\n==========\\n--- 
desired\\n+++ current\\n@@ -2,6 +2,36 @@\\n name: bond_1\\n type: bond\\n 
state: up\\n+accept-all-mac-addresses: false\\n+ethtool:\\n+  feature:\\n+    
esp-hw-offload: false\\n+    esp-tx-csum-hw-offload: false\\n+    highdma: 
true\\n+    rx-gro: true\\n+    rx-gro-list: false\\n+    rx-lro: false\\n+    
rx-udp-gro-forwarding: false\\n+    rx-vlan-filter: true\\n+    
rx-vlan-hw-parse: true\\n+    tx-checksum-ip-generic: true\\n+    
tx-esp-segmentation: false\\n+    tx-generic-segmentation: true\\n+    
tx-gre-csum-segmentation: true\\n+    tx-gre-segmentation: true\\n+    
tx-gso-list: false\\n+    tx-ipxip4-segmentation: true\\n+    
tx-ipxip6-segmentation: true\\n+    tx-nocache-copy: false\\n+    
tx-scatter-gather-fraglist: false\\n+    tx-sctp-segmentation: false\\n+    
tx-tcp-ecn-segmentation: true\\n+    tx-tcp-mangleid-segmentation: true\\n+    
tx-tcp-segmentation: true\\n+    tx-tcp6-segmentation: true\\n+    
tx-udp-segmentation: true\\n+    tx-udp_tnl-csum-segmentation: 
true\\n+    tx-udp_tnl-segmentation: true\\n ipv4:\\n   enabled: false\\n 
ipv6:\\n@@ -9,10 +39,26 @@\\n link-aggregation:\\n   mode: 802.3ad\\n   
options:\\n+    ad_actor_sys_prio: 65535\\n+    ad_actor_system: 
00:00:00:00:00:00\\n+    ad_select: stable\\n+    ad_user_port_key: 0\\n+    
all_slaves_active: dropped\\n+    arp_all_targets: any\\n+    arp_interval: 
0\\n+    arp_ip_target: ''\\n+    arp_validate: none\\n     downdelay: 1000\\n+ 
   lacp_rate: slow\\n     miimon: 1000\\n+    min_links: 0\\n     updelay: 
1000\\n+    use_carrier: true\\n+    xmit_hash_policy: layer2\\n   port:\\n   - 
eno1\\n+  - eno2\\n+lldp:\\n+  enabled: false\\n mac-address: 
3C:EC:EF:83:77:4C\\n mtu: 1500\\n\\n\"}",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/135",
            "id": "135",
            "index": 135,
            "origin": "oVirt",
            "severity": "error",
            "time": "2022-11-01 21:34:51.387000+11:00"
        },
        {
            "cluster": {
                "href": 
"/ovirt-engine/api/clusters/c44e2594-989d-4f1e-8308-feec46918d67",
                "id": "c44e2594-989d-4f1e-8308-feec46918d67",
                "name": "my_cluster_1"
            },
            "code": 553,
            "correlation_id": "8a77e33b-64f8-409b-abe1-78a37ae6df4c",
            "custom_id": -1,
            "description": "Installing Host ovirt_node_1.mynet.local. Check for 
LVM filter configuration error: Cannot configure LVM filter on host, please 
run: vdsm-tool config-lvm-filter.",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/59",
            "id": "59",
            "index": 59,
            "origin": "oVirt",
            "severity": "error",
            "time": "2022-11-01 21:32:19.692000+11:00"
        },
        {
            "code": 9000,
            "custom_id": -1,
            "description": "Failed to verify Power Management configuration for 
Host ovirt_node_1.mynet.local.",
            "flood_rate": 0,
            "host": {
                "href": 
"/ovirt-engine/api/hosts/eb33e62a-2929-499f-80de-b7ac38a075f5",
                "id": "eb33e62a-2929-499f-80de-b7ac38a075f5",
                "name": "ovirt_node_1.mynet.local"
            },
            "href": "/ovirt-engine/api/events/13",
            "id": "13",
            "index": 13,
            "origin": "oVirt",
            "severity": "alert",
            "time": "2022-11-01 21:31:44.645000+11:00"
        },
        {
            "code": 11291,
            "custom_id": -1,
            "description": "Update to network ovirtmgmt was not applied to 
virtual network interfaces [<UNKNOWN>]. The actual configuration on the 
interfaces may differ from the displayed one.",
            "flood_rate": 0,
            "href": "/ovirt-engine/api/events/10",
            "id": "10",
            "index": 10,
            "origin": "oVirt",
            "severity": "alert",
            "time": "2022-11-01 21:31:35.606000+11:00"
        }
    ]
}"
2022-11-01 21:35:00,625+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Generate the error message 
from the engine events'}
2022-11-01 21:35:00,625+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Generate the error message from the engine 
events  kwargs is_conditional:False 
2022-11-01 21:35:00,626+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Generate the error message from the engine 
events  kwargs 
2022-11-01 21:35:01,125+1100 DEBUG var changed: host "localhost" var 
"error_description" type "<class 
'ansible.utils.unsafe_proxy.AnsibleUnsafeText'>" value: ""  code 505: Host 
ovirt_node_1.mynet.local installation failed. Failed to configure management 
network on the host.,    code 519: Host ovirt_node_1.mynet.local does not 
comply with the cluster my_cluster_1 networks, the following networks are 
missing on host: 'ovirtmgmt',    code 532: Used memory of host 
ovirt_node_1.mynet.local in cluster my_cluster_1 [100%] exceeded defined 
threshold [95%].,    code 553: Installing Host ovirt_node_1.mynet.local. Check 
for LVM filter configuration error: Cannot configure LVM filter on host, please 
run: vdsm-tool config-lvm-filter.,    code 1120: Failed to configure management 
network on host ovirt_node_1.mynet.local due to setup networks failure.,    
code 9000: Failed to verify Power Management configuration for Host 
ovirt_node_1.mynet.local.,    code 10802: VDSM ovirt_node_1.mynet.local 
command HostSetupNetworksVDS failed: Internal JSON-RPC error: {'reason': 
\"\\ndesired\\n=======\\n---\\nname: bond_1\\ntype: bond\\nstate: up\\nipv4:\\n 
 enabled: false\\nipv6:\\n  enabled: false\\nlink-aggregation:\\n  mode: 
802.3ad\\n  options:\\n    downdelay: 1000\\n    miimon: 1000\\n    updelay: 
1000\\n  port:\\n  - eno1\\nmac-address: 3C:EC:EF:83:77:4C\\nmtu: 
1500\\n\\ncurrent\\n=======\\n---\\nname: bond_1\\ntype: bond\\nstate: 
up\\naccept-all-mac-addresses: false\\nethtool:\\n  feature:\\n    
esp-hw-offload: false\\n    esp-tx-csum-hw-offload: false\\n    highdma: 
true\\n    rx-gro: true\\n    rx-gro-list: false\\n    rx-lro: false\\n    
rx-udp-gro-forwarding: false\\n    rx-vlan-filter: true\\n    rx-vlan-hw-parse: 
true\\n    tx-checksum-ip-generic: true\\n    tx-esp-segmentation: false\\n    
tx-generic-segmentation: true\\n    tx-gre-csum-segmentation: true\\n    
tx-gre-segmentation: true\\n    tx-gso-list: false\\n    
tx-ipxip4-segmentation: true\\n    tx-ipxip6-segmentation: true\\n    
tx-nocache-copy: false\\n    tx-scatter-gather-fraglist: false\\n  
  tx-sctp-segmentation: false\\n    tx-tcp-ecn-segmentation: true\\n    
tx-tcp-mangleid-segmentation: true\\n    tx-tcp-segmentation: true\\n    
tx-tcp6-segmentation: true\\n    tx-udp-segmentation: true\\n    
tx-udp_tnl-csum-segmentation: true\\n    tx-udp_tnl-segmentation: 
true\\nipv4:\\n  enabled: false\\nipv6:\\n  enabled: 
false\\nlink-aggregation:\\n  mode: 802.3ad\\n  options:\\n    
ad_actor_sys_prio: 65535\\n    ad_actor_system: 00:00:00:00:00:00\\n    
ad_select: stable\\n    ad_user_port_key: 0\\n    all_slaves_active: dropped\\n 
   arp_all_targets: any\\n    arp_interval: 0\\n    arp_ip_target: ''\\n    
arp_validate: none\\n    downdelay: 1000\\n    lacp_rate: slow\\n    miimon: 
1000\\n    min_links: 0\\n    updelay: 1000\\n    use_carrier: true\\n    
xmit_hash_policy: layer2\\n  port:\\n  - eno1\\n  - eno2\\nlldp:\\n  enabled: 
false\\nmac-address: 3C:EC:EF:83:77:4C\\nmtu: 1500\\n\\ndifference\\n==========\\n--- 
desired\\n+++ current\\n@@ -2,6 +2,36 @@\\n name: bond_1\\n 
type: bond\\n state: up\\n+accept-all-mac-addresses: false\\n+ethtool:\\n+  
feature:\\n+    esp-hw-offload: false\\n+    esp-tx-csum-hw-offload: false\\n+  
  highdma: true\\n+    rx-gro: true\\n+    rx-gro-list: false\\n+    rx-lro: 
false\\n+    rx-udp-gro-forwarding: false\\n+    rx-vlan-filter: true\\n+    
rx-vlan-hw-parse: true\\n+    tx-checksum-ip-generic: true\\n+    
tx-esp-segmentation: false\\n+    tx-generic-segmentation: true\\n+    
tx-gre-csum-segmentation: true\\n+    tx-gre-segmentation: true\\n+    
tx-gso-list: false\\n+    tx-ipxip4-segmentation: true\\n+    
tx-ipxip6-segmentation: true\\n+    tx-nocache-copy: false\\n+    
tx-scatter-gather-fraglist: false\\n+    tx-sctp-segmentation: false\\n+    
tx-tcp-ecn-segmentation: true\\n+    tx-tcp-mangleid-segmentation: true\\n+    
tx-tcp-segmentation: true\\n+    tx-tcp6-segmentation: true\\n+    
tx-udp-segmentation: true\\n+    tx-udp_tnl-csum-segmentation: true\\n+    
tx-udp_tnl-segmentation: true\\n ipv4:\\n   enabled: 
false\\n ipv6:\\n@@ -9,10 +39,26 @@\\n link-aggregation:\\n   mode: 802.3ad\\n  
 options:\\n+    ad_actor_sys_prio: 65535\\n+    ad_actor_system: 
00:00:00:00:00:00\\n+    ad_select: stable\\n+    ad_user_port_key: 0\\n+    
all_slaves_active: dropped\\n+    arp_all_targets: any\\n+    arp_interval: 
0\\n+    arp_ip_target: ''\\n+    arp_validate: none\\n     downdelay: 1000\\n+ 
   lacp_rate: slow\\n     miimon: 1000\\n+    min_links: 0\\n     updelay: 
1000\\n+    use_carrier: true\\n+    xmit_hash_policy: layer2\\n   port:\\n   - 
eno1\\n+  - eno2\\n+lldp:\\n+  enabled: false\\n mac-address: 
3C:EC:EF:83:77:4C\\n mtu: 1500\\n\\n\"},    ""
2022-11-01 21:35:01,125+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': 'Generate the error message from 
the engine events', 'task_duration': 0}
2022-11-01 21:35:01,125+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b21370ee0>  kwargs 
2022-11-01 21:35:01,583+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Notify with error 
description'}
2022-11-01 21:35:01,583+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Notify with error description  kwargs 
is_conditional:False 
2022-11-01 21:35:01,584+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Notify with error description  kwargs 
2022-11-01 21:35:02,032+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': 'Notify with error description', 
'task_duration': 0}
2022-11-01 21:35:02,032+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b214ae790>  kwargs 
2022-11-01 21:35:02,511+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Notify with generic error'}
2022-11-01 21:35:02,511+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Notify with generic error  kwargs 
is_conditional:False 
2022-11-01 21:35:02,512+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Notify with generic error  kwargs 
2022-11-01 21:35:02,954+1100 INFO ansible skipped {'status': 'SKIPPED', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'Notify with generic error', 'ansible_host': 'localhost'}
2022-11-01 21:35:02,954+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b21077520>  kwargs 
2022-11-01 21:35:03,401+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Let the user connect to the 
bootstrap engine to manually fix host configuration'}
2022-11-01 21:35:03,402+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Let the user connect to the bootstrap engine 
to manually fix host configuration  kwargs is_conditional:False 
2022-11-01 21:35:03,402+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Let the user connect to the bootstrap engine 
to manually fix host configuration  kwargs 
2022-11-01 21:35:03,843+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': 'Let the user connect to the 
bootstrap engine to manually fix host configuration', 'task_duration': 0}
2022-11-01 21:35:03,844+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b214ae550>  kwargs 
2022-11-01 21:35:04,322+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : include_tasks'}
2022-11-01 21:35:04,323+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : include_tasks  kwargs is_conditional:False 
2022-11-01 21:35:04,323+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : include_tasks  kwargs 
2022-11-01 21:35:04,772+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': '', 'task_duration': 0}
2022-11-01 21:35:04,772+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b2154e700>  kwargs 
2022-11-01 21:35:04,798+1100 DEBUG ansible on_any args 
/usr/share/ansible/collections/ansible_collections/ovirt/ovirt/roles/hosted_engine_setup/tasks/pause_execution.yml
 (args={} vars={}): [localhost]  kwargs 
2022-11-01 21:35:05,260+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Create temporary lock file'}
2022-11-01 21:35:05,260+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Create temporary lock file  kwargs 
is_conditional:False 
2022-11-01 21:35:05,261+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Create temporary lock file  kwargs 
2022-11-01 21:35:05,889+1100 INFO ansible ok {'status': 'OK', 'ansible_type': 
'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_host': 'localhost', 'ansible_task': 'Create temporary lock file', 
'task_duration': 1}
2022-11-01 21:35:05,890+1100 DEBUG ansible on_any args 
<ansible.executor.task_result.TaskResult object at 0x7f0b2120eca0>  kwargs 
2022-11-01 21:35:06,357+1100 DEBUG var changed: host "localhost" var 
"he_setup_lock_file" type "<class 'dict'>" value: "{
    "changed": true,
    "failed": false,
    "gid": 0,
    "group": "root",
    "mode": "0600",
    "owner": "root",
    "path": "/tmp/ansible.volt5pvv_he_setup_lock",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 0,
    "state": "file",
    "uid": 0
}"
2022-11-01 21:35:06,358+1100 INFO ansible task start {'status': 'OK', 
'ansible_type': 'task', 'ansible_playbook': 
'/usr/share/ovirt-hosted-engine-setup/he_ansible/trigger_role.yml', 
'ansible_task': 'ovirt.ovirt.hosted_engine_setup : Pause execution until 
/tmp/ansible.volt5pvv_he_setup_lock is removed, delete it once ready to 
proceed'}
2022-11-01 21:35:06,358+1100 DEBUG ansible on_any args TASK: 
ovirt.ovirt.hosted_engine_setup : Pause execution until 
/tmp/ansible.volt5pvv_he_setup_lock is removed, delete it once ready to proceed 
 kwargs is_conditional:False 
2022-11-01 21:35:06,359+1100 DEBUG ansible on_any args localhost TASK: 
ovirt.ovirt.hosted_engine_setup : Pause execution until {{ 
he_setup_lock_file.path }} is removed, delete it once ready to proceed  kwargs 
~~~

So from this I can see several things:

1) There's a warning about memory usage of the host (it's got 64 GiB, BTW).
2) There's a warning that 'ovirtmgmt' is missing from the host (I thought 
that network was created automatically; the first sketch after this list 
checks what the host actually has).
3) There's an error that: "Host ovirt_node_1.mynet.local installation failed. 
Failed to configure management network on the host." (this, I assume, is 
related to the above).
4) There's an error that: "Failed to configure management network on host 
ovirt_node_1.mynet.local due to setup networks failure."
5) There's an error that VDSM doesn't "like" the bond that I've got set up. I 
can't work out why, though, because what the script requires appears to be met 
and exceeded by what is actually there (at least, that's how I read the 
desired/current diff - but, as always, I may be wrong). The second sketch 
after this list is a way to test this by hand.
6) There's an error that: "Installing Host ovirt_node_1.mynet.local. Check for 
LVM filter configuration error: Cannot configure LVM filter on host, please 
run: vdsm-tool config-lvm-filter." Running that command tells me that there 
are no LVM filters in place - no, I haven't put them in place yet: I'm waiting 
to hear what is said here (the third sketch after this list shows the 
preview/apply steps).
7) There's an alert that: "Failed to verify Power Management configuration for 
Host ovirt_node_1.mynet.local." I haven't set up Power Management yet, as I 
thought that needed to be done from inside the Engine, & I'm not at that stage 
yet.
8) There's an alert that: "Update to network ovirtmgmt was not applied to 
virtual network interfaces [<UNKNOWN>]. The actual configuration on the 
interfaces may differ from the displayed one."
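
For points 2-4, a quick check (my own sketch, nothing the installer asks for) 
of what the host actually has, to see whether the ovirtmgmt bridge ever got 
created:

~~~
# Does the ovirtmgmt bridge exist on the host, and does VDSM know about it?
ip -br link show
vdsm-tool list-nets
~~~

For point 5, one way to test this by hand (again my own sketch) is to dump 
the bond state nmstate sees and then try applying the "desired" half of the 
diff directly, with automatic rollback:

~~~
# Dump the current state nmstate sees for the bond named in the log.
nmstatectl show bond_1

# Save the "desired" half of the diff and apply it with automatic rollback
# (--no-commit reverts the change unless it is explicitly committed).
cat > /tmp/bond_1_desired.yml <<'EOF'
interfaces:
- name: bond_1
  type: bond
  state: up
  ipv4:
    enabled: false
  ipv6:
    enabled: false
  link-aggregation:
    mode: 802.3ad
    options:
      downdelay: 1000
      miimon: 1000
      updelay: 1000
    port:
    - eno1
  mac-address: 3C:EC:EF:83:77:4C
  mtu: 1500
EOF
nmstatectl apply --no-commit /tmp/bond_1_desired.yml
~~~

For point 6, the command from the event is interactive: it prints the LVM 
filter it wants to configure and asks before touching anything, so previewing 
it is safe:

~~~
# Preview the LVM filter VDSM wants (prompts before changing anything),
# then apply it non-interactively once the proposed filter looks sane.
vdsm-tool config-lvm-filter
vdsm-tool config-lvm-filter -y
~~~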

Also, I haven't been able to connect to the Web GUI. I suspect that it's some 
sort of firewall/connection issue, because I can't ping the Engine from the 
network, but I can ping it from the host (which is headless, so no GUI on 
it), and from the Engine I can ping any node on the network. Not sure how to 
fix this one.
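
If I understand the deploy flow correctly (and this is an assumption on my 
part), at this stage the bootstrap engine VM is still running on a NATed 
libvirt network local to the host, which would explain exactly this host-only 
reachability. Some quick checks from the host:

~~~
# Is a local NATed libvirt network active, and what address did the
# bootstrap VM get? ("HostedEngineLocal" is my guess at the domain name;
# "virsh -r list" will show the actual one.)
virsh -r net-list --all
virsh -r domifaddr HostedEngineLocal

# Is that subnet routed/masqueraded anywhere off-host?
ip route
firewall-cmd --list-all
~~~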

Also, unless I've got my wires completely crossed, the port that we're 
supposed to connect to on the Engine is port 9090 (cockpit), and there's a 
recent issue (like, literally in the last 15-odd days) with cockpit 
complaining about and refusing to work with the latest versions of Firefox & 
Chrome (https://cockpit-project.org/blog/login-issues.html). It looks like 
there's a fix (I just checked when I grabbed the URL), but is that new 
version in the (EL8) repos?
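
A quick way to compare what's installed against what the repos currently 
offer (cockpit-ws is the package serving the web login, if I have that 
right):

~~~
# Installed cockpit web service vs. what the EL8 repos have on offer.
rpm -q cockpit-ws
dnf --refresh info cockpit-ws
~~~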

So, what I'm going to do is strip everything (software-wise) out of the 3 
self-hosted nodes and start from scratch - tomorrow (it's 21:00 local here at 
the moment). In the meantime, if people would be kind enough to comment with 
fixes/suggestions, I'd appreciate it. :-)
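
As a possible shortcut before the full wipe: as far as I know, 
ovirt-hosted-engine-setup ships a cleanup tool that tears down a failed 
deployment on the host, so I may try that first:

~~~
# Tear down the failed hosted-engine deployment on this host (local VM,
# storage config, VDSM state) before re-running the deploy.
ovirt-hosted-engine-cleanup
~~~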

Cheers

Dulux-Oz