David Johnson <djohn...@maxistechnology.com> writes:

> Good afternoon all,
>
> We are trying to update our cluster from 4.5.2 to see if we can
> resolve some issues with the VM web console not functioning on VMs that
> run on one of the hosts. So there are two problems here; if we can resolve
> the first (dependency resolution), we are hoping that we can resolve the
> other with a reinstall of the software.
>
> *Symptoms:*
> 1. On VMs running on one host, the VM web console does not work.  The
> console.vv file downloads to the desktop and we can attempt to launch it.
> On launch, it immediately exits. The web console works on the VMs on the
> other host.
>
> 2. Attempting to update or reinstall the software on any host via the
> oVirt Compute -> Hosts -> Installation -> Reinstall or Upgrade menu, we
> get a dependency resolution error:
> package ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but
> none of the providers can be installed\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-117.el8s.x86_64

I don't know what the official solution is, but a workaround I use is
adding the following packages to the `exclude' option of the
[ovirt-*-centos-stream-openstack-yoga-testing] repo at the end of
/etc/yum.repos.d/ovirt-*-dependencies.repo on the host:

 rdo-openvswitch
 rdo-ovn
 rdo-ovn-host
 python3-rdo-openvswitch
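For clarity, the edited stanza would end up looking roughly like this.
The section id and the other keys shown here are illustrative (use the
actual yoga-testing section already present in your file); only the
`exclude' line is the change:

```ini
; hypothetical section id: match the openstack-yoga-testing section
; that already exists in /etc/yum.repos.d/ovirt-*-dependencies.repo
[ovirt-45-centos-stream-openstack-yoga-testing]
; ...keep the existing name=, baseurl=, enabled=, gpgcheck= lines...
exclude=rdo-openvswitch rdo-ovn rdo-ovn-host python3-rdo-openvswitch
```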

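If you have several hosts to patch, something like the following rough
sketch could apply the same edit programmatically. This is only an
illustration, not an official tool; the section-name substring, the file
glob, and the package list are assumptions to adjust for your setup:

```python
import configparser
import glob
import io

# Packages to keep dnf from pulling in from the yoga repos
# (same list as above; adjust as needed).
EXCLUDES = "rdo-openvswitch rdo-ovn rdo-ovn-host python3-rdo-openvswitch"

def add_excludes(repo_text, section_substr="openstack-yoga"):
    """Return repo-file text with an exclude= option set on every
    section whose id contains section_substr. Setting (rather than
    appending) the option makes this idempotent."""
    cp = configparser.ConfigParser()
    cp.read_string(repo_text)
    for section in cp.sections():
        if section_substr in section:
            cp.set(section, "exclude", EXCLUDES)
    buf = io.StringIO()
    cp.write(buf)
    return buf.getvalue()

if __name__ == "__main__":
    # Path pattern is an assumption; run on each affected host.
    for path in glob.glob("/etc/yum.repos.d/ovirt-*-dependencies.repo"):
        with open(path) as f:
            text = f.read()
        with open(path, "w") as f:
            f.write(add_excludes(text))
```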
Regards,
Milan

> It appears that the 4.5.2 build requires an older release of
> openvswitch?
>
> Please advise.
>
>
> *Environment:*
> Production environment: oVirt 4.5.2.4-1.el8
> 1 Standalone engine (upgraded to
> 3 hosts
>
> *Log excerpts*
> *VM Web Console Not Starting:*
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for fe80::f0ca:56ff:fe8c:7bb8 on veth8a5345f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for veth8a5345f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for fe80::d879:d5ff:fee4:1855 on vethc9bd12f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for vethc9bd12f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for fe80::42:15ff:fe5a:679 on br-1feb13c47a4f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for 172.22.0.1 on br-1feb13c47a4f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for br-1feb13c47a4f.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for 172.19.0.1 on docker_gwbridge.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for docker_gwbridge.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for 172.17.0.1 on docker0.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for docker0.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for 172.18.0.1 on br-4209e789b982.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for br-4209e789b982.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for virbr0-nic.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for 192.168.122.1 on virbr0.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for virbr0.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing address
> record for 192.168.2.163 on eth0.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for eth0.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Withdrawing
> workstation service for lo.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Host name conflict,
> retrying with cen-76-alc-qa-4236
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for fe80::f0ca:56ff:fe8c:7bb8 on veth8a5345f.*.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for fe80::d879:d5ff:fee4:1855 on vethc9bd12f.*.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for fe80::42:15ff:fe5a:679 on br-1feb13c47a4f.*.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for 172.22.0.1 on br-1feb13c47a4f.IPv4.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for 172.19.0.1 on docker_gwbridge.IPv4.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for 172.17.0.1 on docker0.IPv4.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for 172.18.0.1 on br-4209e789b982.IPv4.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for 192.168.122.1 on virbr0.IPv4.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for fe80::3a72:e773:4d49:55ba on eth0.*.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering new
> address record for 192.168.2.163 on eth0.IPv4.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Server startup
> complete. Host name is cen-76-alc-qa-4236.local. Local service cookie is
> 3875159144.
> Oct 18 13:22:02 cen-76-alc-qa-163 avahi-daemon[780]: Registering HINFO
> record with values 'X86_64'/'LINUX'.
> Oct 18 13:22:02 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:02.828860278-05:00" level=warning msg="grpc:
> addrConn.createTransport failed to connect to {192.168.2.162:2377  <nil> 0
> <nil>}. Err :connection error: desc = \"transport: Error while dialing dial
> tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
> module=grpc
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.695827803-05:00" level=error msg="agent: session
> failed" backoff=8s error="rpc error: code = Unavailable desc = connection
> error: desc = \"transport: Error while dialing dial tcp 192.168.2.162:2377:
> connect: connection refused\"" module=node/agent node.id
> =iz0s4qedzrsxkdcbq6nzvxyfj
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.695994815-05:00" level=info msg="parsed scheme:
> \"\"" module=grpc
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.696023188-05:00" level=info msg="scheme \"\" not
> registered, fallback to default scheme" module=grpc
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.696208750-05:00" level=info
> msg="ccResolverWrapper: sending update to cc: {[{192.168.2.162:2377  <nil>
> 0 <nil>}] <nil> <nil>}" module=grpc
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.696239844-05:00" level=info msg="ClientConn
> switching balancer to \"pick_first\"" module=grpc
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.696294355-05:00" level=info msg="manager selected
> by agent for new session: {z80ib665fps13yuqtzqqnltr6 192.168.2.162:2377}"
> module=node/agent node.id=iz0s4qedzrsxkdcbq6nzvxyfj
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.696355540-05:00" level=info msg="waiting
> 6.545410435s before registering session" module=node/agent node.id
> =iz0s4qedzrsxkdcbq6nzvxyfj
> Oct 18 13:22:04 cen-76-alc-qa-163 dockerd:
> time="2022-10-18T13:22:04.697629711-05:00" level=warning msg="grpc:
> addrConn.createTransport failed to connect to {192.168.2.162:2377  <nil> 0
> <nil>}. Err :connection error: desc = \"transport: Error while dialing dial
> tcp 192.168.2.162:2377: connect: connection refused\". Reconnecting..."
> module=grpc
> [... the same dockerd cycle (grpc "connection refused" warnings, "agent:
> session failed" errors, session re-registration) repeats every few
> seconds through 13:22:20; repeats trimmed ...]
>
> *Host log from update attempt:*
> 2022-10-17 13:57:22 CDT - {
>   "uuid" : "71edb069-f43c-452b-8528-a0c3d6625fd1",
>   "counter" : 175,
>   "stdout" : "fatal: [192.168.2.18]: FAILED! => {\"changed\": false,
> \"failures\": [], \"msg\": \"Depsolve Error occurred: \\n Problem 1:
> package ovirt-openvswitch-2.15-4.el8.noarch requires openvswitch2.15, but
> none of the providers can be installed\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-117.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-106.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-110.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-115.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-119.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-22.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-23.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-24.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-27.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-30.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-32.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-35.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-37.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-39.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-41.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-47.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-48.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-51.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-52.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-53.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-54.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-56.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-6.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-72.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-75.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-80.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-81.el8s.x86_64\\n  - package
> rdo-openvswitch-2:2.17-3.el8.noarch obsoletes openvswitch2.15 < 2.17
> provided by openvswitch2.15-2.15.0-88.el8s.x86_64\\n  - cannot install the
> best update candidate for package ovirt-openvswitch-2.15-4.el8.noarch\\n  -
> cannot install the best update candidate for package
> openvswitch2.15-2.15.0-117.el8s.x86_64\\n Problem 2: package
> python3-rdo-openvswitch-2:2.17-3.el8.noarch obsoletes
> python3-openvswitch2.15 < 2.17 provided by
> python3-openvswitch2.15-2.15.0-119.el8s.x86_64\\n  - package
> openvswitch2.15-ipsec-2.15.0-119.el8s.x86_64 requires
> python3-openvswitch2.15 = 2.15.0-119.el8s, but none of the providers can be
> installed\\n  - cannot install the best update candidate for package
> python3-openvswitch2.15-2.15.0-117.el8s.x86_64\\n  - cannot install the
> best update candidate for package
> openvswitch2.15-ipsec-2.15.0-117.el8s.x86_64\\n Problem 3: package
> ovirt-openvswitch-ovn-common-2.15-4.el8.noarch requires ovn-2021, but none
> of the providers can be installed\\n  - package
> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
> ovn-2021-21.12.0-82.el8s.x86_64\\n  - package rdo-ovn-2:22.06-3.el8.noarch
> obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.03.0-21.el8s.x86_64\\n
>  - package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided
> by ovn-2021-21.03.0-40.el8s.x86_64\\n  - package
> rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided by
> ovn-2021-21.06.0-17.el8s.x86_64\\n  - package rdo-ovn-2:22.06-3.el8.noarch
> obsoletes ovn-2021 < 22.06 provided by ovn-2021-21.06.0-29.el8s.x86_64\\n
>  - package rdo-ovn-2:22.06-3.el8.noarch obsoletes ovn-2021 < 22.06 provided
> by ovn-2021-21.12.0-11.el8s.x86_64\\n  - cannot install the best update
> candidate for package ovn-2021-21.12.0-82.el8s.x86_64\\n  - cannot install
> the best update candidate for package
> ovirt-openvswitch-ovn-common-2.15-4.el8.noarch\\n Problem 4: package
> ovirt-openvswitch-ovn-host-2.15-4.el8.noarch requires ovn-2021-host, but
> none of the providers can be installed\\n  - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.12.0-82.el8s.x86_64\\n  - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.03.0-21.el8s.x86_64\\n  - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.03.0-40.el8s.x86_64\\n  - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.06.0-17.el8s.x86_64\\n  - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.06.0-29.el8s.x86_64\\n  - package
> rdo-ovn-host-2:22.06-3.el8.noarch obsoletes ovn-2021-host < 22.06 provided
> by ovn-2021-host-21.12.0-11.el8s.x86_64\\n  - cannot install the best
> update candidate for package ovn-2021-host-21.12.0-82.el8s.x86_64\\n  -
> cannot install the best update candidate for package
> ovirt-openvswitch-ovn-host-2.15-4.el8.noarch\", \"rc\": 1, \"results\":
> []}",
>   "start_line" : 171,
>   "end_line" : 172,
>   "runner_ident" : "e370b346-b9e8-4a0c-8595-9a717b14c901",
>   "event" : "runner_on_failed",
>   "pid" : 3151,
>   "created" : "2022-10-17T18:57:22.572472",
>   "parent_uuid" : "64006a61-d37f-a54d-f807-000000000042",
>   "event_data" : {
>     "playbook" : "ovirt-host-upgrade.yml",
>     "playbook_uuid" : "b4383145-d80b-4cf6-83a2-550febf56e5c",
>     "play" : "all",
>     "play_uuid" : "64006a61-d37f-a54d-f807-000000000006",
>     "play_pattern" : "all",
>     "task" : "Upgrade packages",
>     "task_uuid" : "64006a61-d37f-a54d-f807-000000000042",
>     "task_action" : "yum",
>     "task_args" : "",
>     "task_path" :
> "/usr/share/ovirt-engine/ansible-runner-service-project/project/roles/ovirt-host-upgrade/tasks/main.yml:49",
>     "role" : "ovirt-host-upgrade",
>     "host" : "192.168.2.18",
>     "remote_addr" : "192.168.2.18",
>     "res" : {
>       "failures" : [ ],
>       "results" : [ ],
>       "rc" : 1,
>       "msg" : "Depsolve Error occurred: [same four depsolve problems as
> quoted in \"stdout\" above; duplicate trimmed]",
>       "invocation" : {
>         "module_args" : {
>           "name" : [ "*" ],
>           "state" : "latest",
>           "allow_downgrade" : false,
>           "autoremove" : false,
>           "bugfix" : false,
>           "cacheonly" : false,
>           "disable_gpg_check" : false,
>           "disable_plugin" : [ ],
>           "disablerepo" : [ ],
>           "download_only" : false,
>           "enable_plugin" : [ ],
>           "enablerepo" : [ ],
>           "exclude" : [ ],
>           "installroot" : "/",
>           "install_repoquery" : true,
>           "install_weak_deps" : true,
>           "security" : false,
>           "skip_broken" : false,
>           "update_cache" : false,
>           "update_only" : false,
>           "validate_certs" : true,
>           "lock_timeout" : 30,
>           "allowerasing" : false,
>           "nobest" : false,
>           "conf_file" : null,
>           "disable_excludes" : null,
>           "download_dir" : null,
>           "list" : null,
>           "releasever" : null
>         }
>       },
>       "_ansible_no_log" : false,
>       "changed" : false
>     },
>     "start" : "2022-10-17T18:57:19.992530",
>     "end" : "2022-10-17T18:57:22.572294",
>     "duration" : 2.579764,
>     "ignore_errors" : null,
>     "event_loop" : null,
>     "uuid" : "71edb069-f43c-452b-8528-a0c3d6625fd1"
>   }
> }
>
> *David Johnson*
> _______________________________________________
> Users mailing list -- users@ovirt.org
> To unsubscribe send an email to users-le...@ovirt.org
> Privacy Statement: https://www.ovirt.org/privacy-policy.html
> oVirt Code of Conduct: 
> https://www.ovirt.org/community/about/community-guidelines/
> List Archives:
> https://lists.ovirt.org/archives/list/users@ovirt.org/message/EHNPT3GUAKOQVCDZJQYGH655TVJED7SM/
_______________________________________________
Users mailing list -- users@ovirt.org
To unsubscribe send an email to users-le...@ovirt.org
Privacy Statement: https://www.ovirt.org/privacy-policy.html
oVirt Code of Conduct: 
https://www.ovirt.org/community/about/community-guidelines/
List Archives: 
https://lists.ovirt.org/archives/list/users@ovirt.org/message/RIHO32QA3NT6YFCL3H63AEEPW7ELTKMU/
