[CQ]: 107820, 1 (ovirt-engine-api-model) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
107820,1 (ovirt-engine-api-model), failed. However, this change does not
appear to be the root cause of the failure. Change 107547,2
(ovirt-engine-api-model), which this change depends on or is based on, was
detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 107547,2
(ovirt-engine-api-model) is fixed and this change is updated to refer to, or
rebased on, the fixed version, or this change is modified to no longer depend
on it.
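
In practice, "updated to refer to or rebased on the fixed version" usually
means fetching the fixed patchset of the root-cause change from Gerrit and
rebasing the dependent change on top of it before re-pushing it. A rough
sketch only; the patchset number 3 and the local branch name below are
hypothetical:

    # Fetch a hypothetical fixed patchset of root-cause change 107547
    # (Gerrit publishes patchsets under refs/changes/<NN>/<change>/<patchset>).
    git fetch https://gerrit.ovirt.org/ovirt-engine-api-model refs/changes/47/107547/3

    # Rebase the local branch carrying the dependent change onto it,
    # then re-push it for review so the change queue can retest it.
    git checkout my-107820-branch
    git rebase FETCH_HEAD
    git push https://gerrit.ovirt.org/ovirt-engine-api-model HEAD:refs/for/master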

For further details about the change see:
https://gerrit.ovirt.org/#/c/107820/1

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/107547/2

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21911/


Build failed in Jenkins: system-sync_mirrors-centos-sclo-rh-release-7.6-el7-x86_64 #480

2020-03-30 Thread jenkins
See 


Changes:


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace 

No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
 > git --version # timeout=10
 > git fetch --tags --progress --prune http://gerrit.ovirt.org/jenkins.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e05031190c69f2bb02c8d7fbe206529444ec1f12 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e05031190c69f2bb02c8d7fbe206529444ec1f12
Commit message: "mirrors: drop gluster 5 mirroring"
 > git rev-list --no-walk e05031190c69f2bb02c8d7fbe206529444ec1f12 # timeout=10
[system-sync_mirrors-centos-sclo-rh-release-7.6-el7-x86_64] $ /bin/bash -xe 
/tmp/jenkins4038020657333048096.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror 
centos-sclo-rh-release-7.6-el7 x86_64 jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror centos-sclo-rh-release-7.6-el7 x86_64 
jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror centos-sclo-rh-release-7.6-el7 x86_64 
jenkins/data/mirrors-reposync.conf
+ local repo_name=centos-sclo-rh-release-7.6-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs centos-sclo-rh-release-7.6-el7 yum
+ local repo_name=centos-sclo-rh-release-7.6-el7
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum 
/var/www/html/repos/yum/centos-sclo-rh-release-7.6-el7 
/var/www/html/repos/yum/centos-sclo-rh-release-7.6-el7/base
+ check_yum_sync_needed centos-sclo-rh-release-7.6-el7 x86_64 
jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=centos-sclo-rh-release-7.6-el7
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/centos-sclo-rh-release-7.6-el7
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync centos-sclo-rh-release-7.6-el7 x86_64 
jenkins/data/mirrors-reposync.conf --urls --quiet
++ local repo_name=centos-sclo-rh-release-7.6-el7
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf 
--repoid=centos-sclo-rh-release-7.6-el7 --arch=x86_64 
--cachedir=/home/jenkins/mirrors_cache 
--download_path=/var/www/html/repos/yum/centos-sclo-rh-release-7.6-el7/base 
--norepopath --newest-only --urls --quiet
Error setting up repositories: Error making cache directory: 
/home/jenkins/mirrors_cache/centos-sclo-rh-release-7.6-el7 error was: [Errno 
17] File exists: '/home/jenkins/mirrors_cache/centos-sclo-rh-release-7.6-el7'
+ reposync_out=
Build step 'Execute shell' marked build as failure
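
The error above occurs right after the script removes the per-repo cache
directory and reposync then fails to recreate it ([Errno 17] File exists),
which looks like a race between concurrent mirror-sync jobs sharing
/home/jenkins/mirrors_cache. A minimal sketch of one possible mitigation,
serializing cache handling with flock; the lock file name is an assumption
and this is only an illustration, not the actual mirror_mgr.sh fix:

    #!/bin/bash
    # Serialize cache handling per repo so a concurrent job cannot recreate
    # the directory between our `rm -rf` and reposync's own cache setup.
    # Paths and reposync options follow the job trace above.
    MIRRORS_CACHE=/home/jenkins/mirrors_cache
    repo_name=centos-sclo-rh-release-7.6-el7
    lock="${MIRRORS_CACHE}/.${repo_name}.lock"

    (
        # Wait up to 120 seconds for exclusive access to this repo's cache.
        flock --wait 120 9 || { echo "timed out waiting for cache lock" >&2; exit 1; }
        rm -rf "${MIRRORS_CACHE:?}/${repo_name}"
        mkdir -p "${MIRRORS_CACHE}/${repo_name}"
        reposync --config=jenkins/data/mirrors-reposync.conf \
            --repoid="${repo_name}" --arch=x86_64 \
            --cachedir="${MIRRORS_CACHE}" --urls --quiet
    ) 9>"${lock}"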


[CQ]: 107819, 1 (ovirt-engine-api-model) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
107819,1 (ovirt-engine-api-model), failed. However, this change does not
appear to be the root cause of the failure. Change 107547,2
(ovirt-engine-api-model), which this change depends on or is based on, was
detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 107547,2
(ovirt-engine-api-model) is fixed and this change is updated to refer to, or
rebased on, the fixed version, or this change is modified to no longer depend
on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107819/1

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/107547/2

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21903/


[CQ]: 107814, 1 (ovirt-engine-metrics) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
107814,1 (ovirt-engine-metrics), failed. However, this change does not appear
to be the root cause of the failure. Change 107813,1 (ovirt-engine-metrics),
which this change depends on or is based on, was detected as the cause of the
testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 107813,1 (ovirt-engine-metrics)
is fixed and this change is updated to refer to, or rebased on, the fixed
version, or this change is modified to no longer depend on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107814/1

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/107813/1

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21895/


[CQ]: 107813, 1 (ovirt-engine-metrics) failed "ovirt-master" system tests

2020-03-30 Thread oVirt Jenkins
Change 107813,1 (ovirt-engine-metrics) is probably the reason behind recent
system test failures in the "ovirt-master" change queue and needs to be fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107813/1

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21887/


[oVirt Jenkins] ovirt-system-tests_compat-4.3-suite-master - Build # 394 - Failure!

2020-03-30 Thread jenkins
Project: 
https://jenkins.ovirt.org/job/ovirt-system-tests_compat-4.3-suite-master/ 
Build: 
https://jenkins.ovirt.org/job/ovirt-system-tests_compat-4.3-suite-master/394/
Build Number: 394
Build Status:  Failure
Triggered By: Started by timer

-
Changes Since Last Success:
-
Changes for Build #394
[Sandro Bonazzola] repos: fix Advanced Virtualization repos

[Sandro Bonazzola] mirrors: drop gluster 5 mirroring




-
Failed Tests:
-
No tests ran.


[oVirt Jenkins] cleanup-gated-dummy-master - Build #101 - FAILURE!

2020-03-30 Thread jenkins
Build: https://jenkins.ovirt.org/job/cleanup-gated-dummy-master/101/
Build Name: #101
Build Status: FAILURE
Gerrit change: null
- title: null
- project: null
- branch: null
- author: null


[CQ]: e80bf08 (ovirt-ansible-hosted-engine-setup) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
e80bf08 (ovirt-ansible-hosted-engine-setup), failed. However, this change does
not appear to be the root cause of the failure. Change e4c4843
(ovirt-ansible-hosted-engine-setup), which this change depends on or is based
on, was detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change e4c4843
(ovirt-ansible-hosted-engine-setup) is fixed and this change is updated to
refer to, or rebased on, the fixed version, or this change is modified to no
longer depend on it.

For further details about the change see:
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/commit/e80bf08ece24ec7a3b051434825e2f0e8e2f1fd8

For further details about the change that seems to be the root cause behind the
testing failures see:
https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/commit/e4c48439011fd453c94588443f8cca74ca859302

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21879/


oVirt infra daily report - unstable production jobs - 1146

2020-03-30 Thread jenkins
Good morning!

Attached is the HTML page with the jenkins status report. You can see it also 
here:
 - 
https://jenkins.ovirt.org/job/system_jenkins-report/1146//artifact/exported-artifacts/upstream_report.html

Cheers,
Jenkins
 
 
 
RHEVM CI Jenkins Daily Report - 30/03/2020

Unstable critical jobs:

 - ovirt-appliance_master_build-artifacts-el8-x86_64
 - ovirt-guest-agent_4.3_build-artifacts-el7-x86_64
 - ovirt-guest-agent_4.3_build-artifacts-fc30-x86_64
 - ovirt-guest-agent_master_build-artifacts-el7-x86_64
 - ovirt-guest-agent_master_build-artifacts-fc30-x86_64
 - ovirt-master_change-queue-tester
 - ovirt-system-tests_ansible-suite-master
 - ovirt-system-tests_gate
 - ovirt-system-tests_hc-basic-suite-4.3
 - ovirt-system-tests_hc-basic-suite-master
 - ovirt-system-tests_he-basic-ipv6-suite-master
 - ovirt-system-tests_he-basic-iscsi-suite-4.3
 - ovirt-system-tests_he-basic-iscsi-suite-master

(Each job's entry in the report carries the same note: the job is
automatically updated by jenkins job builder, any manual change will be lost
in the next update; if you want to make permanent changes, check out the
jenkins repo.)

[CQ]: a38cbb8 (ovirt-web-ui) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
a38cbb8 (ovirt-web-ui), failed. However, this change does not appear to be the
root cause of the failure. Change bcb6072 (ovirt-web-ui), which this change
depends on or is based on, was detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change bcb6072 (ovirt-web-ui) is
fixed and this change is updated to refer to, or rebased on, the fixed
version, or this change is modified to no longer depend on it.

For further details about the change see:
https://github.com/oVirt/ovirt-web-ui/commit/a38cbb849ced70f62787779efdf3fffd96b3ebdb

For further details about the change that seems to be the root cause behind the
testing failures see:
https://github.com/oVirt/ovirt-web-ui/commit/bcb6072313ad3969e292a32fe376c4f73698cec1

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21871/


[CQ]: 107318, 3 (ovirt-engine-api-model) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
107318,3 (ovirt-engine-api-model), failed. However, this change does not
appear to be the root cause of the failure. Change 107547,2
(ovirt-engine-api-model), which this change depends on or is based on, was
detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 107547,2
(ovirt-engine-api-model) is fixed and this change is updated to refer to, or
rebased on, the fixed version, or this change is modified to no longer depend
on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107318/3

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/107547/2

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21863/


[CQ]: 107547, 2 (ovirt-engine-api-model) failed "ovirt-master" system tests

2020-03-30 Thread oVirt Jenkins
Change 107547,2 (ovirt-engine-api-model) is probably the reason behind recent
system test failures in the "ovirt-master" change queue and needs to be fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107547/2

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21855/


Re: OST fails in 002_bootstrap_pytest.py - setup_storage.sh

2020-03-30 Thread Martin Perina
On Mon, Mar 30, 2020 at 5:38 PM Galit Rosenthal  wrote:

> It looks like the local repo stops running.
> When I run curl before the failure just to check the status, I can see it
> isn't accessible.
>
> I'm trying to see where it fails or what cause it to fail.
>
> I manage to reproduce on BM
>

I thought that moving setup_storage would mitigate the issue:
https://gerrit.ovirt.org/#/c/107989/
But it just postponed the error to a later phase; now adding a host fails with
the same issue: Failed to download metadata for repo 'alocalsync'

https://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt-system-tests_manual/6710

So Galit, please take a look; oVirt CQ has been suffering from this issue for
more than a week now
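
The check Galit describes above (running curl against the local repo just
before the failing step) can be sketched roughly as follows; the repo URL is a
placeholder, since the actual baseurl behind 'alocalsync' is not shown in
these logs:

    #!/bin/bash
    # Probe the local repo behind 'alocalsync' before dnf needs it, so a dead
    # repo server surfaces as an explicit error rather than a later
    # "Failed to download metadata" message. REPO_URL is a placeholder.
    REPO_URL="${REPO_URL:-http://example.lago.local:8585/default/el8/}"

    if curl --fail --silent --show-error --max-time 10 \
            "${REPO_URL}repodata/repomd.xml" -o /dev/null; then
        echo "local repo is reachable: ${REPO_URL}"
    else
        echo "local repo is NOT reachable: ${REPO_URL}" >&2
        exit 1
    fi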

>
> On Mon, Mar 30, 2020 at 6:23 PM Marcin Sobczyk 
> wrote:
>
>> Hi Galit
>>
>> I can see the issue again - now in manual OST runs:
>>
>>
>> https://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt-system-tests_manual/6711/consoleFull#L2,856
>>
>> Regards, Marcin
>>
>> On 3/23/20 10:09 PM, Marcin Sobczyk wrote:
>>
>>
>>
>> On 3/23/20 8:51 PM, Galit Rosenthal wrote:
>>
>> I run it now locally using the extra sources as it runs in the CQ and it
>> didn't fail for me.
>>
>> I will continue to investigate tomorrow,
>>
>> Marcin, did you see this issue also in check_patch or only in CQ?
>>
>> I wasn't aware of the issue till Nir raised it - I was working with the
>> patch previously
>> and both check-patch and manual runs were fine. I think it concerns only
>> CQ then.
>>
>> Regards,
>> Galit
>>
>> On Mon, Mar 23, 2020 at 4:29 PM Galit Rosenthal 
>> wrote:
>>
>>> I will look at it.
>>>
>>> On Mon, Mar 23, 2020 at 4:18 PM Martin Perina 
>>> wrote:
>>>


 On Mon, Mar 23, 2020 at 3:16 PM Marcin Sobczyk 
 wrote:

>
>
> On 3/23/20 3:10 PM, Marcin Sobczyk wrote:
> >
> >
> > On 3/23/20 2:53 PM, Nir Soffer wrote:
> >> On Mon, Mar 23, 2020 at 3:26 PM Marcin Sobczyk 
>
> >> wrote:
> >>>
> >>>
> >>> On 3/23/20 2:17 PM, Nir Soffer wrote:
>  On Mon, Mar 23, 2020 at 1:25 PM Marcin Sobczyk
>   wrote:
> >
> > On 3/21/20 1:18 AM, Nir Soffer wrote:
> >
> > On Fri, Mar 20, 2020 at 9:35 PM Nir Soffer 
> > wrote:
> >> Looks like infrastructure issue setting up storage on engine
> host.
> >>
> >> Here are 2 failing builds with unrelated changes:
> >> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6677/
> >> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6678/
> > Rebuilding still fails in setup_storage:
> >
> >
> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6679/testReport/
> >
> >
> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6680/testReport/
> >
> >
> >> Is this a known issue?
> >>
> >> Error Message
> >>
> >> AssertionError: setup_storage.sh failed. Exit code is 1 assert
> 1
> >> == 0   -1   +0
> >>
> >> Stacktrace
> >>
> >> prefix = 
> >>
> >>   @pytest.mark.run(order=14)
> >>   def test_configure_storage(prefix):
> >>   engine = prefix.virt_env.engine_vm()
> >>   result = engine.ssh(
> >>   [
> >>   '/tmp/setup_storage.sh',
> >>   ],
> >>   )
> >>> assert result.code == 0, 'setup_storage.sh failed.
> Exit
> >>> code is %s' % result.code
> >> E   AssertionError: setup_storage.sh failed. Exit code is 1
> >> E   assert 1 == 0
> >> E -1
> >> E +0
> >>
> >>
> >> The pytest traceback is nice, but in this case it is does not
> >> show any useful info.
> >>
> >> Since we run a script using ssh, the error message should
> include
> >> the process stdout and stderr
> >> which probably can explain the failure.
> > I posted https://gerrit.ovirt.org/#/c/107830/ to improve
> logging
> > during storage setup.
> > Unfortunately AFAICS it didn't fail, so I guess we'll have to
> > merge it and wait for a failed job to get some helpful logs.
>  Thanks.
> 
>  It still fails for me with current code:
> 
> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6689/testReport/
> 
> 
>  Same when using current vdsm master.
> >>> Updated the patch according to your suggestions and currently
> trying
> >>> out
> >>> OST for the 4th time -
> >>> all previous runs succeeded. I guess I'm out of luck :)
> >> It succeeds on your local OST setup but fail on Jenkins?
> > No, I mean jenkins - both check-patch runs didn't fail on this
> script.
> > I also tried running OST 

Re: OST fails in 002_bootstrap_pytest.py - setup_storage.sh

2020-03-30 Thread Galit Rosenthal
It looks like the local repo stops running.
When I run curl before the failure just to check the status, I can see it
isn't accessible.

I'm trying to see where it fails or what causes it to fail.

I managed to reproduce it on BM

On Mon, Mar 30, 2020 at 6:23 PM Marcin Sobczyk  wrote:

> Hi Galit
>
> I can see the issue again - now in manual OST runs:
>
>
> https://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt-system-tests_manual/6711/consoleFull#L2,856
>
> Regards, Marcin
>
> On 3/23/20 10:09 PM, Marcin Sobczyk wrote:
>
>
>
> On 3/23/20 8:51 PM, Galit Rosenthal wrote:
>
> I run it now locally using the extra sources as it runs in the CQ and it
> didn't fail for me.
>
> I will continue to investigate tomorrow,
>
> Marcin, did you see this issue also in check_patch or only in CQ?
>
> I wasn't aware of the issue till Nir raised it - I was working with the
> patch previously
> and both check-patch and manual runs were fine. I think it concerns only
> CQ then.
>
> Regards,
> Galit
>
> On Mon, Mar 23, 2020 at 4:29 PM Galit Rosenthal 
> wrote:
>
>> I will look at it.
>>
>> On Mon, Mar 23, 2020 at 4:18 PM Martin Perina  wrote:
>>
>>>
>>>
>>> On Mon, Mar 23, 2020 at 3:16 PM Marcin Sobczyk 
>>> wrote:
>>>


 On 3/23/20 3:10 PM, Marcin Sobczyk wrote:
 >
 >
 > On 3/23/20 2:53 PM, Nir Soffer wrote:
 >> On Mon, Mar 23, 2020 at 3:26 PM Marcin Sobczyk 

 >> wrote:
 >>>
 >>>
 >>> On 3/23/20 2:17 PM, Nir Soffer wrote:
  On Mon, Mar 23, 2020 at 1:25 PM Marcin Sobczyk
   wrote:
 >
 > On 3/21/20 1:18 AM, Nir Soffer wrote:
 >
 > On Fri, Mar 20, 2020 at 9:35 PM Nir Soffer 
 > wrote:
 >> Looks like infrastructure issue setting up storage on engine
 host.
 >>
 >> Here are 2 failing builds with unrelated changes:
 >> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6677/
 >> https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6678/
 > Rebuilding still fails in setup_storage:
 >
 >
 https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6679/testReport/
 >
 >
 https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6680/testReport/
 >
 >
 >> Is this a known issue?
 >>
 >> Error Message
 >>
 >> AssertionError: setup_storage.sh failed. Exit code is 1 assert 1
 >> == 0   -1   +0
 >>
 >> Stacktrace
 >>
 >> prefix = 
 >>
 >>   @pytest.mark.run(order=14)
 >>   def test_configure_storage(prefix):
 >>   engine = prefix.virt_env.engine_vm()
 >>   result = engine.ssh(
 >>   [
 >>   '/tmp/setup_storage.sh',
 >>   ],
 >>   )
 >>> assert result.code == 0, 'setup_storage.sh failed. Exit
 >>> code is %s' % result.code
 >> E   AssertionError: setup_storage.sh failed. Exit code is 1
 >> E   assert 1 == 0
 >> E -1
 >> E +0
 >>
 >>
 >> The pytest traceback is nice, but in this case it is does not
 >> show any useful info.
 >>
 >> Since we run a script using ssh, the error message should
 include
 >> the process stdout and stderr
 >> which probably can explain the failure.
 > I posted https://gerrit.ovirt.org/#/c/107830/ to improve logging
 > during storage setup.
 > Unfortunately AFAICS it didn't fail, so I guess we'll have to
 > merge it and wait for a failed job to get some helpful logs.
  Thanks.
 
  It still fails for me with current code:
 
 https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6689/testReport/
 
 
  Same when using current vdsm master.
 >>> Updated the patch according to your suggestions and currently
 trying
 >>> out
 >>> OST for the 4th time -
 >>> all previous runs succeeded. I guess I'm out of luck :)
 >> It succeeds on your local OST setup but fail on Jenkins?
 > No, I mean jenkins - both check-patch runs didn't fail on this script.
 > I also tried running OST manually twice and same thing happened.
 > Anyway - the patch has been merged now so if any failure occurs in CQ
 > we should know what's going on.
 Ok, finally caught a failure in CQ [1]:

 [2020-03-23T14:14:09.836Z] if result.code != 0:
 [2020-03-23T14:14:09.836Z] msg = (
 [2020-03-23T14:14:09.836Z] 'setup_storage.sh failed with exit code: {}.\n'
 [2020-03-23T14:14:09.836Z] 'stdout:\n{}'
 [2020-03-23T14:14:09.836Z] 'stderr:\n{}'
 [2020-03-23T14:14:09.836Z] ).format(result.code, result.out, result.err)
 

Re: OST fails in 002_bootstrap_pytest.py - setup_storage.sh

2020-03-30 Thread Marcin Sobczyk

Hi Galit

I can see the issue again - now in manual OST runs:

https://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt-system-tests_manual/6711/consoleFull#L2,856

Regards, Marcin

On 3/23/20 10:09 PM, Marcin Sobczyk wrote:



On 3/23/20 8:51 PM, Galit Rosenthal wrote:
I run it now locally using the extra sources as it runs in the CQ and 
it didn't fail for me.


I will continue to investigate tomorrow,

Marcin, did you see this issue also in check_patch or only in CQ?
I wasn't aware of the issue till Nir raised it - I was working with 
the patch previously
and both check-patch and manual runs were fine. I think it concerns 
only CQ then.



Regards,
Galit

On Mon, Mar 23, 2020 at 4:29 PM Galit Rosenthal wrote:


I will look at it.

On Mon, Mar 23, 2020 at 4:18 PM Martin Perina <mper...@redhat.com> wrote:



On Mon, Mar 23, 2020 at 3:16 PM Marcin Sobczyk <msobc...@redhat.com> wrote:



On 3/23/20 3:10 PM, Marcin Sobczyk wrote:
>
>
> On 3/23/20 2:53 PM, Nir Soffer wrote:
>> On Mon, Mar 23, 2020 at 3:26 PM Marcin Sobczyk <msobc...@redhat.com> wrote:
>>>
>>>
>>> On 3/23/20 2:17 PM, Nir Soffer wrote:
 On Mon, Mar 23, 2020 at 1:25 PM Marcin Sobczyk <msobc...@redhat.com> wrote:
>
> On 3/21/20 1:18 AM, Nir Soffer wrote:
>
> On Fri, Mar 20, 2020 at 9:35 PM Nir Soffer <nsof...@redhat.com> wrote:
>> Looks like infrastructure issue setting up storage
on engine host.
>>
>> Here are 2 failing builds with unrelated changes:
>>
https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6677/
>>
https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6678/
> Rebuilding still fails in setup_storage:
>
>

https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6679/testReport/

>
>

https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6680/testReport/

>
>
>> Is this a known issue?
>>
>> Error Message
>>
>> AssertionError: setup_storage.sh failed. Exit code
is 1 assert 1
>> == 0   -1   +0
>>
>> Stacktrace
>>
>> prefix = 
>>
>> @pytest.mark.run(order=14)
>>   def test_configure_storage(prefix):
>>   engine = prefix.virt_env.engine_vm()
>>   result = engine.ssh(
>>   [
>> '/tmp/setup_storage.sh',
>>   ],
>>   )
>>>     assert result.code == 0, 'setup_storage.sh failed. Exit code is %s' % result.code
>> E   AssertionError: setup_storage.sh failed. Exit code is 1
>> E   assert 1 == 0
>> E -1
>> E +0
>>
>>
>> The pytest traceback is nice, but in this case it
is does not
>> show any useful info.
>>
>> Since we run a script using ssh, the error message
should include
>> the process stdout and stderr
>> which probably can explain the failure.
> I posted https://gerrit.ovirt.org/#/c/107830/ to
improve logging
> during storage setup.
> Unfortunately AFAICS it didn't fail, so I guess
we'll have to
> merge it and wait for a failed job to get some
helpful logs.
 Thanks.

 It still fails for me with current code:


https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6689/testReport/



 Same when using current vdsm master.
>>> Updated the patch according to your suggestions and
currently trying
>>> out
>>> OST for the 4th time -
>>> all previous runs succeeded. I guess I'm out of luck :)
>> It succeeds on your local OST setup but fail on Jenkins?
> No, I mean jenkins - both check-patch runs didn't fail
on this script.
> I also tried running OST manually twice and same thing
happened.
> Anyway - the patch has been merged now 

[CQ]: 107801, 1 (ovirt-engine-nodejs-modules) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
107801,1 (ovirt-engine-nodejs-modules), failed. However, this change does not
appear to be the root cause of the failure. Change 107796,3
(ovirt-engine-nodejs-modules), which this change depends on or is based on,
was detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 107796,3
(ovirt-engine-nodejs-modules) is fixed and this change is updated to refer to,
or rebased on, the fixed version, or this change is modified to no longer
depend on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107801/1

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/107796/3

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21847/


[CQ]: 107796, 3 (ovirt-engine-nodejs-modules) failed "ovirt-master" system tests

2020-03-30 Thread oVirt Jenkins
Change 107796,3 (ovirt-engine-nodejs-modules) is probably the reason behind
recent system test failures in the "ovirt-master" change queue and needs to be
fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107796/3

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21839/


[JIRA] (OVIRT-2889) OST fails for unreachable repo 'alocalsync'

2020-03-30 Thread Amit Bawer (oVirt JIRA)
Amit Bawer created OVIRT-2889:
-

 Summary: OST fails for unreachable repo 'alocalsync'
 Key: OVIRT-2889
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2889
 Project: oVirt - virtualization made easy
  Issue Type: By-EMAIL
Reporter: Amit Bawer
Assignee: infra


Hi

OST fails for unreachable repo 'alocalsync',

Seen at
https://jenkins.ovirt.org/job/ovirt-system-tests_manual/6706/testReport/basic-suite-master.test-scenarios/002_bootstrap_pytest/test_configure_storage/


Thanks



--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100122)


[CQ]: 107824,1 (imgbased) failed "ovirt-4.3" system tests

2020-03-30 Thread oVirt Jenkins
Change 107824,1 (imgbased) is probably the reason behind recent system test
failures in the "ovirt-4.3" change queue and needs to be fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107824/1

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-4.3_change-queue-tester/3190/


[CQ]: 107773, 3 (ovirt-system-tests) failed "ovirt-master" system tests, but isn't the failure root cause

2020-03-30 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
107773,3 (ovirt-system-tests), failed. However, this change does not appear to
be the root cause of the failure. Change 107439,4 (ovirt-system-tests), which
this change depends on or is based on, was detected as the cause of the
testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 107439,4 (ovirt-system-tests)
is fixed and this change is updated to refer to, or rebased on, the fixed
version, or this change is modified to no longer depend on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/107773/3

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/107439/4

For failed test results see:
https://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/21831/


basic suite: Error: Failed to download metadata for repo 'alocalsync'

2020-03-30 Thread Yedidyah Bar David
Hi all,

I get $subject in [1][2]. Any idea?

Thanks,

[1] 
https://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt-system-tests_manual/6707/
[2] 
https://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt-system-tests_manual/6707/artifact/exported-artifacts/mock_logs/script/stdout_stderr.log
-- 
Didi