[JIRA] (OVIRT-2829) gmail can't verify emails sent from ovirt.org

2019-11-12 Thread Anton Marchukov (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-2829?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anton Marchukov reassigned OVIRT-2829:
--

Assignee: Evgheni Dereveanchin  (was: infra)

> gmail can't verify emails sent from ovirt.org
> -
>
> Key: OVIRT-2829
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2829
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: Emil Natan
>Assignee: Evgheni Dereveanchin
>
> This seems to have started in the last few days: Gmail marks email sent
> from gerrit.ovirt.org with a "?", saying it could not verify that the
> emails were sent from the ovirt.org domain. It could be related to a bad
> SPF record.
> From the email header: Received-SPF: softfail (google.com: domain of
> transitioning ger...@ovirt.org does not designate 66.187.233.88 as
> permitted sender) client-ip=66.187.233.88;
> -- 
> Emil Natan
> RHV/CNV DevOps



--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100114)
___
Infra mailing list -- infra@ovirt.org
To unsubscribe send an email to infra-le...@ovirt.org
Privacy Statement: https://www.ovirt.org/site/privacy-policy/
oVirt Code of Conduct: 
https://www.ovirt.org/community/about/community-guidelines/
List Archives: 
https://lists.ovirt.org/archives/list/infra@ovirt.org/message/XZS6MMMBOZXY7MGJ324TFJ42ZHGBU7F6/


Build failed in Jenkins: system-sync_mirrors-epel-el7-s390x-x86_64 #22

2019-11-12 Thread jenkins
See 


Changes:


--
[...truncated 45.57 KB...]
(1/160): python-qpid-proto 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python-rosdep-doc-0.16.1-2.el7 FAILED  
(1/160): python-rosdep-doc 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python-rosdistro-doc-0.7.5-1.e FAILED  
(1/160): python-rosdistro- 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-acme-0.39.0-1.el7.noar FAILED  
(1/160): python2-acme-0.39 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-apprise-0.8.1-1.el7.no FAILED  
(1/160): python2-apprise-0 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-argh-0.26.1-6.el7.noar FAILED  
(1/160): python2-argh-0.26 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-0.39.0-1.el7.n FAILED  
(1/160): python2-certbot-0 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-apache-0.39.0- FAILED  
(1/160): python2-certbot-a 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-cloudflare FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-cloudxns-0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-digitaloce FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-dnsimple-0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-dnsmadeeas FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-gehirn-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-google-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-linode-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-luadns-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-nsone-0.39 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-ovh-0.39.0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-rfc2136-0. FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-route53-0. FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-sakuraclou FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-nginx-0.39.0-1 FAILED  
(1/160): python2-certbot-n 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-copr-1.98-1.el7.noarch FAILED  
(1/160): python2-copr-1.98 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dictdiffer-0.7.1-8.el7 FAILED  
(1/160): python2-dictdiffe 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon-3.3.4-2.el FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+easyname-3 FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+gratisdns- FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+henet-3.3. FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+plesk-3.3. FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+route53-3. FAILED

[JIRA] (OVIRT-2829) gmail can't verify emails sent from ovirt.org

2019-11-12 Thread Emil Natan (oVirt JIRA)
Emil Natan created OVIRT-2829:
-

 Summary: gmail can't verify emails sent from ovirt.org
 Key: OVIRT-2829
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2829
 Project: oVirt - virtualization made easy
  Issue Type: By-EMAIL
Reporter: Emil Natan
Assignee: infra


This seems to have started in the last few days: Gmail marks email sent
from gerrit.ovirt.org with a "?", saying it could not verify that the
emails were sent from the ovirt.org domain. It could be related to a bad SPF
record.
From the email header: Received-SPF: softfail (google.com: domain of
transitioning ger...@ovirt.org does not designate 66.187.233.88 as
permitted sender) client-ip=66.187.233.88;
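A softfail means the sending IP did not match any mechanism in the domain's SPF TXT record, so the receiver is told to accept but flag the mail. As a rough illustration of the check Gmail performs, the sketch below matches an IP against a record's ip4:/ip6: mechanisms. The record shown is hypothetical (the real one must be fetched from DNS, e.g. with `dig txt ovirt.org`), and the helper ignores include:, a, and mx mechanisms, which a real verifier would also resolve.

```python
import ipaddress

def spf_permits(spf_record: str, client_ip: str) -> bool:
    """Simplified SPF check: does client_ip match an ip4:/ip6:
    mechanism in the record? (include:/a/mx are ignored here)."""
    ip = ipaddress.ip_address(client_ip)
    for term in spf_record.split():
        if term.startswith(("ip4:", "ip6:")):
            net = ipaddress.ip_network(term.split(":", 1)[1], strict=False)
            if ip in net:  # version mismatch simply compares False
                return True
    return False

# Hypothetical record for illustration only:
record = "v=spf1 ip4:8.43.85.0/24 ~all"
print(spf_permits(record, "66.187.233.88"))  # False -> Gmail reports softfail
```

If the gerrit host's outgoing IP (66.187.233.88 above) is missing from the published record, adding it there (or routing mail through a designated sender) would turn the softfail into a pass.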

-- 
Emil Natan
RHV/CNV DevOps





[CQ]: 104579,2 (ovirt-release) failed "ovirt-master" system tests

2019-11-12 Thread oVirt Jenkins
Change 104579,2 (ovirt-release) is probably the reason behind recent system
test failures in the "ovirt-master" change queue and needs to be fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/104579/2

For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/16873/


[CQ]: 104196, 6 (ovirt-engine-api-model) failed "ovirt-master" system tests

2019-11-12 Thread oVirt Jenkins
Change 104196,6 (ovirt-engine-api-model) is probably the reason behind recent
system test failures in the "ovirt-master" change queue and needs to be fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/104196/6

For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/16868/


Jenkins build is back to normal : system-sync_mirrors-fedora-updates-fc29-x86_64 #1106

2019-11-12 Thread jenkins
See 



Build failed in Jenkins: system-sync_mirrors-epel-el7-s390x-x86_64 #21

2019-11-12 Thread jenkins
See 


Changes:

[Barak Korren] pipeline-loader: Improve change detection


--
[...truncated 45.41 KB...]
(1/160): python-qpid-proto 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python-rosdep-doc-0.16.1-2.el7 FAILED  
(1/160): python-rosdep-doc 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python-rosdistro-doc-0.7.5-1.e FAILED  
(1/160): python-rosdistro- 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-acme-0.39.0-1.el7.noar FAILED  
(1/160): python2-acme-0.39 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-apprise-0.8.1-1.el7.no FAILED  
(1/160): python2-apprise-0 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-argh-0.26.1-6.el7.noar FAILED  
(1/160): python2-argh-0.26 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-0.39.0-1.el7.n FAILED  
(1/160): python2-certbot-0 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-apache-0.39.0- FAILED  
(1/160): python2-certbot-a 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-cloudflare FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-cloudxns-0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-digitaloce FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-dnsimple-0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-dnsmadeeas FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-gehirn-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-google-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-linode-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-luadns-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-nsone-0.39 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-ovh-0.39.0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-rfc2136-0. FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-route53-0. FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-sakuraclou FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-nginx-0.39.0-1 FAILED  
(1/160): python2-certbot-n 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-copr-1.98-1.el7.noarch FAILED  
(1/160): python2-copr-1.98 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dictdiffer-0.7.1-8.el7 FAILED  
(1/160): python2-dictdiffe 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon-3.3.4-2.el FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+easyname-3 FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+gratisdns- FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+henet-3.3. FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+plesk-3.3. FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- 

oVirt infra daily report - unstable production jobs - 1007

2019-11-12 Thread jenkins
Good morning!

Attached is the HTML page with the jenkins status report. You can see it also 
here:
 - 
http://jenkins.ovirt.org/job/system_jenkins-report/1007//artifact/exported-artifacts/upstream_report.html

Cheers,
Jenkins
 
 
 
RHEVM CI Jenkins Daily Report - 12/11/2019

Unstable Critical jobs:

- changequeue-status_master_standard-poll-upstream-sources
- changequeue-status_standard-check-patch
- ovirt-master_change-queue-tester
- ovirt-system-tests_ansible-suite-master
- ovirt-system-tests_gate
- ovirt-system-tests_hc-basic-suite-master
- ovirt-system-tests_he-basic-ipv6-suite-master
- ovirt-system-tests_he-basic-iscsi-suite-master
- ovirt-system-tests_he-basic-role-remote-suite-master
- ovirt-system-tests_he-basic-suite-master
- ovirt-system-tests_he-node-ng-suite-4.3
- ovirt-system-tests_he-node-ng-suite-master
- ovirt-system-tests_master_standard-poll-upstream-sources

Each entry carries the same note: "This job is automatically updated by
jenkins job builder, any manual change will be lost in the next update. If
you want to make permanent changes, check out the jenkins repo."

[CQ]: 104572, 2 (imgbased) failed "ovirt-master" system tests, but isn't the failure root cause

2019-11-12 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
104572,2 (imgbased), failed. However, this change does not seem to be the root
cause of the failure. Change 104533,2 (imgbased), which this change depends on
or is based on, was detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 104533,2 (imgbased) is fixed
and this change is updated to refer to, or rebased on, the fixed version, or
this change is modified to no longer depend on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/104572/2

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/104533/2

For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/16863/


[JIRA] (OVIRT-2828) Re: [ovirt-devel] Re: Check patch failure in vdsm

2019-11-12 Thread Nir Soffer (oVirt JIRA)

[ 
https://ovirt-jira.atlassian.net/browse/OVIRT-2828?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=39955#comment-39955
 ] 

Nir Soffer commented on OVIRT-2828:
---

On Tue, Nov 12, 2019 at 7:30 PM Nir Soffer  wrote:
>
> On Tue, Nov 12, 2019 at 5:34 PM Miguel Duarte de Mora Barroso
>  wrote:
> >
> > On Mon, Nov 11, 2019 at 11:12 AM Eyal Shenitzky  wrote:
> > >
> > > Hi,
> > >
> > > I encounter the following error for py36 test in vdsm check patch:
> > >
> > > ...
> > > ...
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/callers.py", 
> > > line 187, in _multicall
> > > 14:03:37 res = hook_impl.function(*args)
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", 
> > > line 86, in 
> > > 14:03:37 firstresult=hook.spec.opts.get("firstresult") if hook.spec else 
> > > False,
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", 
> > > line 92, in _hookexec
> > > 14:03:37 return self._inner_hookexec(hook, methods, kwargs)
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/hooks.py", 
> > > line 286, in __call__
> > > 14:03:37 return self._hookexec(self, self.get_hookimpls(), kwargs)
> > > 14:03:37 File 
> > > "/home/jenkins/workspace/vdsm_standard-check-patch/vdsm/.tox/storage-py37/lib/python3.7/site-packages/_pytest/config/__init__.py",
> > >  line 82, in main
> > > 14:03:37 return config.hook.pytest_cmdline_main(config=config)
> > > 14:03:37 File 
> > > "/home/jenkins/workspace/vdsm_standard-check-patch/vdsm/.tox/storage-py37/bin/pytest",
> > >  line 8, in 
> > > 14:03:37 sys.exit(main())
> > > 14:03:37 [Inferior 1 (process 22145) detached]
> > > 14:03:37 =
> > > 14:03:37 = Terminating watched process =
> > > 14:03:37 =
> > > 14:03:37 PROFILE {"command": ["python", "py-watch", "600", "pytest", 
> > > "-m", "not (slow or stress)", "--durations=10", "--cov=vdsm.storage", 
> > > "--cov-report=html:htmlcov-storage-py37", "--cov-fail-under=62", 
> > > "storage"], "cpu": 39.921942808919184, "elapsed": 604.4699757099152, 
> > > "idrss": 0, "inblock": 1693453, "isrss": 0, "ixrss": 0, "majflt": 2, 
> > > "maxrss": 331172, "minflt": 5606489, "msgrcv": 0, "msgsnd": 0, "name": 
> > > "storage-py37", "nivcsw": 139819, "nsignals": 0, "nswap": 0, "nvcsw": 
> > > 187576, "oublock": 2495645, "start": 1573386812.7961884, "status": 143, 
> > > "stime": 118.260961, "utime": 123.055197}
> > > 14:03:37 ERROR: InvocationError for command 
> > > /home/jenkins/workspace/vdsm_standard-check-patch/vdsm/.tox/storage-py37/bin/python
> > >  profile storage-py37 python py-watch 600 pytest -m 'not (slow or 
> > > stress)' --durations=10 --cov=vdsm.storage 
> > > --cov-report=html:htmlcov-storage-py37 --cov-fail-under=62 storage 
> > > (exited with code 143)
> > >
> > >
> > > Is there any known issue?
> >
> > Anyone able to pitch in ? I think something similar is happening in
> > [0], also on check-patch [1].
> >
> > [0] - 
> > https://jenkins.ovirt.org/blue/organizations/jenkins/vdsm_standard-check-patch/detail/vdsm_standard-check-patch/14234/
>
> Yes it looks the same issue.
>
> > [1] - https://gerrit.ovirt.org/#/c/104274/
>
> Jenkins slaves have been very slow recently. I suspect we run too many jobs
> concurrently or use too many virtual CPUs.

I hope this will avoid the random failures:
https://gerrit.ovirt.org/c/104629/

> Re: [ovirt-devel] Re: Check patch failure in vdsm
> -
>
> Key: OVIRT-2828
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2828
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: Nir Soffer
>Assignee: infra
>
> On Tue, Nov 12, 2019 at 5:34 PM Miguel Duarte de Mora Barroso
>  wrote:
> >
> > On Mon, Nov 11, 2019 at 11:12 AM Eyal Shenitzky  wrote:
> > >
> > > Hi,
> > >
> > > I encounter the following error for py36 test in vdsm check patch:
> > >
> > > ...
> > > ...
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/callers.py", 
> > > line 187, in _multicall
> > > 14:03:37 res = hook_impl.function(*args)
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", 
> > > line 86, in 
> > > 14:03:37 firstresult=hook.spec.opts.get("firstresult") if hook.spec else 
> > > False,
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", 
> > > line 92, in _hookexec
> > > 14:03:37 return self._inner_hookexec(hook, methods, kwargs)
> > > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/hooks.py", 
> > > line 286, in __call__
> > > 14:03:37 return self._hookexec(self, self.get_hookimpls(), kwargs)
> > > 14:03:37 File 
> > > 

[JIRA] (OVIRT-2828) Re: [ovirt-devel] Re: Check patch failure in vdsm

2019-11-12 Thread Nir Soffer (oVirt JIRA)
Nir Soffer created OVIRT-2828:
-

 Summary: Re: [ovirt-devel] Re: Check patch failure in vdsm
 Key: OVIRT-2828
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2828
 Project: oVirt - virtualization made easy
  Issue Type: By-EMAIL
Reporter: Nir Soffer
Assignee: infra


On Tue, Nov 12, 2019 at 5:34 PM Miguel Duarte de Mora Barroso
 wrote:
>
> On Mon, Nov 11, 2019 at 11:12 AM Eyal Shenitzky  wrote:
> >
> > Hi,
> >
> > I encounter the following error for py36 test in vdsm check patch:
> >
> > ...
> > ...
> > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/callers.py", 
> > line 187, in _multicall
> > 14:03:37 res = hook_impl.function(*args)
> > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", 
> > line 86, in 
> > 14:03:37 firstresult=hook.spec.opts.get("firstresult") if hook.spec else 
> > False,
> > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/manager.py", 
> > line 92, in _hookexec
> > 14:03:37 return self._inner_hookexec(hook, methods, kwargs)
> > 14:03:37 File "/usr/local/lib/python3.7/site-packages/pluggy/hooks.py", 
> > line 286, in __call__
> > 14:03:37 return self._hookexec(self, self.get_hookimpls(), kwargs)
> > 14:03:37 File 
> > "/home/jenkins/workspace/vdsm_standard-check-patch/vdsm/.tox/storage-py37/lib/python3.7/site-packages/_pytest/config/__init__.py",
> >  line 82, in main
> > 14:03:37 return config.hook.pytest_cmdline_main(config=config)
> > 14:03:37 File 
> > "/home/jenkins/workspace/vdsm_standard-check-patch/vdsm/.tox/storage-py37/bin/pytest",
> >  line 8, in 
> > 14:03:37 sys.exit(main())
> > 14:03:37 [Inferior 1 (process 22145) detached]
> > 14:03:37 =
> > 14:03:37 = Terminating watched process =
> > 14:03:37 =
> > 14:03:37 PROFILE {"command": ["python", "py-watch", "600", "pytest", "-m", 
> > "not (slow or stress)", "--durations=10", "--cov=vdsm.storage", 
> > "--cov-report=html:htmlcov-storage-py37", "--cov-fail-under=62", 
> > "storage"], "cpu": 39.921942808919184, "elapsed": 604.4699757099152, 
> > "idrss": 0, "inblock": 1693453, "isrss": 0, "ixrss": 0, "majflt": 2, 
> > "maxrss": 331172, "minflt": 5606489, "msgrcv": 0, "msgsnd": 0, "name": 
> > "storage-py37", "nivcsw": 139819, "nsignals": 0, "nswap": 0, "nvcsw": 
> > 187576, "oublock": 2495645, "start": 1573386812.7961884, "status": 143, 
> > "stime": 118.260961, "utime": 123.055197}
> > 14:03:37 ERROR: InvocationError for command 
> > /home/jenkins/workspace/vdsm_standard-check-patch/vdsm/.tox/storage-py37/bin/python
> >  profile storage-py37 python py-watch 600 pytest -m 'not (slow or stress)' 
> > --durations=10 --cov=vdsm.storage --cov-report=html:htmlcov-storage-py37 
> > --cov-fail-under=62 storage (exited with code 143)
> >
> >
> > Is there any known issue?
>
> Anyone able to pitch in ? I think something similar is happening in
> [0], also on check-patch [1].
>
> [0] - 
> https://jenkins.ovirt.org/blue/organizations/jenkins/vdsm_standard-check-patch/detail/vdsm_standard-check-patch/14234/

Yes it looks the same issue.

> [1] - https://gerrit.ovirt.org/#/c/104274/

Jenkins slaves have been very slow recently. I suspect we run too many jobs
concurrently or use too many virtual CPUs.
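The PROFILE line above reports "status": 143, which is 128 + SIGTERM(15): the py-watch wrapper terminated pytest after its 600-second budget, consistent with slow slaves rather than a test bug. A minimal sketch of such a watchdog (an illustration of the mechanism, not the actual py-watch source):

```python
import subprocess
import sys

def watch(timeout: float, cmd: list) -> int:
    """Run cmd; if it exceeds `timeout` seconds, send SIGTERM and
    report 143 (128 + SIGTERM), matching the shell's convention."""
    proc = subprocess.Popen(cmd)
    try:
        return proc.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        proc.terminate()  # like the "Terminating watched process" banner
        proc.wait()
        return 143

# e.g. watch(600, [sys.executable, "-m", "pytest", "-m", "not (slow or stress)"])
```

Raising the timeout only hides the symptom; reducing concurrent jobs or vCPU over-commit on the slaves addresses the cause.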





Build failed in Jenkins: system-sync_mirrors-fedora-updates-fc29-x86_64 #1105

2019-11-12 Thread jenkins
See 


Changes:


--
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on mirrors.phx.ovirt.org (mirrors) in workspace 

No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url http://gerrit.ovirt.org/jenkins.git # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
Pruning obsolete local branches
Fetching upstream changes from http://gerrit.ovirt.org/jenkins.git
 > git --version # timeout=10
 > git fetch --tags --progress --prune http://gerrit.ovirt.org/jenkins.git 
 > +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision edf61c5ec106227d6342b6e12e43f5d437a3727c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f edf61c5ec106227d6342b6e12e43f5d437a3727c
Commit message: "pipeline-loader: Allocate containers from code"
 > git rev-list --no-walk edf61c5ec106227d6342b6e12e43f5d437a3727c # timeout=10
[system-sync_mirrors-fedora-updates-fc29-x86_64] $ /bin/bash -xe 
/tmp/jenkins1059516635071455121.sh
+ jenkins/scripts/mirror_mgr.sh resync_yum_mirror fedora-updates-fc29 x86_64 
jenkins/data/mirrors-reposync.conf
+ MIRRORS_MP_BASE=/var/www/html/repos
+ MIRRORS_HTTP_BASE=http://mirrors.phx.ovirt.org/repos
+ MIRRORS_CACHE=/home/jenkins/mirrors_cache
+ MAX_LOCK_ATTEMPTS=120
+ LOCK_WAIT_INTERVAL=5
+ LOCK_BASE=/home/jenkins
+ OLD_MD_TO_KEEP=100
+ HTTP_SELINUX_TYPE=httpd_sys_content_t
+ HTTP_FILE_MODE=644
+ main resync_yum_mirror fedora-updates-fc29 x86_64 
jenkins/data/mirrors-reposync.conf
+ local command=resync_yum_mirror
+ command_args=("${@:2}")
+ local command_args
+ cmd_resync_yum_mirror fedora-updates-fc29 x86_64 
jenkins/data/mirrors-reposync.conf
+ local repo_name=fedora-updates-fc29
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local sync_needed
+ mkdir -p /home/jenkins/mirrors_cache
+ verify_repo_fs fedora-updates-fc29 yum
+ local repo_name=fedora-updates-fc29
+ local repo_type=yum
+ sudo install -o jenkins -d /var/www/html/repos/yum 
/var/www/html/repos/yum/fedora-updates-fc29 
/var/www/html/repos/yum/fedora-updates-fc29/base
+ check_yum_sync_needed fedora-updates-fc29 x86_64 
jenkins/data/mirrors-reposync.conf sync_needed
+ local repo_name=fedora-updates-fc29
+ local repo_archs=x86_64
+ local reposync_conf=jenkins/data/mirrors-reposync.conf
+ local p_sync_needed=sync_needed
+ local reposync_out
+ echo 'Checking if mirror needs a resync'
Checking if mirror needs a resync
+ rm -rf /home/jenkins/mirrors_cache/fedora-updates-fc29
++ IFS=,
++ echo x86_64
+ for arch in '$(IFS=,; echo $repo_archs)'
++ run_reposync fedora-updates-fc29 x86_64 jenkins/data/mirrors-reposync.conf 
--urls --quiet
++ local repo_name=fedora-updates-fc29
++ local repo_arch=x86_64
++ local reposync_conf=jenkins/data/mirrors-reposync.conf
++ extra_args=("${@:4}")
++ local extra_args
++ reposync --config=jenkins/data/mirrors-reposync.conf 
--repoid=fedora-updates-fc29 --arch=x86_64 
--cachedir=/home/jenkins/mirrors_cache 
--download_path=/var/www/html/repos/yum/fedora-updates-fc29/base --norepopath 
--newest-only --urls --quiet
Error setting up repositories: Cannot retrieve metalink for repository: 
fedora-updates-fc29. Please verify its path and try again
+ reposync_out=
Build step 'Execute shell' marked build as failure


[JIRA] (OVIRT-2827) Building vdsm with standard-manual-runner fails to clone the repo

2019-11-12 Thread Nir Soffer (oVirt JIRA)
Nir Soffer created OVIRT-2827:
-

 Summary: Building vdsm with standard-manual-runner fails to clone 
the repo
 Key: OVIRT-2827
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2827
 Project: oVirt - virtualization made easy
  Issue Type: By-EMAIL
Reporter: Nir Soffer
Assignee: infra


Building vdsm using standard-manual-runner fails now with:

ERROR: Error cloning remote repo 'origin'

hudson.plugins.git.GitException: Command "git fetch --tags --progress
https://gerrit.ovirt.org/jenkins +refs/heads/*:refs/remotes/origin/*"
returned status code 128:
stdout:
stderr: error: RPC failed; result=22, HTTP code = 503
fatal: The remote end hung up unexpectedly

https://jenkins.ovirt.org/blue/organizations/jenkins/standard-manual-runner/detail/standard-manual-runner/830/pipeline
https://jenkins.ovirt.org/blue/organizations/jenkins/standard-manual-runner/detail/standard-manual-runner/829/pipeline

Can someone look into this?
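An HTTP 503 from the gerrit server is typically transient (overload or a restart), so until the server side is investigated, a client-side retry around the fetch is a common mitigation. A hedged sketch (the function name and retry parameters are arbitrary choices, not an existing Jenkins feature):

```python
import subprocess
import time

def fetch_with_retry(cmd, attempts=3, delay=5):
    """Retry a command on non-zero exit, for transient failures
    such as an HTTP 503 from the remote."""
    for attempt in range(1, attempts + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result
        if attempt < attempts:
            time.sleep(delay * attempt)  # simple linear backoff
    raise RuntimeError(
        f"command failed after {attempts} attempts: {result.stderr.strip()}")

# e.g. fetch_with_retry(["git", "fetch", "--tags", "--progress",
#                        "https://gerrit.ovirt.org/jenkins",
#                        "+refs/heads/*:refs/remotes/origin/*"])
```

This only papers over the failure for callers; if the 503s persist, the gerrit service itself needs attention.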





[CQ]: 104490, 4 (vdsm) failed "ovirt-master" system tests, but isn't the failure root cause

2019-11-12 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue, including change
104490,4 (vdsm), failed. However, this change does not seem to be the root
cause of the failure. Change 104040,10 (vdsm), which this change depends on or
is based on, was detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 104040,10 (vdsm) is fixed and
this change is updated to refer to, or rebased on, the fixed version, or this
change is modified to no longer depend on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/104490/4

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/104040/10

For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/16858/


Build failed in Jenkins: system-sync_mirrors-epel-el7-s390x-x86_64 #20

2019-11-12 Thread jenkins
See 


Changes:


--
[...truncated 45.41 KB...]
(1/160): python-qpid-proto 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python-rosdep-doc-0.16.1-2.el7 FAILED  
(1/160): python-rosdep-doc 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python-rosdistro-doc-0.7.5-1.e FAILED  
(1/160): python-rosdistro- 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-acme-0.39.0-1.el7.noar FAILED  
(1/160): python2-acme-0.39 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-apprise-0.8.1-1.el7.no FAILED  
(1/160): python2-apprise-0 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-argh-0.26.1-6.el7.noar FAILED  
(1/160): python2-argh-0.26 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-0.39.0-1.el7.n FAILED  
(1/160): python2-certbot-0 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-apache-0.39.0- FAILED  
(1/160): python2-certbot-a 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-cloudflare FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-cloudxns-0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-digitaloce FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-dnsimple-0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-dnsmadeeas FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-gehirn-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-google-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-linode-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-luadns-0.3 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-nsone-0.39 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-ovh-0.39.0 FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-rfc2136-0. FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-route53-0. FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-dns-sakuraclou FAILED  
(1/160): python2-certbot-d 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-certbot-nginx-0.39.0-1 FAILED  
(1/160): python2-certbot-n 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-copr-1.98-1.el7.noarch FAILED  
(1/160): python2-copr-1.98 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dictdiffer-0.7.1-8.el7 FAILED  
(1/160): python2-dictdiffe 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon-3.3.4-2.el FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+easyname-3 FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+gratisdns- FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+henet-3.3. FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+plesk-3.3. FAILED  
(1/160): python2-dns-lexic 0% [ ]  0.0 B/s |0 B   --:-- ETA 
python2-dns-lexicon+route53-3. FAILED

[CQ]: 104491,4 (vdsm) failed "ovirt-master" system tests, but isn't the failure root cause

2019-11-12 Thread oVirt Jenkins
A system test invoked by the "ovirt-master" change queue including change
104491,4 (vdsm) failed. However, this change seems not to be the root cause for
this failure. Change 104040,10 (vdsm) that this change depends on or is based
on, was detected as the cause of the testing failures.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until either change 104040,10 (vdsm) is fixed and
this change is updated to refer to or rebased on the fixed version, or this
change is modified to no longer depend on it.

For further details about the change see:
https://gerrit.ovirt.org/#/c/104491/4

For further details about the change that seems to be the root cause behind the
testing failures see:
https://gerrit.ovirt.org/#/c/104040/10

For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/16855/


[JIRA] (OVIRT-2823) Workspace deleted while job was running

2019-11-12 Thread Anton Marchukov (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-2823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anton Marchukov reassigned OVIRT-2823:
--

Assignee: Evgheni Dereveanchin  (was: infra)

> Workspace deleted while job was running
> ---
>
> Key: OVIRT-2823
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2823
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: Sandro Bonazzola
>Assignee: Evgheni Dereveanchin
>
> missing workspace
> /home/jenkins/workspace/ovirt-release_standard-check-patch/ovirt-release on
> vm0013.workers-phx.ovirt.org
> caused job to fail at
> https://jenkins.ovirt.org/blue/organizations/jenkins/ovirt-release_standard-check-patch/detail/ovirt-release_standard-check-patch/199/pipeline
> There's something deleting workspaces while they are in use.
> Please investigate.
> -- 
> Sandro Bonazzola



--
This message was sent by Atlassian Jira
(v1001.0.0-SNAPSHOT#100114)


[JIRA] (OVIRT-2823) Workspace deleted while job was running

2019-11-12 Thread Anton Marchukov (oVirt JIRA)

[ 
https://ovirt-jira.atlassian.net/browse/OVIRT-2823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=39954#comment-39954
 ] 

Anton Marchukov commented on OVIRT-2823:


there is jenkins job system-cleanuo-workspaces, might be related to it. We need 
to make sure tis job is removed.

> Workspace deleted while job was running
> ---
>
> Key: OVIRT-2823
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2823
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: Sandro Bonazzola
>Assignee: infra
>
> missing workspace
> /home/jenkins/workspace/ovirt-release_standard-check-patch/ovirt-release on
> vm0013.workers-phx.ovirt.org
> caused job to fail at
> https://jenkins.ovirt.org/blue/organizations/jenkins/ovirt-release_standard-check-patch/detail/ovirt-release_standard-check-patch/199/pipeline
> There's something deleting workspaces while they are in use.
> Please investigate.
> -- 
> Sandro Bonazzola





[JIRA] (OVIRT-2823) Workspace deleted while job was running

2019-11-12 Thread Anton Marchukov (oVirt JIRA)

[ 
https://ovirt-jira.atlassian.net/browse/OVIRT-2823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=39954#comment-39954
 ] 

Anton Marchukov edited comment on OVIRT-2823 at 11/12/19 12:25 PM:
---

there is jenkins job system-cleanuo-workspaces, might be related to it. We need 
to make sure this job is disabled and then removed.


was (Author: amarchuk):
there is jenkins job system-cleanuo-workspaces, might be related to it. We need 
to make sure tis job is removed.

> Workspace deleted while job was running
> ---
>
> Key: OVIRT-2823
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-2823
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: Sandro Bonazzola
>Assignee: infra
>
> missing workspace
> /home/jenkins/workspace/ovirt-release_standard-check-patch/ovirt-release on
> vm0013.workers-phx.ovirt.org
> caused job to fail at
> https://jenkins.ovirt.org/blue/organizations/jenkins/ovirt-release_standard-check-patch/detail/ovirt-release_standard-check-patch/199/pipeline
> There's something deleting workspaces while they are in use.
> Please investigate.
> -- 
> Sandro Bonazzola





Re: Proposing Marcin Sobczyk as VDSM infra maintainer

2019-11-12 Thread Marcin Sobczyk



On 11/12/19 9:56 AM, Dan Kenigsberg wrote:

> On Mon, Nov 11, 2019 at 3:36 PM Nir Soffer  wrote:
>
>> On Thu, Nov 7, 2019 at 4:13 PM Martin Perina  wrote:
>>
>>> Hi,
>>>
>>> Marcin joined the infra team more than a year ago and during this time he
>>> contributed a lot to VDSM packaging, improved automation and ported all
>>> infra team parts of VDSM (jsonrpc, ssl, vdsm-client, hooks infra, ...) to
>>> Python 3. He is a very nice person to talk to, is usually very responsive
>>> and cares a lot about code quality.
>>>
>>> So I'd like to propose Marcin as VDSM infra maintainer.
>>>
>>> Please share your thoughts.
>>
>> Marcin has practically been the vdsm infra maintainer for a while now, so
>> it will be a good idea to make this official.
>>
>> I hope we can get more contributors to vdsm infra; having one maintainer
>> who is also the only contributor is not enough for a complicated project
>> like vdsm.
>>
>> Nir
>
> This seems like a +2.
> Congratulations, Marcin. This is a serious responsibility, use it carefully!

Thank you all for the nomination and your support!
I will do my best to fulfil the role of a VDSM maintainer.

Regards, Marcin

> Dear infra, please add msobczyk to vdsm-maintainers group.




Re: Proposing Marcin Sobczyk as VDSM infra maintainer

2019-11-12 Thread Dan Kenigsberg
On Mon, Nov 11, 2019 at 3:36 PM Nir Soffer  wrote:
>
> On Thu, Nov 7, 2019 at 4:13 PM Martin Perina  wrote:
> >
> > Hi,
> >
> > Marcin joined the infra team more than a year ago and during this time he
> > contributed a lot to VDSM packaging, improved automation and ported all
> > infra team parts of VDSM (jsonrpc, ssl, vdsm-client, hooks infra, ...) to
> > Python 3. He is a very nice person to talk to, is usually very responsive
> > and cares a lot about code quality.
> >
> > So I'd like to propose Marcin as VDSM infra maintainer.
> >
> > Please share your thoughts.
>
> Marcin has practically been the vdsm infra maintainer for a while now, so it
> will be a good idea to make this official.
>
> I hope we can get more contributors to vdsm infra; having one maintainer who
> is also the only contributor is not enough for a complicated project like
> vdsm.
>
> Nir

This seems like a +2.
Congratulations, Marcin. This is a serious responsibility, use it carefully!

Dear infra, please add msobczyk to vdsm-maintainers group.


[CQ]: 104562,1 (ovirt-imageio) failed "ovirt-master" system tests

2019-11-12 Thread oVirt Jenkins
Change 104562,1 (ovirt-imageio) is probably the reason behind recent system
test failures in the "ovirt-master" change queue and needs to be fixed.

This change has been removed from the testing queue. Artifacts built from this
change will not be released until it is fixed.

For further details about the change see:
https://gerrit.ovirt.org/#/c/104562/1

For failed test results see:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/16852/