Re: [ovirt-devel] [VDSM] network test failure - failing all builds

2017-05-12 Thread Edward Haas
It may have been a patch (not necessarily one that got merged) that messed
up the CI hosts.
Looks better now, recent checks passed.

Will try to monitor it later on.

Thanks,
Edy.

On Sat, May 13, 2017 at 12:45 AM, Edward Haas  wrote:

> Very strange (and very late), trying to simulate locally.
>
> On Sat, May 13, 2017 at 12:22 AM, Nir Soffer  wrote:
>
>> Test trends:
>> http://jenkins.ovirt.org/job/vdsm_master_check-patch-fc25-x8
>> 6_64/buildTimeTrend
>> http://jenkins.ovirt.org/job/vdsm_master_check-patch-el7-x86
>> _64/buildTimeTrend
>> 
>>
>> On Sat, May 13, 2017 at 12:16 AM Nir Soffer  wrote:
>>
>>> I'm now seeing this failure on multiple unrelated patches.
>>>
>>> Seems to be related to these patches merged today:
>>> https://gerrit.ovirt.org/#/q/topic:ip_refactoring+is:merged
>>>
>>> 21:04:36 ======================================================================
>>> 21:04:36 FAIL: test_add_delete_and_read_rule (network.ip_rule_test.TestIpRule)
>>> 21:04:36 ----------------------------------------------------------------------
>>> 21:04:36 Traceback (most recent call last):
>>> 21:04:36   File "/home/jenkins/workspace/vdsm_master_check-patch-el7-x86_64/vdsm/tests/network/ip_rule_test.py", line 47, in test_add_delete_and_read_rule
>>> 21:04:36     self.assertEqual(1, len(rules))
>>> 21:04:36 AssertionError: 1 != 2
>>> 21:04:36 -------------------- >> begin captured logging << --------------------
>>> 21:04:36 2017-05-12 21:04:18,381 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule add from all to 192.168.99.1 dev lo table main (cwd None) (commands:69)
>>> 21:04:36 2017-05-12 21:04:18,393 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
>>> 21:04:36 2017-05-12 21:04:18,393 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule (cwd None) (commands:69)
>>> 21:04:36 2017-05-12 21:04:18,404 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
>>> 21:04:36 2017-05-12 21:04:18,406 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule del from all to 192.168.99.1 dev lo table main (cwd None) (commands:69)
>>> 21:04:36 2017-05-12 21:04:18,417 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
>>> 21:04:36 --------------------- >> end captured logging << ---------------------
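The assertion above counts every rule the test reads back, so a single leftover rule on the CI host (for example from an earlier aborted run) is enough to turn len(rules) into 2. A minimal, hypothetical sketch of a more defensive variant that checks for the specific rule it added rather than the total count (read_rules() and the test class are made up, not the vdsm helpers, and it needs root):

    import subprocess
    import unittest

    DEST = "192.168.99.1"
    RULE_ARGS = ["from", "all", "to", DEST, "dev", "lo", "table", "main"]


    def read_rules():
        """Return the output of '/sbin/ip rule' as a list of stripped lines."""
        out = subprocess.check_output(["/sbin/ip", "rule"]).decode()
        return [line.strip() for line in out.splitlines() if line.strip()]


    class IpRuleSketchTest(unittest.TestCase):
        def test_added_rule_is_listed(self):
            subprocess.check_call(["/sbin/ip", "rule", "add"] + RULE_ARGS)
            try:
                # Look for the rule that was just added; asserting on the
                # total rule count breaks as soon as a stale rule is left
                # behind on the host, which is what "1 != 2" suggests here.
                self.assertTrue(any(DEST in rule for rule in read_rules()))
            finally:
                subprocess.check_call(["/sbin/ip", "rule", "del"] + RULE_ARGS)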
>>>
>>>
>>>
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] [VDSM] network test failure - failing all builds

2017-05-12 Thread Edward Haas
Very strange (and very late), trying to simulate locally.

On Sat, May 13, 2017 at 12:22 AM, Nir Soffer  wrote:

> Test trends:
> http://jenkins.ovirt.org/job/vdsm_master_check-patch-fc25-
> x86_64/buildTimeTrend
> http://jenkins.ovirt.org/job/vdsm_master_check-patch-el7-
> x86_64/buildTimeTrend
> 
>
> On Sat, May 13, 2017 at 12:16 AM Nir Soffer  wrote:
>
>> I'm now seeing this failure on multiple unrelated patches.
>>
>> Seems to be related to these patches merged today:
>> https://gerrit.ovirt.org/#/q/topic:ip_refactoring+is:merged
>>
>> 21:04:36 ======================================================================
>> 21:04:36 FAIL: test_add_delete_and_read_rule (network.ip_rule_test.TestIpRule)
>> 21:04:36 ----------------------------------------------------------------------
>> 21:04:36 Traceback (most recent call last):
>> 21:04:36   File "/home/jenkins/workspace/vdsm_master_check-patch-el7-x86_64/vdsm/tests/network/ip_rule_test.py", line 47, in test_add_delete_and_read_rule
>> 21:04:36     self.assertEqual(1, len(rules))
>> 21:04:36 AssertionError: 1 != 2
>> 21:04:36 -------------------- >> begin captured logging << --------------------
>> 21:04:36 2017-05-12 21:04:18,381 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule add from all to 192.168.99.1 dev lo table main (cwd None) (commands:69)
>> 21:04:36 2017-05-12 21:04:18,393 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
>> 21:04:36 2017-05-12 21:04:18,393 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule (cwd None) (commands:69)
>> 21:04:36 2017-05-12 21:04:18,404 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
>> 21:04:36 2017-05-12 21:04:18,406 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule del from all to 192.168.99.1 dev lo table main (cwd None) (commands:69)
>> 21:04:36 2017-05-12 21:04:18,417 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
>> 21:04:36 --------------------- >> end captured logging << ---------------------
>>
>>
>>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] [VDSM] network test failure - failing all builds

2017-05-12 Thread Nir Soffer
Test trends:
http://jenkins.ovirt.org/job/vdsm_master_check-patch-fc25-x86_64/buildTimeTrend
http://jenkins.ovirt.org/job/vdsm_master_check-patch-el7-x86_64/buildTimeTrend


On Sat, May 13, 2017 at 12:16 AM Nir Soffer  wrote:

> I'm now seeing this failure on multiple unrelated patches.
>
> Seems to be related to these patches merged today:
> https://gerrit.ovirt.org/#/q/topic:ip_refactoring+is:merged
>
> 21:04:36 ======================================================================
> 21:04:36 FAIL: test_add_delete_and_read_rule (network.ip_rule_test.TestIpRule)
> 21:04:36 ----------------------------------------------------------------------
> 21:04:36 Traceback (most recent call last):
> 21:04:36   File "/home/jenkins/workspace/vdsm_master_check-patch-el7-x86_64/vdsm/tests/network/ip_rule_test.py", line 47, in test_add_delete_and_read_rule
> 21:04:36     self.assertEqual(1, len(rules))
> 21:04:36 AssertionError: 1 != 2
> 21:04:36 -------------------- >> begin captured logging << --------------------
> 21:04:36 2017-05-12 21:04:18,381 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule add from all to 192.168.99.1 dev lo table main (cwd None) (commands:69)
> 21:04:36 2017-05-12 21:04:18,393 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
> 21:04:36 2017-05-12 21:04:18,393 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule (cwd None) (commands:69)
> 21:04:36 2017-05-12 21:04:18,404 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
> 21:04:36 2017-05-12 21:04:18,406 DEBUG (MainThread) [root] /usr/bin/taskset --cpu-list 0-1 /sbin/ip rule del from all to 192.168.99.1 dev lo table main (cwd None) (commands:69)
> 21:04:36 2017-05-12 21:04:18,417 DEBUG (MainThread) [root] SUCCESS:  = '';  = 0 (commands:93)
> 21:04:36 --------------------- >> end captured logging << ---------------------
>
>
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [VDSM] Fedora rawhide build - works!

2017-05-12 Thread Nir Soffer
Hi all,

We now have a Fedora rawhide build on Travis:
https://travis-ci.org/nirs/vdsm/builds/231649060

To make it work, I had to:

- disable the kvm2ovirt import test, since ovirt-imageio-common
  is not available for rawhide in the oVirt repos.
  Building ovirt-imageio for rawhide should be easy.

- disable 2 tests importing blivet, since importing it fails
  with ValueError on rawhide (blivet bug?); a skip-guard sketch
  follows below.
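Not the actual patch, just a sketch of the skip-guard pattern mentioned in the second bullet, with hypothetical test names; on rawhide the blivet import currently raises ValueError rather than ImportError, hence the broad except:

    import unittest

    try:
        import blivet
    except Exception:  # ValueError on rawhide, ImportError where not packaged
        blivet = None


    @unittest.skipIf(blivet is None, "blivet is not importable on this platform")
    class BlivetDependentTest(unittest.TestCase):
        def test_blivet_is_usable(self):
            self.assertIsNotNone(blivet)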

The build is using ovirtorg/vdsm-test-fedora-rawhide
image.

Please review:
https://gerrit.ovirt.org/#/q/topic:rawhide

Cheers,
Nir
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Vdsm merge rights

2017-05-12 Thread Dan Kenigsberg
infra-support, we have approval. Please add fromani to vdsm-maintainers.


On Fri, May 12, 2017 at 4:05 PM, Adam Litke  wrote:
> +2 :)
>
> On Fri, May 12, 2017 at 6:16 AM, Nir Soffer  wrote:
>>
>> +1
>>
>> On Fri, May 12, 2017 at 12:59, Fabian Deutsch
>> wrote:
>>>
>>> +1
>>>
>>> On Fri, May 12, 2017 at 11:25 AM, Edward Haas  wrote:
>>> > Good news! +2
>>> >
>>> > On Fri, May 12, 2017 at 11:27 AM, Piotr Kliczewski
>>> > 
>>> > wrote:
>>> >>
>>> >> +1
>>> >>
>>> >> On Fri, May 12, 2017 at 9:14 AM, Dan Kenigsberg 
>>> >> wrote:
>>> >>>
>>> >>> I'd like to nominate Francesco to the vdsm-maintainers
>>> >>>
>>> >>>
>>> >>> https://gerrit.ovirt.org/#/admin/groups/uuid-becbf722723417c336de6c1646749678acae8b89
>>> >>> list, so he can merge patches without waiting for Nir, Adam or me.
>>> >>>
>>> >>> I believe that he proved to be thorough and considerate (and
>>> >>> paranoid)
>>> >>> as the job requires.
>>> >>>
>>> >>> Vdsm maintainers, please approve.
>>> >>>
>>> >>> Dan
>>> >>
>>> >>
>>> >
>>> >
>>> > ___
>>> > Devel mailing list
>>> > Devel@ovirt.org
>>> > http://lists.ovirt.org/mailman/listinfo/devel
>>> ___
>>> Devel mailing list
>>> Devel@ovirt.org
>>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>>
>> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>
>
>
>
> --
> Adam Litke
>
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Fwd: UPDATE: Bugzilla 5 beta window extended

2017-05-12 Thread Sandro Bonazzola
FYI

-- Forwarded message --
From: Christine Freitas 
Date: Thu, May 11, 2017 at 6:05 PM
Subject: UPDATE: Bugzilla 5 beta window extended

Hello All,

Due to issues found during testing, the Bugzilla 5 beta server [1] was put
into maintenance mode. Therefore, we are extending the beta window by 4
weeks, once the server comes back online. We are targeting early next week
to have the beta available to you.

Once the server comes back online, we will update you on the date by which
feedback is due.

We sincerely apologize for any inconvenience, and look forward to your
feedback once we are back online.

Cheers, the Red Hat Bugzilla team

For more information on the Bugzilla 5 beta refer to:

https://beta-bugzilla.redhat.com/page.cgi?id=whats-new.html

https://beta-bugzilla.redhat.com/page.cgi?id=release-notes.html

https://beta-bugzilla.redhat.com/page.cgi?id=faq.html

https://beta-bugzilla.redhat.com/docs/en/html/using/index.html

https://beta-bugzilla.redhat.com/docs/en/html/api/index.html

Cheers, the Red Hat Bugzilla team.

1: https://beta-bugzilla.redhat.com/




-- 

SANDRO BONAZZOLA

ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D

Red Hat EMEA 

TRIED. TESTED. TRUSTED. 
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Barak Korren
On 12 May 2017 at 16:00, Eyal Edri  wrote:
>
>> Maybe, I guess we can run manual with it and add newer SDK to
>> extra_sources then if we see it is getting installed in mock we know
>> it works.
>>
>> But do we have fixed SDK for 4.1 ?
>
> Yes, should be on latest experimental.
> The patch was merged:
>
> https://gerrit.ovirt.org/#/c/76714/
>
And do we still have a broken sdk package in "testing" for 4.1?

If so please try running manual for 4.1 with my patch.


-- 
Barak Korren
RHV DevOps team , RHCE, RHCi
Red Hat EMEA
redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


Re: [ovirt-devel] Vdsm merge rights

2017-05-12 Thread Adam Litke
+2 :)

On Fri, May 12, 2017 at 6:16 AM, Nir Soffer  wrote:

> +1
>
> On Fri, May 12, 2017 at 12:59, Fabian Deutsch <
> fdeut...@redhat.com> wrote:
>
>> +1
>>
>> On Fri, May 12, 2017 at 11:25 AM, Edward Haas  wrote:
>> > Good news! +2
>> >
>> > On Fri, May 12, 2017 at 11:27 AM, Piotr Kliczewski > >
>> > wrote:
>> >>
>> >> +1
>> >>
>> >> On Fri, May 12, 2017 at 9:14 AM, Dan Kenigsberg 
>> wrote:
>> >>>
>> >>> I'd like to nominate Francesco to the vdsm-maintainers
>> >>>
>> >>> https://gerrit.ovirt.org/#/admin/groups/uuid-
>> becbf722723417c336de6c1646749678acae8b89
>> >>> list, so he can merge patches without waiting for Nir, Adam or me.
>> >>>
>> >>> I believe that he proved to be thorough and considerate (and paranoid)
>> >>> as the job requires.
>> >>>
>> >>> Vdsm maintainers, please approve.
>> >>>
>> >>> Dan
>> >>
>> >>
>> >
>> >
>> > ___
>> > Devel mailing list
>> > Devel@ovirt.org
>> > http://lists.ovirt.org/mailman/listinfo/devel
>> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
>



-- 
Adam Litke
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Eyal Edri
On Fri, May 12, 2017 at 3:23 PM, Barak Korren  wrote:

> On 12 May 2017 at 15:08, Eyal Edri  wrote:
> >
> >
> >
> > Yes, removed it from both.
> > However, experimental then run on a new slave which I didn't clean and
> also passed.
>
>
> I guess we just got lucky there
>
> >
> > Great!
> > Can we verify it fixes 4.1 ?
>
> Maybe, I guess we can run manual with it and add newer SDK to
> extra_sources then if we see it is getting installed in mock we know
> it works.
>
> But do we have fixed SDK for 4.1 ?
>

Yes, should be on latest experimental.
The patch was merged:

https://gerrit.ovirt.org/#/c/76714/


>
> --
> Barak Korren
> RHV DevOps team , RHCE, RHCi
> Red Hat EMEA
> redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
>



-- 

Eyal edri


ASSOCIATE MANAGER

RHV DevOps

EMEA VIRTUALIZATION R&D


Red Hat EMEA 
 TRIED. TESTED. TRUSTED. 
phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Eyal Edri
On Fri, May 12, 2017 at 3:06 PM, Barak Korren  wrote:

>
>
> On 12 May 2017 at 14:24, Eyal Edri  wrote:
>
>> BTW,
>> master experimental was fixed by removing the old broken pkg from the
>> repo.
>> If we know the root cause of the error, we can test the fix on the 4.1
>> job now.
>>
>
> from where did you remove it? It made it into 'tested', so you probably
> also needed to remove it from the local cache on the slave.
>

Yes, removed it from both.
However, experimental then run on a new slave which I didn't clean and also
passed.


>
> The root cause was the bad SDK package.
>
> But it also uncovered the fact that Daniel's new patch was not really
> working and not taking the SDK from experimental. This made the fixed SDK
> not come into play.
>
> I've a fix patch here:
> https://gerrit.ovirt.org/#/c/76765/
>

Great!
Can we verify it fixes 4.1 ?


>
>
> --
> Barak Korren
> RHV DevOps team , RHCE, RHCi
> Red Hat EMEA
> redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
>



-- 

Eyal edri


ASSOCIATE MANAGER

RHV DevOps

EMEA VIRTUALIZATION R&D


Red Hat EMEA 
 TRIED. TESTED. TRUSTED. 
phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Barak Korren
On 12 May 2017 at 14:24, Eyal Edri  wrote:

> BTW,
> master experimental was fixed by removing the old broken pkg from the repo.
> If we know the root cause of the error, we can test the fix on the 4.1 job
> now.
>

from where did you remove it? It made it into 'tested', so you probably also
needed to remove it from the local cache on the slave.

The root cause was the bad SDK package.

But it also uncovered the fact that Daniel's new patch was not really
working and not taking the SDK from experimental. This made the fixed SDK
not come into play.

I've a fix patch here:
https://gerrit.ovirt.org/#/c/76765/

-- 
Barak Korren
RHV DevOps team , RHCE, RHCi
Red Hat EMEA
redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Barak Korren
On 12 May 2017 at 14:02, Daniel Belenky  wrote:
> So repoman pulls both of the versions to the internal repo? I think we're
> running repoman with only latest flag...
>
No, the older version comes from 'tested'.


-- 
Barak Korren
RHV DevOps team , RHCE, RHCi
Red Hat EMEA
redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Barak Korren
OK, I found the issue:

PATH_TO_CONFIG=/etc/yum.repos.d/internal.repo


'/etc/yum.repos.d' is intentionally disabled in mock. The configuration
should have been placed directly in /etc/yum/yum.conf instead.
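If that is indeed the culprit, one way to expose the internal repo inside the chroot is to extend the yum configuration that mock itself generates, instead of dropping a file under /etc/yum.repos.d. A rough sketch of such a mock .cfg fragment (the repo id, baseurl and option values are made up, and it assumes the base config already defines config_opts['yum.conf'], as standard mock configs do):

    # Hypothetical addition to the mock chroot config used by the CI job.
    # Mock ignores /etc/yum.repos.d inside the chroot, so extra repos have to
    # end up in the yum configuration that mock writes for the build root.
    config_opts['yum.conf'] += """
    [internal]
    name=internal CI repo
    baseurl=http://repo.example.com/internal/el7/
    enabled=1
    gpgcheck=0
    metadata_expire=0
    """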




On 12 May 2017 at 13:43, Barak Korren  wrote:
> So, yum is installing the older version even though it has a newer one
> visible in a repo it is configured to use? I guess it's not reading the
> updated repodata then.
> We need to try and add 'yum clean metadata' after we configure the
> localrepo in the mock environment.
>
> On 12 May 2017 at 12:29, Anton Marchukov  wrote:
>> Hello Barak.
>>
>> Yes. repoman pulls the latest version and that version is in latest and
>> latest.under_test on resources. Additionally it is proven by lago.log too.
>>
>> The only problem seems to be the mock env that runs the python itself.
>>
>> Anton.
>>
>> On Fri, May 12, 2017 at 11:03 AM, Barak Korren  wrote:
>>>
>>> Anton, are you seeing repoman pull the right version in the lago logs? We
>>> need to know if it makes it into the Lago local repo or not.
>>>
>>> Barak Korren
>>> bkor...@redhat.com
>>> RHCE, RHCi, RHV-DevOps Team
>>> https://ifireball.wordpress.com/
>>>
>>> On 12 May 2017 at 11:13, "Anton Marchukov"  wrote:

 Hello Ondra.

 Yes I see it installs the old version, e.g. the latest master run at [1]
 installs:

 07:43:13 [basic_suit_el7] Updated:
 07:43:13 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64
 0:4.2.0-1.a0.20170511git210c375.el7.centos


 while the latest version is indeed
 python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm

 Just for the record: latest and latest.under_test have correct version of
 the package, so it does not look to be a repoman bug.

 Checking OST sources now...

 [1]
 http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/6651/consoleFull

 On Fri, May 12, 2017 at 9:43 AM, Ondra Machacek 
 wrote:
>
> Hello Anton,
>
> So I've bumped the version, but it's still installing the old one.
> The bumped version:
>
>
> python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm
>
> Log from OST run:
>
> 07:25:59 [upgrade-from-release_suit_el7]
> 
> 07:25:59 [upgrade-from-release_suit_el7]  Package  Arch
> VersionRepository Size
> 07:25:59 [upgrade-from-release_suit_el7]
> 
> 07:25:59 [upgrade-from-release_suit_el7] Installing:
> 07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4
> x86_64 4.2.0-1.a0.20170511git210c375.el7.centos
> 07:25:59 [upgrade-from-release_suit_el7]
> ovirt-master-snapshot 446 k
> 07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
> 07:25:59 [upgrade-from-release_suit_el7]  python-enum34
> noarch 1.0.4-1.el7centos-base-el752 k
> 07:25:59 [upgrade-from-release_suit_el7]
> 07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
> 07:25:59 [upgrade-from-release_suit_el7]
> 
>
>
> On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
> wrote:
>>
>> Hello Ondra.
>>
>> Thanks.
>>
>> It seems that the manual job populates SDK from custom repo only for
>> the VMs under test, but the mock where the python test code runs does not
>> use it from there. So the release of bumped version will be good idea.
>>
>> Anton.
>>
>> On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
>> wrote:
>>>
>>>
>>>
>>> On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov 
>>> wrote:

 On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek 
 wrote:
>
>
>> 15:50:44 [basic_suit_el7] Updated:
>>
>> 15:50:44 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64
>> 0:4.2.0-1.a0.20170511git210c375.el7.centos
>
>
> This is incorrect version. The correct one is:
>
>
> python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.centos.x86_64.rpm
>
> From this build:
>
>
> http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_master_build-artifacts-el7-x86_64/71/



 Sounds like we have a problem if the versions differ only by their git
 hashes. They are not ordered.

 I suggest we just merge the version bump at
 https://gerrit.ovirt.org/#/c/76732/ and then see 

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Daniel Belenky
So repoman pulls both of the versions to the internal repo? I think we're
running repoman with only latest flag...

On May 12, 2017 12:29 PM, "Anton Marchukov"  wrote:

> Hello Barak.
>
> Yes. repoman pulls the latest version and that version is in latest and
> latest.under_test on resources. Additionally it is proven by lago.log too.
>
> The only problem seems to be the mock env that runs the python itself.
>
> Anton.
>
> On Fri, May 12, 2017 at 11:03 AM, Barak Korren  wrote:
>
>> Anton, are you seeing repoman pull the right version in the lago logs? We
>> need to know if it makes it into the Lago local repo or not.
>>
>> Barak Korren
>> bkor...@redhat.com
>> RHCE, RHCi, RHV-DevOps Team
>> https://ifireball.wordpress.com/
>>
>> On 12 May 2017 at 11:13, "Anton Marchukov"  wrote:
>>
>>> Hello Ondra.
>>>
>>> Yes I see it installs the old version, e.g. the latest master run at [1]
>>> installs:
>>>
>>> 07:43:13 [basic_suit_el7] Updated:
>>> 07:43:13 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 0:4.2.0-1.a0.20170511git210c375.el7.centos
>>>
>>>
>>> while the latest version is indeed  python-ovirt-engine-sdk4-4.2.
>>> 0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm
>>>
>>> Just for the record: latest and latest.under_test have correct version
>>> of the package, so it does not look to be a repoman bug.
>>>
>>> Checking OST sources now...
>>>
>>> [1] http://jenkins.ovirt.org/job/test-repo_ovirt_experimenta
>>> l_master/6651/consoleFull
>>>
>>> On Fri, May 12, 2017 at 9:43 AM, Ondra Machacek 
>>> wrote:
>>>
 Hello Anton,

 So I've bumped the version, but it's still installing the old one.
 The bumped version:

  python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.
 centos.x86_64.rpm
 

 Log from OST run:

 07:25:59 [upgrade-from-release_suit_el7]
 07:25:59 [upgrade-from-release_suit_el7]  Package  Arch  Version  Repository  Size
 07:25:59 [upgrade-from-release_suit_el7]
 07:25:59 [upgrade-from-release_suit_el7] Installing:
 07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4  x86_64  4.2.0-1.a0.20170511git210c375.el7.centos
 07:25:59 [upgrade-from-release_suit_el7]      ovirt-master-snapshot  446 k
 07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
 07:25:59 [upgrade-from-release_suit_el7]  python-enum34  noarch  1.0.4-1.el7  centos-base-el7  52 k
 07:25:59 [upgrade-from-release_suit_el7]
 07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
 07:25:59 [upgrade-from-release_suit_el7]


 On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
 wrote:

> Hello Ondra.
>
> Thanks.
>
> It seems that the manual job populates SDK from custom repo only for
> the VMs under test, but the mock where the python test code runs does not
> use it from there. So the release of bumped version will be good idea.
>
> Anton.
>
> On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
> wrote:
>
>>
>>
>> On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov > > wrote:
>>
>>> On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek >> > wrote:
>>>

 *15:50:44* [basic_suit_el7] Updated:
>
> *15:50:44* [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 
> 0:4.2.0-1.a0.20170511git210c375.el7.centos
>
>
 This is incorrect version. The correct one is:

  python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.
 centos.x86_64.rpm
 

 From this build:

  http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_maste
 r_build-artifacts-el7-x86_64/71/

>>>
>>>
>>> Sounds like we have a problem if the versions differ only by their git
>>> hashes. They are not ordered.
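To see concretely why releases that differ only in their git hash do not sort by commit order, the comparison can be reproduced with the rpm Python bindings. A small sketch (the EVR strings come from this thread; everything else is illustrative, and the printed values are the expected results, not verified output):

    # Sketch using rpm.labelCompare from the rpm Python bindings.
    # EVR tuples are (epoch, version, release).
    import rpm

    a = ("0", "4.2.0", "1.a0.20170511git210c375.el7.centos")
    b = ("0", "4.2.0", "1.a0.20170511gitcd0adb4.el7.centos")
    c = ("0", "4.2.0", "1.a1.20170512git7c40be2.el7.centos")

    # rpm compares release strings segment by segment and treats a numeric
    # segment as newer than an alphabetic one, so "210c375" beats "cd0adb4"
    # even though cd0adb4 is the later build: the hash carries no ordering.
    print(rpm.labelCompare(a, b))   # expected: 1 (a considered newer than b)

    # Bumping the alpha counter (a0 -> a1) makes the ordering deterministic.
    print(rpm.labelCompare(c, a))   # expected: 1
    print(rpm.labelCompare(c, b))   # expected: 1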
>>>
>>> I suggest we just merge the version bump at
>>> https://gerrit.ovirt.org/#/c/76732/ and then see which version it
>>> will install.

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Barak Korren
So, yum is installing the older version even though it has a newer one
visible in a repo it is configured to use? I guess it's not reading the
updated repodata then.
We need to try and add 'yum clean metadata' after we configure the
localrepo in the mock environment.

On 12 May 2017 at 12:29, Anton Marchukov  wrote:
> Hello Barak.
>
> Yes. repoman pulls the latest version and that version is in latest and
> latest.under_test on resources. Additionally it is proven by lago.log too.
>
> The only problem seems to be the mock env that runs the python itself.
>
> Anton.
>
> On Fri, May 12, 2017 at 11:03 AM, Barak Korren  wrote:
>>
>> Anton, are you seeing repoman pull the right version in the lago logs? We
>> need to know if it makes it into the Lago local repo or not.
>>
>> Barak Korren
>> bkor...@redhat.com
>> RHCE, RHCi, RHV-DevOps Team
>> https://ifireball.wordpress.com/
>>
>> On 12 May 2017 at 11:13, "Anton Marchukov"  wrote:
>>>
>>> Hello Ondra.
>>>
>>> Yes I see it installs the old version, e.g. the latest master run at [1]
>>> installs:
>>>
>>> 07:43:13 [basic_suit_el7] Updated:
>>> 07:43:13 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64
>>> 0:4.2.0-1.a0.20170511git210c375.el7.centos
>>>
>>>
>>> while the latest version is indeed
>>> python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm
>>>
>>> Just for the record: latest and latest.under_test have correct version of
>>> the package, so it does not look to be a repoman bug.
>>>
>>> Checking OST sources now...
>>>
>>> [1]
>>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/6651/consoleFull
>>>
>>> On Fri, May 12, 2017 at 9:43 AM, Ondra Machacek 
>>> wrote:

 Hello Anton,

 So I've bumped the version, but it's still installing the old one.
 The bumped version:


 python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm

 Log from OST run:

 07:25:59 [upgrade-from-release_suit_el7]
 
 07:25:59 [upgrade-from-release_suit_el7]  Package  Arch
 VersionRepository Size
 07:25:59 [upgrade-from-release_suit_el7]
 
 07:25:59 [upgrade-from-release_suit_el7] Installing:
 07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4
 x86_64 4.2.0-1.a0.20170511git210c375.el7.centos
 07:25:59 [upgrade-from-release_suit_el7]
 ovirt-master-snapshot 446 k
 07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
 07:25:59 [upgrade-from-release_suit_el7]  python-enum34
 noarch 1.0.4-1.el7centos-base-el752 k
 07:25:59 [upgrade-from-release_suit_el7]
 07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
 07:25:59 [upgrade-from-release_suit_el7]
 


 On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
 wrote:
>
> Hello Ondra.
>
> Thanks.
>
> It seems that the manual job populates SDK from custom repo only for
> the VMs under test, but the mock where the python test code runs does not
> use it from there. So the release of bumped version will be good idea.
>
> Anton.
>
> On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
> wrote:
>>
>>
>>
>> On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov 
>> wrote:
>>>
>>> On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek 
>>> wrote:


> 15:50:44 [basic_suit_el7] Updated:
>
> 15:50:44 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64
> 0:4.2.0-1.a0.20170511git210c375.el7.centos


 This is incorrect version. The correct one is:


 python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.centos.x86_64.rpm

 From this build:


 http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_master_build-artifacts-el7-x86_64/71/
>>>
>>>
>>>
>>> Sounds like we have a problem if the versions differ only by their git
>>> hashes. They are not ordered.
>>>
>>> I suggest we just merge the version bump at
>>> https://gerrit.ovirt.org/#/c/76732/ and then see which version it will
>>> install.
>>>
>>> Any objections to that?
>>
>>
>> OK, I will do a proper release.
>>
>>>
>>>
>>> --
>>> Anton Marchukov
>>> Senior Software Engineer - RHEV CI - Red Hat
>>>
>>
>
>
>
> --
> Anton Marchukov
> Senior Software Engineer - RHEV CI - Red Hat
>

>>>
>>>
>>>
>>> --
>>> Anton 

Re: [ovirt-devel] Vdsm merge rights

2017-05-12 Thread Nir Soffer
+1

On Fri, May 12, 2017 at 12:59, Fabian Deutsch wrote:

> +1
>
> On Fri, May 12, 2017 at 11:25 AM, Edward Haas  wrote:
> > Good news! +2
> >
> > On Fri, May 12, 2017 at 11:27 AM, Piotr Kliczewski 
> > wrote:
> >>
> >> +1
> >>
> >> On Fri, May 12, 2017 at 9:14 AM, Dan Kenigsberg 
> wrote:
> >>>
> >>> I'd like to nominate Francesco to the vdsm-maintainers
> >>>
> >>>
> https://gerrit.ovirt.org/#/admin/groups/uuid-becbf722723417c336de6c1646749678acae8b89
> >>> list, so he can merge patches without waiting for Nir, Adam or me.
> >>>
> >>> I believe that he proved to be thorough and considerate (and paranoid)
> >>> as the job requires.
> >>>
> >>> Vdsm maintainers, please approve.
> >>>
> >>> Dan
> >>
> >>
> >
> >
> > ___
> > Devel mailing list
> > Devel@ovirt.org
> > http://lists.ovirt.org/mailman/listinfo/devel
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Vdsm merge rights

2017-05-12 Thread Fabian Deutsch
+1

On Fri, May 12, 2017 at 11:25 AM, Edward Haas  wrote:
> Good news! +2
>
> On Fri, May 12, 2017 at 11:27 AM, Piotr Kliczewski 
> wrote:
>>
>> +1
>>
>> On Fri, May 12, 2017 at 9:14 AM, Dan Kenigsberg  wrote:
>>>
>>> I'd like to nominate Francesco to the vdsm-maintainers
>>>
>>> https://gerrit.ovirt.org/#/admin/groups/uuid-becbf722723417c336de6c1646749678acae8b89
>>> list, so he can merge patches without waiting for Nir, Adam or me.
>>>
>>> I believe that he proved to be thorough and considerate (and paranoid)
>>> as the job requires.
>>>
>>> Vdsm maintainers, please approve.
>>>
>>> Dan
>>
>>
>
>
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Anton Marchukov
Hello Barak.

Yes. repoman pulls the latest version and that version is in latest and
latest.under_test on resources. Additionally it is proven by lago.log too.

The only problem seems to be the mock env that runs the python itself.

Anton.

On Fri, May 12, 2017 at 11:03 AM, Barak Korren  wrote:

> Anton, are you seeing repoman pull the right version in the lago logs? We
> need to know if it makes it into the Lago local repo or not.
>
> Barak Korren
> bkor...@redhat.com
> RHCE, RHCi, RHV-DevOps Team
> https://ifireball.wordpress.com/
>
> On 12 May 2017 at 11:13, "Anton Marchukov"  wrote:
>
>> Hello Ondra.
>>
>> Yes I see it installs the old version, e.g. the latest master run at [1]
>> installs:
>>
>> 07:43:13 [basic_suit_el7] Updated:
>> 07:43:13 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 0:4.2.0-1.a0.20170511git210c375.el7.centos
>>
>>
>> while the latest version is indeed  python-ovirt-engine-sdk4-4.2.
>> 0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm
>>
>> Just for the record: latest and latest.under_test have correct version of
>> the package, so it does not look to be a repoman bug.
>>
>> Checking OST sources now...
>>
>> [1] http://jenkins.ovirt.org/job/test-repo_ovirt_experimenta
>> l_master/6651/consoleFull
>>
>> On Fri, May 12, 2017 at 9:43 AM, Ondra Machacek 
>> wrote:
>>
>>> Hello Anton,
>>>
>>> So I've bumped the version, but it's still installing the old one.
>>> The bumped version:
>>>
>>>  python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.
>>> centos.x86_64.rpm
>>> 
>>>
>>> Log from OST run:
>>>
>>> 07:25:59 [upgrade-from-release_suit_el7]
>>> 07:25:59 [upgrade-from-release_suit_el7]  Package  Arch  Version  Repository  Size
>>> 07:25:59 [upgrade-from-release_suit_el7]
>>> 07:25:59 [upgrade-from-release_suit_el7] Installing:
>>> 07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4  x86_64  4.2.0-1.a0.20170511git210c375.el7.centos
>>> 07:25:59 [upgrade-from-release_suit_el7]      ovirt-master-snapshot  446 k
>>> 07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
>>> 07:25:59 [upgrade-from-release_suit_el7]  python-enum34  noarch  1.0.4-1.el7  centos-base-el7  52 k
>>> 07:25:59 [upgrade-from-release_suit_el7]
>>> 07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
>>> 07:25:59 [upgrade-from-release_suit_el7]
>>>
>>>
>>> On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
>>> wrote:
>>>
 Hello Ondra.

 Thanks.

 It seems that the manual job populates SDK from custom repo only for
 the VMs under test, but the mock where the python test code runs does not
 use it from there. So the release of bumped version will be good idea.

 Anton.

 On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
 wrote:

>
>
> On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov 
> wrote:
>
>> On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek 
>> wrote:
>>
>>>
>>> *15:50:44* [basic_suit_el7] Updated:

 *15:50:44* [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 
 0:4.2.0-1.a0.20170511git210c375.el7.centos


>>> This is incorrect version. The correct one is:
>>>
>>>  python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.
>>> centos.x86_64.rpm
>>> 
>>>
>>> From this build:
>>>
>>>  http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_maste
>>> r_build-artifacts-el7-x86_64/71/
>>>
>>
>>
>> Sounds like we have a problem if the versions differ only by their git
>> hashes. They are not ordered.
>>
>> I suggest we just merge the version bump at
>> https://gerrit.ovirt.org/#/c/76732/ and then see which version it
>> will install.
>>
>> Any objections to that?
>>
>
> OK, I will do a proper release.
>
>
>>
>> --
>> Anton Marchukov
>> Senior Software Engineer - RHEV CI - Red Hat
>>
>>
>


 --
 Anton Marchukov
 Senior Software Engineer - RHEV CI - Red Hat


>>>
>>
>>

Re: [ovirt-devel] Vdsm merge rights

2017-05-12 Thread Edward Haas
Good news! +2

On Fri, May 12, 2017 at 11:27 AM, Piotr Kliczewski 
wrote:

> +1
>
> On Fri, May 12, 2017 at 9:14 AM, Dan Kenigsberg  wrote:
>
>> I'd like to nominate Francesco to the vdsm-maintainers
>> https://gerrit.ovirt.org/#/admin/groups/uuid-becbf722723417c
>> 336de6c1646749678acae8b89
>> list, so he can merge patches without waiting for Nir, Adam or me.
>>
>> I believe that he proved to be thorough and considerate (and paranoid)
>> as the job requires.
>>
>> Vdsm maintainers, please approve.
>>
>> Dan
>>
>
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Barak Korren
Anton, are you seeing repoman pull the right version in the lago logs? We
need to know if it makes it into the Lago local repo or not.

Barak Korren
bkor...@redhat.com
RHCE, RHCi, RHV-DevOps Team
https://ifireball.wordpress.com/

On 12 May 2017 at 11:13, "Anton Marchukov"  wrote:

> Hello Ondra.
>
> Yes I see it installs the old version, e.g. the latest master run at [1]
> installs:
>
> 07:43:13 [basic_suit_el7] Updated:
> 07:43:13 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 0:4.2.0-1.a0.20170511git210c375.el7.centos
>
>
> while the latest version is indeed  python-ovirt-engine-sdk4-4.2.
> 0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm
>
> Just for the record: latest and latest.under_test have correct version of
> the package, so it does not look to be a repoman bug.
>
> Checking OST sources now...
>
> [1] http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/6651/
> consoleFull
>
> On Fri, May 12, 2017 at 9:43 AM, Ondra Machacek 
> wrote:
>
>> Hello Anton,
>>
>> So I've bumped the version, but it's still installing the old one.
>> The bumped version:
>>
>>  python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.
>> centos.x86_64.rpm
>> 
>>
>> Log from OST run:
>>
>> 07:25:59 [upgrade-from-release_suit_el7]
>> 07:25:59 [upgrade-from-release_suit_el7]  Package  Arch  Version  Repository  Size
>> 07:25:59 [upgrade-from-release_suit_el7]
>> 07:25:59 [upgrade-from-release_suit_el7] Installing:
>> 07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4  x86_64  4.2.0-1.a0.20170511git210c375.el7.centos
>> 07:25:59 [upgrade-from-release_suit_el7]      ovirt-master-snapshot  446 k
>> 07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
>> 07:25:59 [upgrade-from-release_suit_el7]  python-enum34  noarch  1.0.4-1.el7  centos-base-el7  52 k
>> 07:25:59 [upgrade-from-release_suit_el7]
>> 07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
>> 07:25:59 [upgrade-from-release_suit_el7]
>>
>>
>> On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
>> wrote:
>>
>>> Hello Ondra.
>>>
>>> Thanks.
>>>
>>> It seems that the manual job populates SDK from custom repo only for the
>>> VMs under test, but the mock where the python test code runs does not use
>>> it from there. So the release of bumped version will be good idea.
>>>
>>> Anton.
>>>
>>> On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
>>> wrote:
>>>


 On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov 
 wrote:

> On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek 
> wrote:
>
>>
>> *15:50:44* [basic_suit_el7] Updated:
>>>
>>> *15:50:44* [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 
>>> 0:4.2.0-1.a0.20170511git210c375.el7.centos
>>>
>>>
>> This is incorrect version. The correct one is:
>>
>>  python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.
>> centos.x86_64.rpm
>> 
>>
>> From this build:
>>
>>  http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_maste
>> r_build-artifacts-el7-x86_64/71/
>>
>
>
> Sounds like we have a problem if the versions differ only by their git
> hashes. They are not ordered.
>
> I suggest we just merge the version bump at
> https://gerrit.ovirt.org/#/c/76732/ and then see which version it
> will install.
>
> Any objections to that?
>

 OK, I will do a proper release.


>
> --
> Anton Marchukov
> Senior Software Engineer - RHEV CI - Red Hat
>
>

>>>
>>>
>>> --
>>> Anton Marchukov
>>> Senior Software Engineer - RHEV CI - Red Hat
>>>
>>>
>>
>
>
> --
> Anton Marchukov
> Senior Software Engineer - RHEV CI - Red Hat
>
>
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] DBUnit upgrade and format change

2017-05-12 Thread Allon Mureinik
Hi all,

In an attempt to make maintaining Engine's DAO tests a bit more manageable,
yesterday I've merged patches to upgrade the DBUnit version we're using to
the latest available version, 2.5.3 [1], and changed the format to flat-XML
[2], ending up with a fixtures.xml file which is almost 10K lines shorter.

What do you need to know about this change? Frankly, not much.

The format itself is pretty self-descriptive. To add a row to a table in
the test database, just add a row to fixtures.xml (see the example below).
Note that you can pretty much add such a row wherever you want in the file,
but in the interest of maintainability, please group rows from the same
tables together and leave an empty line between tables (as the file
currently is).
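For reference, a flat-XML row is just an element named after the table, with one attribute per column. A tiny illustrative snippet (the table and column names are invented, not taken from the real fixtures.xml):

    <?xml version="1.0" encoding="UTF-8"?>
    <dataset>
        <!-- one element per row: the element name is the table,
             each attribute is a column -->
        <some_table id="1" name="first row" enabled="true"/>
        <some_table id="2" name="second row" enabled="false"/>

        <other_table id="7" some_table_id="1"/>
    </dataset>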

Note also that DBUnit 2.5.3 doesn't seem to allow timezones, offsets or
whitespaces at the end of timestamp values. Since our tests don't really
depend on them, I just removed the handful of instances we had. If anyone
wants to take a deeper look, I'd gladly review it :-)


If this change causes trouble to anyone (it shouldn't - it was tested both
on my machine and in the CI, several times), please let me know.


You friendly neighborhood cleanup guy,
Allon

[1] https://gerrit.ovirt.org/#/c/76678/
[2] https://gerrit.ovirt.org/#/c/76679/
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Vdsm merge rights

2017-05-12 Thread Piotr Kliczewski
+1

On Fri, May 12, 2017 at 9:14 AM, Dan Kenigsberg  wrote:

> I'd like to nominate Francesco to the vdsm-maintainers
> https://gerrit.ovirt.org/#/admin/groups/uuid-
> becbf722723417c336de6c1646749678acae8b89
> list, so he can merge patches without waiting for Nir, Adam or me.
>
> I believe that he proved to be thorough and considerate (and paranoid)
> as the job requires.
>
> Vdsm maintainers, please approve.
>
> Dan
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Anton Marchukov
Hello Ondra.

Yes I see it installs the old version, e.g. the latest master run at [1]
installs:

07:43:13 [basic_suit_el7] Updated:
07:43:13 [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 0:4.2.0-1.a0.20170511git210c375.el7.centos


while the latest version is indeed
 python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm

Just for the record: latest and latest.under_test have correct version of
the package, so it does not look to be a repoman bug.

Checking OST sources now...

[1]
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/6651/consoleFull

On Fri, May 12, 2017 at 9:43 AM, Ondra Machacek  wrote:

> Hello Anton,
>
> So I've bumped the version, but it's still installing the old one.
> The bumped version:
>
>  python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.
> centos.x86_64.rpm
> 
>
> Log from OST run:
>
> 07:25:59 [upgrade-from-release_suit_el7]
> 07:25:59 [upgrade-from-release_suit_el7]  Package  Arch  Version  Repository  Size
> 07:25:59 [upgrade-from-release_suit_el7]
> 07:25:59 [upgrade-from-release_suit_el7] Installing:
> 07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4  x86_64  4.2.0-1.a0.20170511git210c375.el7.centos
> 07:25:59 [upgrade-from-release_suit_el7]      ovirt-master-snapshot  446 k
> 07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
> 07:25:59 [upgrade-from-release_suit_el7]  python-enum34  noarch  1.0.4-1.el7  centos-base-el7  52 k
> 07:25:59 [upgrade-from-release_suit_el7]
> 07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
> 07:25:59 [upgrade-from-release_suit_el7]
>
>
> On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
> wrote:
>
>> Hello Ondra.
>>
>> Thanks.
>>
>> It seems that the manual job populates SDK from custom repo only for the
>> VMs under test, but the mock where the python test code runs does not use
>> it from there. So the release of bumped version will be good idea.
>>
>> Anton.
>>
>> On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
>> wrote:
>>
>>>
>>>
>>> On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov 
>>> wrote:
>>>
 On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek 
 wrote:

>
> *15:50:44* [basic_suit_el7] Updated:
>>
>> *15:50:44* [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 
>> 0:4.2.0-1.a0.20170511git210c375.el7.centos
>>
>>
> This is incorrect version. The correct one is:
>
>  python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.
> centos.x86_64.rpm
> 
>
> From this build:
>
>  http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_maste
> r_build-artifacts-el7-x86_64/71/
>


 Sounds like we have a problem if the versions differ only by their git
 hashes. They are not ordered.

 I suggest we just merge the version bump at https://gerrit.ovirt.org/#/
 c/76732/ and then see which version it will install.

 Any objections to that?

>>>
>>> OK, I will do a proper release.
>>>
>>>

 --
 Anton Marchukov
 Senior Software Engineer - RHEV CI - Red Hat


>>>
>>
>>
>> --
>> Anton Marchukov
>> Senior Software Engineer - RHEV CI - Red Hat
>>
>>
>


-- 
Anton Marchukov
Senior Software Engineer - RHEV CI - Red Hat
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] OST 4.1 failure: Error: ('Error while sending HTTP request', error('cannot add/remove handle - multi_perform() already running', ))

2017-05-12 Thread Ondra Machacek
Hello Anton,

So I've bumped the version, but it's still installing the old one.
The bumped version:


python-ovirt-engine-sdk4-4.2.0-1.a1.20170512git7c40be2.el7.centos.x86_64.rpm


Log from OST run:

07:25:59 [upgrade-from-release_suit_el7]
07:25:59 [upgrade-from-release_suit_el7]  Package  Arch  Version  Repository  Size
07:25:59 [upgrade-from-release_suit_el7]
07:25:59 [upgrade-from-release_suit_el7] Installing:
07:25:59 [upgrade-from-release_suit_el7]  python-ovirt-engine-sdk4  x86_64  4.2.0-1.a0.20170511git210c375.el7.centos
07:25:59 [upgrade-from-release_suit_el7]      ovirt-master-snapshot  446 k
07:25:59 [upgrade-from-release_suit_el7] Installing for dependencies:
07:25:59 [upgrade-from-release_suit_el7]  python-enum34  noarch  1.0.4-1.el7  centos-base-el7  52 k
07:25:59 [upgrade-from-release_suit_el7]
07:25:59 [upgrade-from-release_suit_el7] Transaction Summary
07:25:59 [upgrade-from-release_suit_el7]


On Thu, May 11, 2017 at 8:35 PM, Anton Marchukov 
wrote:

> Hello Ondra.
>
> Thanks.
>
> It seems that the manual job populates SDK from custom repo only for the
> VMs under test, but the mock where the python test code runs does not use
> it from there. So the release of bumped version will be good idea.
>
> Anton.
>
> On Thu, May 11, 2017 at 8:20 PM, Ondra Machacek 
> wrote:
>
>>
>>
>> On Thu, May 11, 2017 at 8:11 PM, Anton Marchukov 
>> wrote:
>>
>>> On Thu, May 11, 2017 at 8:03 PM, Ondra Machacek 
>>> wrote:
>>>

 *15:50:44* [basic_suit_el7] Updated:
>
> *15:50:44* [basic_suit_el7]   python-ovirt-engine-sdk4.x86_64 
> 0:4.2.0-1.a0.20170511git210c375.el7.centos
>
>
 This is incorrect version. The correct one is:

  python-ovirt-engine-sdk4-4.2.0-1.a0.20170511gitcd0adb4.el7.
 centos.x86_64.rpm
 

 From this build:

  http://jenkins.ovirt.org/job/python-ovirt-engine-sdk4_maste
 r_build-artifacts-el7-x86_64/71/

>>>
>>>
>>> Sounds like we have a problem if the versions differ only by their git
>>> hashes. They are not ordered.
>>>
>>> I suggest we just merge the version bump at https://gerrit.ovirt.org/#/
>>> c/76732/ and then see which version it will install.
>>>
>>> Any objections to that?
>>>
>>
>> OK, I will do a proper release.
>>
>>
>>>
>>> --
>>> Anton Marchukov
>>> Senior Software Engineer - RHEV CI - Red Hat
>>>
>>>
>>
>
>
> --
> Anton Marchukov
> Senior Software Engineer - RHEV CI - Red Hat
>
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] [ OST Failure Report ] [ oVirt master][12-05-2017][ 002_bootstrap.add_hosts ]

2017-05-12 Thread Sandro Bonazzola
On Fri, May 12, 2017 at 9:22 AM, Eyal Edri  wrote:

> It's a known issue with SDK, Anton is working on it with Ondra.
> See the other thread on devel with details.
>

Thanks. I've verified that 4.1.2 RC2, to be released today, is not affected.



>
> On May 12, 2017 9:10 AM, "Sandro Bonazzola"  wrote:
>
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.
>> 1/1440/testReport/(root)/002_bootstrap/add_hosts/
>>
>> ('Error while sending HTTP request', error('cannot add/remove handle - 
>> multi_perform() already running',))
>>  >> begin captured logging << 
>> lago.utils: ERROR: Error while running thread
>> Traceback (most recent call last):
>>   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 58, in 
>> _ret_via_queue
>> queue.put({'return': func()})
>>   File 
>> "/home/jenkins/workspace/test-repo_ovirt_experimental_4.1/ovirt-system-tests/basic-suite-4.1/test-scenarios/002_bootstrap.py",
>>  line 320, in _add_host_4
>> name=CLUSTER_NAME,
>>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 
>> 8726, in add
>> return self._internal_add(host, headers, query, wait)
>>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, 
>> in _internal_add
>> context = self._connection.send(request)
>>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 300, 
>> in send
>> sys.exc_info()[2]
>>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 295, 
>> in send
>> return self.__send(request)
>>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 413, 
>> in __send
>> self._multi.add_handle(curl)
>> Error: ('Error while sending HTTP request', error('cannot add/remove handle 
>> - multi_perform() already running',))
>> - >> end captured logging << -
>>
>>
>> Looks like it started failing on latest ovirt-host-deploy build:
>> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1431/
>> Sadly I can't find ovirt-host-deploy logs within the archived artifacts.
>>
>> Simone, Didi, can you please have a look?
>>
>> Ondra, can you also check? I'm not sure about the meaning of the error
>> reported by the SDK.
>>
>> --
>>
>> SANDRO BONAZZOLA
>>
>> ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D
>>
>> Red Hat EMEA 
>> 
>> TRIED. TESTED. TRUSTED. 
>>
>> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>


-- 

SANDRO BONAZZOLA

ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D

Red Hat EMEA 

TRIED. TESTED. TRUSTED. 
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] [ OST Failure Report ] [ oVirt master][12-05-2017][ 002_bootstrap.add_hosts ]

2017-05-12 Thread Eyal Edri
It's a known issue with SDK, Anton is working on it with Ondra.
See the other thread on devel with details.

On May 12, 2017 9:10 AM, "Sandro Bonazzola"  wrote:

> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_
> 4.1/1440/testReport/(root)/002_bootstrap/add_hosts/
>
> ('Error while sending HTTP request', error('cannot add/remove handle - 
> multi_perform() already running',))
>  >> begin captured logging << 
> lago.utils: ERROR: Error while running thread
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 58, in 
> _ret_via_queue
> queue.put({'return': func()})
>   File 
> "/home/jenkins/workspace/test-repo_ovirt_experimental_4.1/ovirt-system-tests/basic-suite-4.1/test-scenarios/002_bootstrap.py",
>  line 320, in _add_host_4
> name=CLUSTER_NAME,
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 8726, 
> in add
> return self._internal_add(host, headers, query, wait)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 211, 
> in _internal_add
> context = self._connection.send(request)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 300, 
> in send
> sys.exc_info()[2]
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 295, 
> in send
> return self.__send(request)
>   File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py", line 413, 
> in __send
> self._multi.add_handle(curl)
> Error: ('Error while sending HTTP request', error('cannot add/remove handle - 
> multi_perform() already running',))
> - >> end captured logging << -
>
>
> Looks like it started failing on latest ovirt-host-deploy build:
> http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1431/
> Sadly I can't find ovirt-host-deploy logs within the archived artifacts.
>
> Simone, Didi, can you please have a look?
>
> Ondra, can you also check? I'm not sure about the meaning of the error
> reported by the SDK.
>
> --
>
> SANDRO BONAZZOLA
>
> ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D
>
> Red Hat EMEA 
> 
> TRIED. TESTED. TRUSTED. 
>
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Vdsm merge rights

2017-05-12 Thread Dan Kenigsberg
I'd like to nominate Francesco to the vdsm-maintainers
https://gerrit.ovirt.org/#/admin/groups/uuid-becbf722723417c336de6c1646749678acae8b89
list, so he can merge patches without waiting for Nir, Adam or me.

I believe that he proved to be thorough and considerate (and paranoid)
as the job requires.

Vdsm maintainers, please approve.

Dan
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel


[ovirt-devel] [ OST Failure Report ] [ oVirt master][12-05-2017][ 002_bootstrap.add_hosts ]

2017-05-12 Thread Sandro Bonazzola
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1440/testReport/(root)/002_bootstrap/add_hosts/

('Error while sending HTTP request', error('cannot add/remove handle -
multi_perform() already running',))
 >> begin captured logging << 
lago.utils: ERROR: Error while running thread
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/lago/utils.py", line 58, in
_ret_via_queue
queue.put({'return': func()})
  File 
"/home/jenkins/workspace/test-repo_ovirt_experimental_4.1/ovirt-system-tests/basic-suite-4.1/test-scenarios/002_bootstrap.py",
line 320, in _add_host_4
name=CLUSTER_NAME,
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py",
line 8726, in add
return self._internal_add(host, headers, query, wait)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line
211, in _internal_add
context = self._connection.send(request)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py",
line 300, in send
sys.exc_info()[2]
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py",
line 295, in send
return self.__send(request)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/__init__.py",
line 413, in __send
self._multi.add_handle(curl)
Error: ('Error while sending HTTP request', error('cannot add/remove
handle - multi_perform() already running',))
- >> end captured logging << -


Looks like it started failing on latest ovirt-host-deploy build:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1431/
Sadly I can't find ovirt-host-deploy logs within the archived artifacts.

Simone, Didi, can you please have a look?

Ondra, can you also check? I'm not sure about the meaning of the error
reported by the SDK.

-- 

SANDRO BONAZZOLA

ASSOCIATE MANAGER, SOFTWARE ENGINEERING, EMEA ENG VIRTUALIZATION R&D

Red Hat EMEA 

TRIED. TESTED. TRUSTED. 
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel