Re: [VDSM] vdsm_master_verify-error-codes_created running on master - why?

2016-07-19 Thread Dan Kenigsberg
On Wed, Jul 20, 2016 at 01:51:51AM +0300, Nir Soffer wrote:
> On Tue, Jul 19, 2016 at 6:20 PM, Eyal Edri  wrote:
> > And also, feel free to move it to check-patch.sh code as well.
> 
> Including this in vdsm seem like the best option.
> 
> Can you point me to the source of this job?
> 
> >
> > On Tue, Jul 19, 2016 at 6:19 PM, Eyal Edri  wrote:
> >>
> >> This isn't new, it was running for a few years, just on old jenkins,
> >> Maybe you just noticed it.
> >>
> >> Allon & Dan are familiar with that job and it already found in the past
> >> real issues.
> >> If you want to remove/disable it, I have no problem - just make sure
> >> you're synced with all VDSM people that requested this job in the first
> >> place.
> >>
> >> On Tue, Jul 19, 2016 at 6:02 PM, Nir Soffer  wrote:
> >>>
> >>> Hi all,
> >>>
> >>> Since yesterday, vdsm_master_verify-error-codes_created job is running
> >>> on master.
> >>>
> >>> I guess that this was a unintended change in jenkins - please revert this
> >>> change.
> >>>
> >>> If someone want to add a job for vdsm master, it must be approved by
> >>> vdsm maintainers first.
> >>>
> >>> The best would be to run everything from the automation scripts, so
> >>> vdsm maintainers have full control on the way patches are checked.

A bit of a background: this job was created many many years ago, in
order to compare the set of error codes in Vdsm to that of Engine. The
motivation was to catch typos or other mismatches, where Vdsm is sending
one value and Engine is expecting another, or Vdsm dropping something
that Engine depends on.
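
The idea can be sketched in a few lines of shell. The file contents below are invented stand-ins; the real job extracts the error-code tables from the Vdsm and Engine source trees:

```shell
# Invented stand-in code lists; the real job parses these out of the
# Vdsm and Engine sources.
printf '%s\n' 100 101 102 | sort > vdsm_codes.sorted
printf '%s\n' 100 102 103 | sort > engine_codes.sorted

# Codes Vdsm may send that Engine does not know about:
comm -23 vdsm_codes.sorted engine_codes.sorted    # prints 101
# Codes Engine expects that Vdsm never sends:
comm -13 vdsm_codes.sorted engine_codes.sorted    # prints 103
```

Any output in either direction is exactly the kind of mismatch the job was meant to flag.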

HOWEVER, I'm not sure at all that the job's code is up-to-date. I wonder
how it could have ever survived the big changes of
https://gerrit.ovirt.org/#/c/48871/ and its bash code
http://jenkins.ovirt.org/job/vdsm_master_verify-error-codes_merged/configure
does not reassure me.
___
Infra mailing list
Infra@ovirt.org
http://lists.ovirt.org/mailman/listinfo/infra


Re: [VDSM] vdsm_master_verify-error-codes_created running on master - why?

2016-07-19 Thread Eyal Edri
It wasn't YAMLized until recently, so you can see the open patch for it at
[1].
I guess you can check out the patch in the jenkins repo and take the script
from there.
If you're moving it to the vdsm repo I'll abandon this patch, so please let
me know.

[1] https://gerrit.ovirt.org/#/c/60630/2

On Wed, Jul 20, 2016 at 1:51 AM, Nir Soffer  wrote:

> On Tue, Jul 19, 2016 at 6:20 PM, Eyal Edri  wrote:
> > And also, feel free to move it to check-patch.sh code as well.
>
> Including this in vdsm seem like the best option.
>
> Can you point me to the source of this job?
>
> >
> > On Tue, Jul 19, 2016 at 6:19 PM, Eyal Edri  wrote:
> >>
> >> This isn't new, it was running for a few years, just on old jenkins,
> >> Maybe you just noticed it.
> >>
> >> Allon & Dan are familiar with that job and it already found in the past
> >> real issues.
> >> If you want to remove/disable it, I have no problem - just make sure
> >> you're synced with all VDSM people that requested this job in the first
> >> place.
> >>
> >> On Tue, Jul 19, 2016 at 6:02 PM, Nir Soffer  wrote:
> >>>
> >>> Hi all,
> >>>
> >>> Since yesterday, vdsm_master_verify-error-codes_created job is running
> >>> on master.
> >>>
> >>> I guess that this was a unintended change in jenkins - please revert
> this
> >>> change.
> >>>
> >>> If someone want to add a job for vdsm master, it must be approved by
> >>> vdsm maintainers first.
> >>>
> >>> The best would be to run everything from the automation scripts, so
> >>> vdsm maintainers have full control on the way patches are checked.
> >>>
> >>> Thanks,
> >>> Nir
> >>> ___
> >>> Infra mailing list
> >>> Infra@ovirt.org
> >>> http://lists.ovirt.org/mailman/listinfo/infra
> >>>
> >>>
> >>
> >>
> >>
> >> --
> >> Eyal Edri
> >> Associate Manager
> >> RHEV DevOps
> >> EMEA ENG Virtualization R&D
> >> Red Hat Israel
> >>
> >> phone: +972-9-7692018
> >> irc: eedri (on #tlv #rhev-dev #rhev-integ)
> >
> >
> >
> >
> > --
> > Eyal Edri
> > Associate Manager
> > RHEV DevOps
> > EMEA ENG Virtualization R&D
> > Red Hat Israel
> >
> > phone: +972-9-7692018
> > irc: eedri (on #tlv #rhev-dev #rhev-integ)
>



-- 
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)


[JIRA] (OVIRT-644) failed to write on /mnt/ramdisk/jenkins/workspace

2016-07-19 Thread eyal edri [Administrator] (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-644?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

eyal edri [Administrator] updated OVIRT-644:

Resolution: Fixed
Status: Done  (was: To Do)

ovirt-srv26.phx.ovirt.org had a ramdrive defined but no cleaner.
I removed the ramdrive and returned the jenkins homedir to /.
There is a separate ticket about enabling RAM drives for physical slaves in
a more stable and efficient way.

Slave is back online, closing.

> failed to write on /mnt/ramdisk/jenkins/workspace
> -
>
> Key: OVIRT-644
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-644
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: sbonazzo
>Assignee: infra
>
> *From job: 
> **http://jenkins.ovirt.org/job/ovirt-engine_4.0_build-artifacts-el7-x86_64/236/console
> *
> *00:00:00.565* Building remotely on ovirt-srv26.phx.ovirt.org
> 
> (physical integ-tests) in workspace
> /mnt/ramdisk/jenkins/workspace/ovirt-engine_4.0_build-artifacts-el7-x86_64*00:00:00.585*
>  > git rev-parse --is-inside-work-tree # timeout=10*00:00:00.590*
> Fetching changes from the remote Git repository*00:00:00.593*  > git
> config remote.origin.url git://gerrit.ovirt.org/ovirt-engine.git #
> timeout=10*00:00:00.598* ERROR: Error fetching remote repo
> 'origin'*00:00:00.598* hudson.plugins.git.GitException
> :
> Failed to fetch from
> git://gerrit.ovirt.org/ovirt-engine.git*00:00:00.599* at
> hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:766)
> *00:00:00.599*
>   at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1022)
> *00:00:00.599*
>   at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1053)
> *00:00:00.599*
>   at 
> org.jenkinsci.plugins.multiplescms.MultiSCM.checkout(MultiSCM.java:129)
> *00:00:00.599*
>   at hudson.scm.SCM.checkout(SCM.java:485)
> *00:00:00.599*
>   at hudson.model.AbstractProject.checkout(AbstractProject.java:1269)
> *00:00:00.600*
>   at 
> hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:607)
> *00:00:00.600*
>   at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
> *00:00:00.600*
>   at 
> hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
> *00:00:00.600*
>   at hudson.model.Run.execute(Run.java:1738)
> *00:00:00.600*
>   at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
> *00:00:00.600*
>   at hudson.model.ResourceController.execute(ResourceController.java:98)
> *00:00:00.600*
>   at hudson.model.Executor.run(Executor.java:410)
> *00:00:00.601*
> Caused by: hudson.plugins.git.GitException
> :
> Command "git config remote.origin.url
> git://gerrit.ovirt.org/ovirt-engine.git" returned status code
> 4:*00:00:00.601* stdout: *00:00:00.601* stderr: error: failed to write
> new configuration file
> /mnt/ramdisk/jenkins/workspace/ovirt-engine_4.0_build-artifacts-el7-x86_64/ovirt-engine/.git/config.lock
> -- 
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community collaboration.
> See how it works at redhat.com



--
This message was sent by Atlassian JIRA
(v1000.148.3#15)

[JIRA] (OVIRT-645) proxy server not reliable

2016-07-19 Thread eyal edri [Administrator] (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

eyal edri [Administrator] reassigned OVIRT-645:
---

Assignee: Evgheni Dereveanchin  (was: infra)

Evgheni,
Can you investigate why these errors happen?

> proxy server not reliable
> -
>
> Key: OVIRT-645
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-645
> Project: oVirt - virtualization made easy
>  Issue Type: By-EMAIL
>Reporter: sbonazzo
>Assignee: Evgheni Dereveanchin
>
> job:
> http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-fc24-x86_64/8/
> DEBUG util.py:421:
> http://proxy.phx.ovirt.org:5000/centos-updates/7/x86_64/repodata/repomd.xml:
> [Errno 14] HTTP Error 500 - Internal Server Error
> it's happening quite often in several jobs. Can we make the proxy more
> reliable?
> -- 
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community collaboration.
> See how it works at redhat.com





[JIRA] (OVIRT-641) Build oVirt 4.0.2 RC1

2016-07-19 Thread sbonazzo (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-641?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

sbonazzo updated OVIRT-641:
---
Assignee: sbonazzo  (was: infra)
  Status: In Progress  (was: To Do)

> Build oVirt 4.0.2 RC1
> -
>
> Key: OVIRT-641
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-641
> Project: oVirt - virtualization made easy
>  Issue Type: Task
>  Components: Repositories Mgmt
>Reporter: sbonazzo
>Assignee: sbonazzo
>Priority: Highest
>






[JIRA] (OVIRT-645) proxy server not reliable

2016-07-19 Thread sbonazzo (oVirt JIRA)
sbonazzo created OVIRT-645:
--

 Summary: proxy server not reliable
 Key: OVIRT-645
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-645
 Project: oVirt - virtualization made easy
  Issue Type: By-EMAIL
Reporter: sbonazzo
Assignee: infra


job:
http://jenkins.ovirt.org/job/ovirt-node-ng_master_build-artifacts-fc24-x86_64/8/


DEBUG util.py:421:
http://proxy.phx.ovirt.org:5000/centos-updates/7/x86_64/repodata/repomd.xml:
[Errno 14] HTTP Error 500 - Internal Server Error

it's happening quite often in several jobs. Can we make the proxy more
reliable?
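
Until the proxy itself is fixed, one client-side mitigation might be to raise the download retry count in the yum/mock configuration the jobs use. The snippet below is illustrative guesswork, not the jobs' actual settings, and whether repodata fetches honor these options depends on the yum/urlgrabber version in use:

```
# Illustrative yum.conf fragment (guessed values, not the jobs' real config):
[main]
retries=10
timeout=60
```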


-- 
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com





[JIRA] (OVIRT-644) failed to write on /mnt/ramdisk/jenkins/workspace

2016-07-19 Thread sbonazzo (oVirt JIRA)
sbonazzo created OVIRT-644:
--

 Summary: failed to write on /mnt/ramdisk/jenkins/workspace
 Key: OVIRT-644
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-644
 Project: oVirt - virtualization made easy
  Issue Type: By-EMAIL
Reporter: sbonazzo
Assignee: infra


*From job: 
**http://jenkins.ovirt.org/job/ovirt-engine_4.0_build-artifacts-el7-x86_64/236/console
*



*00:00:00.565* Building remotely on ovirt-srv26.phx.ovirt.org

(physical integ-tests) in workspace
/mnt/ramdisk/jenkins/workspace/ovirt-engine_4.0_build-artifacts-el7-x86_64*00:00:00.585*
 > git rev-parse --is-inside-work-tree # timeout=10*00:00:00.590*
Fetching changes from the remote Git repository*00:00:00.593*  > git
config remote.origin.url git://gerrit.ovirt.org/ovirt-engine.git #
timeout=10*00:00:00.598* ERROR: Error fetching remote repo
'origin'*00:00:00.598* hudson.plugins.git.GitException
:
Failed to fetch from
git://gerrit.ovirt.org/ovirt-engine.git*00:00:00.599*   at
hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:766)
*00:00:00.599*
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:1022)
*00:00:00.599*
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1053)
*00:00:00.599*
at 
org.jenkinsci.plugins.multiplescms.MultiSCM.checkout(MultiSCM.java:129)
*00:00:00.599*
at hudson.scm.SCM.checkout(SCM.java:485)
*00:00:00.599*
at hudson.model.AbstractProject.checkout(AbstractProject.java:1269)
*00:00:00.600*
at 
hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:607)
*00:00:00.600*
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
*00:00:00.600*
at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:529)
*00:00:00.600*
at hudson.model.Run.execute(Run.java:1738)
*00:00:00.600*
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
*00:00:00.600*
at hudson.model.ResourceController.execute(ResourceController.java:98)
*00:00:00.600*
at hudson.model.Executor.run(Executor.java:410)
*00:00:00.601*
Caused by: hudson.plugins.git.GitException
:
Command "git config remote.origin.url
git://gerrit.ovirt.org/ovirt-engine.git" returned status code
4:*00:00:00.601* stdout: *00:00:00.601* stderr: error: failed to write
new configuration file
/mnt/ramdisk/jenkins/workspace/ovirt-engine_4.0_build-artifacts-el7-x86_64/ovirt-engine/.git/config.lock


-- 
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com





[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_created - Build # 19 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_created/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_created/19/
Build Number: 19
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60822

-
Changes Since Last Success:
-
Changes for Build #19
[Martin Betak] frontend: Typesafe AsyncQuery & Converter




-
Failed Tests:
-
No tests ran. 



[oVirt Jenkins] ovirt-engine_master_upgrade-from-master_el7_created - Build # 19 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_created/
 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_created/19/
Build Number: 19
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60822

-
Changes Since Last Success:
-
Changes for Build #19
[Martin Betak] frontend: Typesafe AsyncQuery & Converter




-
Failed Tests:
-
No tests ran. 



Build failed in Jenkins: ovirt_4.0_he-system-tests #37

2016-07-19 Thread jenkins
See 

Changes:

[Lev Veyde] ovirt-system-tests: Add automation for he_iscsi_basic_suite_4.0

--
[...truncated 681 lines...]
  |-  
  |-  
  |-   --== MISC CONFIGURATION ==--
  |-  
  |-   Please choose Data Warehouse sampling scale:
  |-   (1) Basic
  |-   (2) Full
  |-   (1, 2)[1]: 
  |-  
  |-   --== END OF CONFIGURATION ==--
  |-  
  |- [ INFO  ] Stage: Setup validation
  |- [WARNING] Cannot validate host name settings, reason: cannot 
resolve own name 'hosted-engine'
  |- [WARNING] Warning: Not enough memory is available on the host. 
Minimum requirement is 4096MB, and 16384MB is recommended.
  |-  
  |-   --== CONFIGURATION PREVIEW ==--
  |-  
  |-   Application mode: both
  |-   Default SAN wipe after delete   : False
  |-   Firewall manager: firewalld
  |-   Update Firewall : True
  |-   Host FQDN   : 
hosted-engine.lago.local
  |-   Engine database secured connection  : False
  |-   Engine database host: localhost
  |-   Engine database user name   : engine
  |-   Engine database name: engine
  |-   Engine database port: 5432
  |-   Engine database host name validation: False
  |-   DWH database secured connection : False
  |-   DWH database host   : localhost
  |-   DWH database user name  : 
ovirt_engine_history
  |-   DWH database name   : 
ovirt_engine_history
  |-   DWH database port   : 5432
  |-   DWH database host name validation   : False
  |-   Engine installation : True
  |-   PKI organization: lago.local
  |-   Configure local Engine database : True
  |-   Set application as default page : True
  |-   Configure Apache SSL: True
  |-   DWH installation: True
  |-   Configure local DWH database: True
  |-   Engine Host FQDN: 
hosted-engine.lago.local
  |-   Configure VMConsole Proxy   : True
  |-   Configure WebSocket Proxy   : True
  |- [ INFO  ] Stage: Transaction setup
  |- [ INFO  ] Stopping engine service
  |- [ INFO  ] Stopping ovirt-fence-kdump-listener service
  |- [ INFO  ] Stopping dwh service
  |- [ INFO  ] Stopping websocket-proxy service
  |- [ INFO  ] Stage: Misc configuration
  |- [ INFO  ] Stage: Package installation
  |- [ INFO  ] Stage: Misc configuration
  |- [ INFO  ] Upgrading CA
  |- [ INFO  ] Initializing PostgreSQL
  |- [ INFO  ] Creating PostgreSQL 'engine' database
  |- [ INFO  ] Configuring PostgreSQL
  |- [ INFO  ] Creating PostgreSQL 'ovirt_engine_history' database
  |- [ INFO  ] Configuring PostgreSQL
  |- [ INFO  ] Creating CA
  |- [ INFO  ] Creating/refreshing Engine database schema
Build timed out (after 360 minutes). Marking the build as failed.
Build was aborted
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : #!/bin/bash -xe
echo 'shell_scripts/system_tests.collect_logs.sh'

#
# Required jjb vars:
#version
#
VERSION=4.0
SUITE_TYPE=

WORKSPACE="$PWD"
OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

rm -rf "$WORKSPACE/exported-artifacts"
mkdir -p "$WORKSPACE/exported-artifacts"

if [[ -d "$TESTS_LOGS" ]]; then
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
fi

[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson301786022997325798.sh
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0
+ SUITE_TYPE=
+ WORKSPACE=
+ OVIRT_SUITE=4.0
+ 
TESTS_LOGS=
+ rm -rf 

+ mkdir -p 
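
Incidentally, the console output above shows "OVIRT_SUITE=4.0" rather than a "..._suite_4.0" value: in "$SUITE_TYPE_suite_$VERSION" bash expands the undefined variable $SUITE_TYPE_suite_, not $SUITE_TYPE (and SUITE_TYPE is itself left empty in this run). Braces make the intent explicit:

```shell
# Bash parses "$SUITE_TYPE_suite_$VERSION" as ${SUITE_TYPE_suite_}${VERSION};
# SUITE_TYPE_suite_ is undefined, so only $VERSION survives.
SUITE_TYPE=he
VERSION=4.0
broken="$SUITE_TYPE_suite_$VERSION"
fixed="${SUITE_TYPE}_suite_${VERSION}"
echo "$broken"    # -> 4.0
echo "$fixed"     # -> he_suite_4.0
```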


RHEVM CI Jenkins daily report - 19

2016-07-19 Thread jenkins
Good morning!

Attached is the HTML page with the jenkins status report. You can see it also 
here:
 - 
http://jenkins.ovirt.org/job/system_jenkins-report/19//artifact/exported-artifacts/upstream_report.html

Cheers,
Jenkins


upstream_report.html
Description: Binary data


Re: [VDSM] vdsm_master_verify-error-codes_created running on master - why?

2016-07-19 Thread Nir Soffer
On Tue, Jul 19, 2016 at 6:20 PM, Eyal Edri  wrote:
> And also, feel free to move it to check-patch.sh code as well.

Including this in vdsm seems like the best option.

Can you point me to the source of this job?

>
> On Tue, Jul 19, 2016 at 6:19 PM, Eyal Edri  wrote:
>>
>> This isn't new, it was running for a few years, just on old jenkins,
>> Maybe you just noticed it.
>>
>> Allon & Dan are familiar with that job and it already found in the past
>> real issues.
>> If you want to remove/disable it, I have no problem - just make sure
>> you're synced with all VDSM people that requested this job in the first
>> place.
>>
>> On Tue, Jul 19, 2016 at 6:02 PM, Nir Soffer  wrote:
>>>
>>> Hi all,
>>>
>>> Since yesterday, vdsm_master_verify-error-codes_created job is running
>>> on master.
>>>
>>> I guess that this was a unintended change in jenkins - please revert this
>>> change.
>>>
>>> If someone want to add a job for vdsm master, it must be approved by
>>> vdsm maintainers first.
>>>
>>> The best would be to run everything from the automation scripts, so
>>> vdsm maintainers have full control on the way patches are checked.
>>>
>>> Thanks,
>>> Nir
>>> ___
>>> Infra mailing list
>>> Infra@ovirt.org
>>> http://lists.ovirt.org/mailman/listinfo/infra
>>>
>>>
>>
>>
>>
>> --
>> Eyal Edri
>> Associate Manager
>> RHEV DevOps
>> EMEA ENG Virtualization R&D
>> Red Hat Israel
>>
>> phone: +972-9-7692018
>> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>
>
>
>
> --
> Eyal Edri
> Associate Manager
> RHEV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
>
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)


Re: Vdsm 4.0 fc23 build fails with "nothing provides ovirt-imageio-common"

2016-07-19 Thread Nir Soffer
More info: this is a random failure; other patches in the same topic are fine.

So it seems that some slaves have the wrong repositories - maybe a cache issue?

On Tue, Jul 19, 2016 at 7:09 PM, Nir Soffer  wrote:
> Hi all,
>
> Seems that builds on 4.0 are failing now with:
> 15:19:02 Error: nothing provides ovirt-imageio-common needed by
> vdsm-4.18.6-13.git3aaee18.fc23.x86_64.
>
> See http://jenkins.ovirt.org/job/vdsm_4.0_check-patch-fc23-x86_64/27/console
>
> ovirt-imageio-* packages are built in jenkins, and provided in ovirt
> repositories.
>
> Can someone take a look?
>
> Nir


Vdsm 4.0 fc23 build fails with "nothing provides ovirt-imageio-common"

2016-07-19 Thread Nir Soffer
Hi all,

Seems that builds on 4.0 are failing now with:
15:19:02 Error: nothing provides ovirt-imageio-common needed by
vdsm-4.18.6-13.git3aaee18.fc23.x86_64.

See http://jenkins.ovirt.org/job/vdsm_4.0_check-patch-fc23-x86_64/27/console

ovirt-imageio-* packages are built in jenkins, and provided in ovirt
repositories.

Can someone take a look?

Nir


Re: [VDSM] vdsm_master_verify-error-codes_created running on master - why?

2016-07-19 Thread Eyal Edri
And also, feel free to move it to check-patch.sh code as well.

On Tue, Jul 19, 2016 at 6:19 PM, Eyal Edri  wrote:

> This isn't new, it was running for a few years, just on old jenkins,
> Maybe you just noticed it.
>
> Allon & Dan are familiar with that job and it already found in the past
> real issues.
> If you want to remove/disable it, I have no problem - just make sure
> you're synced with all VDSM people that requested this job in the first
> place.
>
> On Tue, Jul 19, 2016 at 6:02 PM, Nir Soffer  wrote:
>
>> Hi all,
>>
>> Since yesterday, vdsm_master_verify-error-codes_created job is running
>> on master.
>>
>> I guess that this was a unintended change in jenkins - please revert this
>> change.
>>
>> If someone want to add a job for vdsm master, it must be approved by
>> vdsm maintainers first.
>>
>> The best would be to run everything from the automation scripts, so
>> vdsm maintainers have full control on the way patches are checked.
>>
>> Thanks,
>> Nir
>> ___
>> Infra mailing list
>> Infra@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/infra
>>
>>
>>
>
>
> --
> Eyal Edri
> Associate Manager
> RHEV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
>
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>



-- 
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)


Re: [VDSM] vdsm_master_verify-error-codes_created running on master - why?

2016-07-19 Thread Eyal Edri
This isn't new; it has been running for a few years, just on the old
jenkins. Maybe you just noticed it.

Allon & Dan are familiar with that job, and it has already found real
issues in the past.
If you want to remove/disable it, I have no problem - just make sure you're
synced with all the VDSM people that requested this job in the first place.

On Tue, Jul 19, 2016 at 6:02 PM, Nir Soffer  wrote:

> Hi all,
>
> Since yesterday, vdsm_master_verify-error-codes_created job is running
> on master.
>
> I guess that this was a unintended change in jenkins - please revert this
> change.
>
> If someone want to add a job for vdsm master, it must be approved by
> vdsm maintainers first.
>
> The best would be to run everything from the automation scripts, so
> vdsm maintainers have full control on the way patches are checked.
>
> Thanks,
> Nir
> ___
> Infra mailing list
> Infra@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/infra
>
>
>


-- 
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)


[JIRA] (OVIRT-636) Add disks in memory for lago slaves

2016-07-19 Thread Barak Korren (oVirt JIRA)

[ 
https://ovirt-jira.atlassian.net/browse/OVIRT-636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18806#comment-18806
 ] 

Barak Korren commented on OVIRT-636:


[~ederevea] AFAIK it will just empty the cache. 

Please take a look at '/etc/sysconfig/readonly-root' and '/etc/rwtab'. They 
seem to do exactly what we need without having to add our own unit files. They 
also add the interesting possibility to move things from disk to RAM on boot. 
While this is not mandatory, it will be useful for saving cache warm-up times.
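
For illustration, entries covering the directories from this ticket might look like the following (a sketch only, with paths taken from the ticket description, not a tested configuration):

```
# Hypothetical /etc/rwtab.d/jenkins-slave: readonly-root recreates these
# trees on tmpfs at boot. "empty" starts blank; "files" pre-populates the
# RAM copy from the on-disk contents.
empty   /var/lib/mock
empty   /var/lib/lago
files   /home/jenkins
```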

> Add disks in memory for lago slaves 
> 
>
> Key: OVIRT-636
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-636
> Project: oVirt - virtualization made easy
>  Issue Type: Bug
>Reporter: eyal edri [Administrator]
>Assignee: infra
>
> We need to try moving the Lago slaves in CI to work with memory instead of 
> local disks,
> It might speed up the tests significantly.
> We have 2 choices:
> 1. add code to the job and mount the dir on /dev/shm/
> 2. create zram drive and mount the following dirs on it:
> # /var/lib/mock
> # /var/lib/lago
> # /home/jenkins
> Each host should have enough memory to run current tests on them,
> however we'll need to make sure to clean that drive after each run.
> Lets try it on one of the slaves to see what is the best solution before 
> implementing for all slaves.





[JIRA] (OVIRT-636) Add disks in memory for lago slaves

2016-07-19 Thread Evgheni Dereveanchin (oVirt JIRA)

[ 
https://ovirt-jira.atlassian.net/browse/OVIRT-636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18805#comment-18805
 ] 

Evgheni Dereveanchin commented on OVIRT-636:


We can base this on the unit files created by Anton to partition and mount
local disks on slave VMs:
https://gerrit.ovirt.org/#/c/60036/

[~bkor...@redhat.com] - what happens if /var/lib/lago is empty after a host
reboot? Will this just empty the cache, or may it negatively affect the
slave? I don't think bare-metal slaves will reboot that often, so I'm
wondering whether we actually need to sync this directory to disk.

> Add disks in memory for lago slaves 
> 
>
> Key: OVIRT-636
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-636
> Project: oVirt - virtualization made easy
>  Issue Type: Bug
>Reporter: eyal edri [Administrator]
>Assignee: infra
>
> We need to try moving the Lago slaves in CI to work with memory instead of 
> local disks,
> It might speed up the tests significantly.
> We have 2 choices:
> 1. add code to the job and mount the dir on /dev/shm/
> 2. create zram drive and mount the following dirs on it:
> # /var/lib/mock
> # /var/lib/lago
> # /home/jenkins
> Each host should have enough memory to run current tests on them,
> however we'll need to make sure to clean that drive after each run.
> Lets try it on one of the slaves to see what is the best solution before 
> implementing for all slaves.





[VDSM] vdsm_master_verify-error-codes_created running on master - why?

2016-07-19 Thread Nir Soffer
Hi all,

Since yesterday, the vdsm_master_verify-error-codes_created job is running
on master.

I guess that this was an unintended change in jenkins - please revert this
change.

If someone wants to add a job for vdsm master, it must be approved by the
vdsm maintainers first.

The best would be to run everything from the automation scripts, so the
vdsm maintainers have full control over the way patches are checked.

Thanks,
Nir


[oVirt Jenkins] ovirt-engine_master_upgrade-from-master_el7_created - Build # 10 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_created/
 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_created/10/
Build Number: 10
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/61042

-
Changes Since Last Success:
-


-
Failed Tests:
-
No tests ran. 



[JIRA] (OVIRT-643) add jira.ovirt.org redirect

2016-07-19 Thread Evgheni Dereveanchin (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-643?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Evgheni Dereveanchin reassigned OVIRT-643:
--

Assignee: Evgheni Dereveanchin  (was: infra)

> add jira.ovirt.org redirect
> ---
>
> Key: OVIRT-643
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-643
> Project: oVirt - virtualization made easy
>  Issue Type: Task
>Reporter: Evgheni Dereveanchin
>Assignee: Evgheni Dereveanchin
>Priority: Low
>
> Simplify the JIRA url. As it's not possible to change URL completely [1] 
> we'll just add an HTTP redirect like jira.ovirt.org -> 
> ovirt-jira.atlassian.net
> [1] https://confluence.atlassian.com/cloudkb/zzz-691011835.html





[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_created - Build # 10 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_created/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_created/10/
Build Number: 10
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/61042

-
Changes Since Last Success:
-


-
Failed Tests:
-
No tests ran. 



[JIRA] (OVIRT-643) add jira.ovirt.org redirect

2016-07-19 Thread Evgheni Dereveanchin (oVirt JIRA)
Evgheni Dereveanchin created OVIRT-643:
--

 Summary: add jira.ovirt.org redirect
 Key: OVIRT-643
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-643
 Project: oVirt - virtualization made easy
  Issue Type: Task
Reporter: Evgheni Dereveanchin
Assignee: infra
Priority: Low


Simplify the JIRA URL. As it's not possible to change the URL completely [1],
we'll just add an HTTP redirect like jira.ovirt.org -> ovirt-jira.atlassian.net

[1] https://confluence.atlassian.com/cloudkb/zzz-691011835.html





Re: Automation moving bug to modified when it is not

2016-07-19 Thread Michal Skrivanek

> On 19 Jul 2016, at 14:25, Eyal Edri  wrote:
> 
> 
> 
> On Tue, Jul 19, 2016 at 1:37 PM, Michal Skrivanek  wrote:
> 
> >> On 19 Jul 2016, at 12:15, Eyal Edri  wrote:
>> 
>> This happened because you used the same bug for master and 4.0.
> 
> right, but that’s the regular state of things because upstream there is no 
> separate bug for master vs current version
> 
>> The Gerrit hook doesn't verify status between major versions, only inside a 
>> single version (e.g., it would not move the bug to MODIFIED if you needed to 
>> backport to 4.0.1 and the target milestone was 4.0.1).
>> I'm not sure how we can tackle this, because master has no meaning in 
>> bugzilla; it doesn't correlate to a version.
>> 
>> One thing I can think of is NOT to move bugs to MODIFIED if a patch was 
>> merged on the master branch... will that help?
> 
> I’m not sure if it’s better because before 4.1 is branched the master 
> development is for 4.1 bugs.
> It would make sense to differentiate based on whether a branch for that TM 
> version exists or not, so in your above example since the bug has TM 4.0.x 
> and there is a 4.0 branch it would wait for a backport
> 
> I can't compare it to 4.0 because master is a moving target, so this hook 
> will misbehave once master changes versions; I need solid logic that will 
> work all the time for bugs on master:
> either not move them to MODIFIED if the bug's target milestone != master 
> (which is probably 100% of the time), or some regex we can use... I don't 
> have any other creative ideas.

I guess if we have TM as x.y.z and the project has an x.y branch, we can check 
for that, right? If the branch is not there then master is the final branch; if 
TM x.y.z matches some ovirt-x.y branch, then a backport is needed.

> You can look at the code if you want at [1] and see if you have an idea.
> 
> [1] 
> https://gerrit.ovirt.org/gitweb?p=gerrit-admin.git;a=blob;f=hooks/custom_hooks/change-merged.set_MODIFIED;h=678806dc35a372dadab5a5a392d25409db5c8275;hb=refs/heads/master
>  
> 
>  
> 
> Thanks,
> michal
> 
>> 
>> 
>> On Tue, Jul 19, 2016 at 8:07 AM, Michal Skrivanek wrote:
>> Example in bug https://bugzilla.redhat.com/show_bug.cgi?id=1357440 
>> 
>> It doesn't take into account branches
>> 
>> Thanks,
>> michal
>> 
>> 
>> 
>> -- 
>> Eyal Edri
>> Associate Manager
>> RHEV DevOps
>> EMEA ENG Virtualization R&D
>> Red Hat Israel
>> 
>> phone: +972-9-7692018 
>> irc: eedri (on #tlv #rhev-dev #rhev-integ)
> 
> 
> 
> 
> -- 
> Eyal Edri
> Associate Manager
> RHEV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
> 
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)



[JIRA] (OVIRT-642) cleanup VM templates in PHX ovirt instance

2016-07-19 Thread Evgheni Dereveanchin (oVirt JIRA)
Evgheni Dereveanchin created OVIRT-642:
--

 Summary: cleanup VM templates in PHX ovirt instance
 Key: OVIRT-642
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-642
 Project: oVirt - virtualization made easy
  Issue Type: Task
Reporter: Evgheni Dereveanchin
Assignee: infra
Priority: Low


There are a lot of unused templates in the oVirt instance. Outdated ones should 
be removed.





[JIRA] (OVIRT-642) cleanup VM templates in PHX ovirt instance

2016-07-19 Thread Evgheni Dereveanchin (oVirt JIRA)

 [ 
https://ovirt-jira.atlassian.net/browse/OVIRT-642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Evgheni Dereveanchin reassigned OVIRT-642:
--

Assignee: Evgheni Dereveanchin  (was: infra)

> cleanup VM templates in PHX ovirt instance
> --
>
> Key: OVIRT-642
> URL: https://ovirt-jira.atlassian.net/browse/OVIRT-642
> Project: oVirt - virtualization made easy
>  Issue Type: Task
>Reporter: Evgheni Dereveanchin
>Assignee: Evgheni Dereveanchin
>Priority: Low
>
> There's a lot of unused templates in the oVirt instance. Outdated ones should 
> be removed.





Re: Automation moving bug to modified when it is not

2016-07-19 Thread Eyal Edri
On Tue, Jul 19, 2016 at 1:37 PM, Michal Skrivanek 
wrote:

>
> On 19 Jul 2016, at 12:15, Eyal Edri  wrote:
>
> This happened because you used the same bug for master and 4.0.
>
>
> right, but that’s the regular state of things because upstream there is no
> separate bug for master vs current version
>
> The Gerrit hook doesn't verify status between major versions, only inside
> a single version (e.g., it would not move the bug to MODIFIED if you needed to
> backport to 4.0.1 and the target milestone was 4.0.1).
> I'm not sure how we can tackle this, because master has no meaning in
> bugzilla; it doesn't correlate to a version.
>
> One thing I can think of is NOT to move bugs to MODIFIED if a patch was
> merged on the master branch... will that help?
>
>
> I’m not sure if it’s better because before 4.1 is branched the master
> development is for 4.1 bugs.
> It would make sense to differentiate based on whether a branch for that TM
> version exists or not, so in your above example since the bug has TM 4.0.x
> and there is a 4.0 branch it would wait for a backport
>

I can't compare it to 4.0 because master is a moving target, so this hook
will misbehave once master changes versions; I need solid logic that will
work all the time for bugs on master:
either not move them to MODIFIED if the bug's target milestone != master
(which is probably 100% of the time), or some regex we can use... I don't
have any other creative ideas...
You can look at the code if you want at [1] and see if you have an idea.

[1]
https://gerrit.ovirt.org/gitweb?p=gerrit-admin.git;a=blob;f=hooks/custom_hooks/change-merged.set_MODIFIED;h=678806dc35a372dadab5a5a392d25409db5c8275;hb=refs/heads/master


>
> Thanks,
> michal
>
>
>
> On Tue, Jul 19, 2016 at 8:07 AM, Michal Skrivanek 
> wrote:
>
>> Example in bug https://bugzilla.redhat.com/show_bug.cgi?id=1357440
>> It doesn't take into account branches
>>
>> Thanks,
>> michal
>>
>
>
>
> --
> Eyal Edri
> Associate Manager
> RHEV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
>
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>
>
>


-- 
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)


[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_merged - Build # 611 - Still Failing!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/611/
Build Number: 611
Build Status:  Still Failing
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60968

-
Changes Since Last Success:
-
Changes for Build #607
[Yevgeny Zaspitsky] engine: remove unused code from VmNicValidator


Changes for Build #608
[Martin Mucha] core: after HostNetworkQos is removed, sync influenced networks.


Changes for Build #609
[Jakub Niedermertl] webadmin: Guest Info subtab name token fixed


Changes for Build #610
[Tal Nisan] webadmin: Fix UI exception after creating new profile


Changes for Build #611
[Sharon Gratch] core:  fix a typo in log message for export/import




-
Failed Tests:
-
No tests ran. 



[oVirt Jenkins] ovirt-engine_master_upgrade-from-master_el7_merged - Build # 1069 - Still Failing!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/
 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/1069/
Build Number: 1069
Build Status:  Still Failing
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60968

-
Changes Since Last Success:
-
Changes for Build #1066
[Martin Mucha] core: after HostNetworkQos is removed, sync influenced networks.


Changes for Build #1067
[Jakub Niedermertl] webadmin: Guest Info subtab name token fixed


Changes for Build #1068
[Tal Nisan] webadmin: Fix UI exception after creating new profile


Changes for Build #1069
[Sharon Gratch] core:  fix a typo in log message for export/import




-
Failed Tests:
-
No tests ran. 



[oVirt Jenkins] ovirt-engine_master_upgrade-from-master_el7_merged - Build # 1068 - Still Failing!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/
 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/1068/
Build Number: 1068
Build Status:  Still Failing
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60975

-
Changes Since Last Success:
-
Changes for Build #1066
[Martin Mucha] core: after HostNetworkQos is removed, sync influenced networks.


Changes for Build #1067
[Jakub Niedermertl] webadmin: Guest Info subtab name token fixed


Changes for Build #1068
[Tal Nisan] webadmin: Fix UI exception after creating new profile




-
Failed Tests:
-
No tests ran. 



[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_merged - Build # 610 - Still Failing!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/610/
Build Number: 610
Build Status:  Still Failing
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60975

-
Changes Since Last Success:
-
Changes for Build #607
[Yevgeny Zaspitsky] engine: remove unused code from VmNicValidator


Changes for Build #608
[Martin Mucha] core: after HostNetworkQos is removed, sync influenced networks.


Changes for Build #609
[Jakub Niedermertl] webadmin: Guest Info subtab name token fixed


Changes for Build #610
[Tal Nisan] webadmin: Fix UI exception after creating new profile




-
Failed Tests:
-
No tests ran. 



Re: permissions for my jenkins user

2016-07-19 Thread Gil Shinar
Try now

On Tue, Jul 19, 2016 at 12:42 PM, Eyal Edri  wrote:

> Shlomi,
> Can you add ido to the dev role so he can trigger builds?
>
> On Jul 19, 2016 12:20 PM, "Ido Rosenzwig"  wrote:
>
>> Hi,
>>
>> I wish to have trigger (and re-trigger) permissions on my jenkins user.
>> My user is: irosenzw
>>
>> Best regards,
>> Ido Rosenzwig
>>
>
>


[oVirt Jenkins] ovirt-engine_master_upgrade-from-master_el7_merged - Build # 1067 - Still Failing!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/
 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/1067/
Build Number: 1067
Build Status:  Still Failing
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60976

-
Changes Since Last Success:
-
Changes for Build #1066
[Martin Mucha] core: after HostNetworkQos is removed, sync influenced networks.


Changes for Build #1067
[Jakub Niedermertl] webadmin: Guest Info subtab name token fixed




-
Failed Tests:
-
No tests ran. 



[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_merged - Build # 609 - Still Failing!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/609/
Build Number: 609
Build Status:  Still Failing
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60976

-
Changes Since Last Success:
-
Changes for Build #607
[Yevgeny Zaspitsky] engine: remove unused code from VmNicValidator


Changes for Build #608
[Martin Mucha] core: after HostNetworkQos is removed, sync influenced networks.


Changes for Build #609
[Jakub Niedermertl] webadmin: Guest Info subtab name token fixed




-
Failed Tests:
-
No tests ran. 



Re: Automation moving bug to modified when it is not

2016-07-19 Thread Michal Skrivanek

> On 19 Jul 2016, at 12:15, Eyal Edri  wrote:
> 
> This happened because you used the same bug for master and 4.0.

right, but that’s the regular state of things because upstream there is no 
separate bug for master vs current version

> The Gerrit hook doesn't verify status between major versions, only inside a 
> single version (e.g., it would not move the bug to MODIFIED if you needed to 
> backport to 4.0.1 and the target milestone was 4.0.1).
> I'm not sure how we can tackle this, because master has no meaning in 
> bugzilla; it doesn't correlate to a version.
> 
> One thing I can think of is NOT to move bugs to MODIFIED if a patch was 
> merged on the master branch... will that help?

I’m not sure if it’s better because before 4.1 is branched the master 
development is for 4.1 bugs.
It would make sense to differentiate based on whether a branch for that TM 
version exists or not, so in your above example since the bug has TM 4.0.x and 
there is a 4.0 branch it would wait for a backport

Thanks,
michal

> 
> 
> On Tue, Jul 19, 2016 at 8:07 AM, Michal Skrivanek wrote:
> Example in bug https://bugzilla.redhat.com/show_bug.cgi?id=1357440 
> 
> It doesn't take into account branches
> 
> Thanks,
> michal
> 
> 
> 
> -- 
> Eyal Edri
> Associate Manager
> RHEV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
> 
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)



[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_merged - Build # 607 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/607/
Build Number: 607
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60929

-
Changes Since Last Success:
-
Changes for Build #607
[Yevgeny Zaspitsky] engine: remove unused code from VmNicValidator




-
Failed Tests:
-
No tests ran. 



Re: Automation moving bug to modified when it is not

2016-07-19 Thread Eyal Edri
This happened because you used the same bug for master and 4.0.
The Gerrit hook doesn't verify status between major versions, only inside a
single version (e.g., it would not move the bug to MODIFIED if you needed to
backport to 4.0.1 and the target milestone was 4.0.1).
I'm not sure how we can tackle this, because master has no meaning in
bugzilla; it doesn't correlate to a version.

One thing I can think of is NOT to move bugs to MODIFIED if a patch was
merged on the master branch... will that help?


On Tue, Jul 19, 2016 at 8:07 AM, Michal Skrivanek 
wrote:

> Example in bug https://bugzilla.redhat.com/show_bug.cgi?id=1357440
> It doesn't take into account branches
>
> Thanks,
> michal
>



-- 
Eyal Edri
Associate Manager
RHEV DevOps
EMEA ENG Virtualization R&D
Red Hat Israel

phone: +972-9-7692018
irc: eedri (on #tlv #rhev-dev #rhev-integ)


[oVirt Jenkins] ovirt-engine_master_upgrade-from-master_el7_merged - Build # 1066 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/
 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-master_el7_merged/1066/
Build Number: 1066
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/59310

-
Changes Since Last Success:
-


-
Failed Tests:
-
No tests ran. 



[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_merged - Build # 608 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/608/
Build Number: 608
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/59310

-
Changes Since Last Success:
-


-
Failed Tests:
-
No tests ran. 



Re: permissions for my jenkins user

2016-07-19 Thread Eyal Edri
Shlomi,
Can you add ido to the dev role so he can trigger builds?

On Jul 19, 2016 12:20 PM, "Ido Rosenzwig"  wrote:

> Hi,
>
> I wish to have trigger (and re-trigger) permissions on my jenkins user.
> My user is: irosenzw
>
> Best regards,
> Ido Rosenzwig
>
>
>


[oVirt Jenkins] ovirt-engine_master_upgrade-from-4.0_el7_merged - Build # 604 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_master_upgrade-from-4.0_el7_merged/604/
Build Number: 604
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/60923

-
Changes Since Last Success:
-


-
Failed Tests:
-
No tests ran. 



permissions for my jenkins user

2016-07-19 Thread Ido Rosenzwig
Hi,

I wish to have trigger (and re-trigger) permissions on my jenkins user.
My user is: irosenzw

Best regards,
Ido Rosenzwig


[oVirt Jenkins] ovirt-engine_4.0_upgrade-from-3.6_el7_merged - Build # 315 - Failure!

2016-07-19 Thread jenkins
Project: 
http://jenkins.ovirt.org/job/ovirt-engine_4.0_upgrade-from-3.6_el7_merged/ 
Build: 
http://jenkins.ovirt.org/job/ovirt-engine_4.0_upgrade-from-3.6_el7_merged/315/
Build Number: 315
Build Status:  Failure
Triggered By: Triggered by Gerrit: https://gerrit.ovirt.org/58286

-
Changes Since Last Success:
-


-
Failed Tests:
-
No tests ran. 



Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32

2016-07-19 Thread Eyal Edri
we have it for all versions:
http://resources.ovirt.org/repos/ovirt/experimental/

On Tue, Jul 19, 2016 at 11:27 AM, Tolik Litovsky 
wrote:

> Is it only for master job ?
> Or we have such repos for all branches ?
>
> On Tue, Jul 19, 2016 at 10:13 AM, Eyal Edri  wrote:
>
>> Ryan/Tolik,
>> Can you build appliance only from tested engine repo [1] ? lets see how
>> it affects the stability, next step will be to publish tested appliance
>> after it runs Lago verification.
>>
>> [1]
>> http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested/
>> (published only after ovirt-system-tests basic suite finish successfully)
>>
>>
>> On Tue, Jul 19, 2016 at 10:10 AM, Lev Veyde  wrote:
>>
>>> Hi Eyal,
>>>
>>> The last failed run failed on:
>>> *15:50:02* [ INFO ] Extracting disk image from OVF archive (could take
>>> a few minutes depending on archive size)
>>> *21:35:04* Build timed out (after 360 minutes). Marking the build as
>>> failed.
>>>
>>> So it basically got stuck while extracting the OVF image.
>>>
>>> Some previous runs failed mostly on either:
>>> a) broken ovirt-engine-appliance build
>>> b) ovirt-engine-appliance missing from the yum repo
>>>
>>> We need to make sure that the process of building and publishing the
>>> ovirt-engine-appliance works flawlessly e.g. build ovirt-engine, publish it
>>> into the repo so that the build of the appliance can work, then publish it
>>> to the repo as well.
>>> This is extra important as the hosted-engine flow installation will
>>> probably become the default one, and without synced ovirt appliance we
>>> can't really test the changes in the engine.
>>>
>>> Thanks in advance,
>>> Lev Veyde.
>>>
>>> --
>>> *From: *"Eyal Edri" 
>>> *To: *jenk...@jenkins.phx.ovirt.org
>>> *Cc: *"infra" , "Lev Veyde" ,
>>> sbona...@redhat.com
>>> *Sent: *Tuesday, July 19, 2016 8:26:22 AM
>>> *Subject: *Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32
>>>
>>>
>>> Lev,  this test is a bit flaky going from stable to failure quite
>>> often,  can you check what is causing it?
>>> On Jul 19, 2016 12:35 AM,  wrote:
>>>
 See 

 Changes:

 [Lev Veyde] ovirt-system-tests: Add automation for
 he_iscsi_basic_suite_4.0

 [Sandro Bonazzola] vdsm: avoid fc24 out of master

 [Sandro Bonazzola] ovirt-engine: add 3.6.8 branch testing

 --
 [...truncated 620 lines...]

 WORKSPACE="$PWD"
 OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
 TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"

 rm -rf "$WORKSPACE/exported-artifacts"
 mkdir -p "$WORKSPACE/exported-artifacts"

 if [[ -d "$TESTS_LOGS" ]]; then
 mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
 fi

 [ovirt_4.0_he-system-tests] $ /bin/bash -xe
 /tmp/hudson1764906258788527221.sh
 + echo shell_scripts/system_tests.collect_logs.sh
 shell_scripts/system_tests.collect_logs.sh
 + VERSION=4.0
 + SUITE_TYPE=
 + WORKSPACE=
 + OVIRT_SUITE=4.0
 + TESTS_LOGS=<
 http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts
 >
 + rm -rf <
 http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts
 >
 + mkdir -p <
 http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts
 >
 + [[ -d <
 http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
 ]]
 POST BUILD TASK : SUCCESS
 END OF POST BUILD TASK : 0
 Match found for :.* : True
 Logical operation result is TRUE
 Running script  : #!/bin/bash -xe
 echo "shell-scripts/mock_cleanup.sh"

 shopt -s nullglob


 WORKSPACE="$PWD"

 # Make clear this is the cleanup, helps reading the jenkins logs
 cat << EOC
 ___
 ###
 # #
 #   CLEANUP   #
 # #
 ###
 EOC


 # Archive the logs, we want them anyway
 logs=(
 ./*log
 ./*/logs
 )
 if [[ "$logs" ]]; then
 tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
 rm -rf "${logs[@]}"
 fi

 # stop any processes running inside the chroot
 failed=false
 mock_confs=("$WORKSPACE"/*/mocker*)
 # Clean current jobs mockroot if any
 for mock_conf_file in "${mock_confs[@]}"; do
 [[ "$mock_conf_file" ]] || continue

Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32

2016-07-19 Thread Tolik Litovsky
Is it only for the master job?
Or do we have such repos for all branches?

On Tue, Jul 19, 2016 at 10:13 AM, Eyal Edri  wrote:

> Ryan/Tolik,
> Can you build appliance only from tested engine repo [1] ? lets see how it
> affects the stability, next step will be to publish tested appliance after
> it runs Lago verification.
>
> [1]
> http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested/
> (published only after ovirt-system-tests basic suite finish successfully)
>
>
> On Tue, Jul 19, 2016 at 10:10 AM, Lev Veyde  wrote:
>
>> Hi Eyal,
>>
>> The last failed run failed on:
>> *15:50:02* [ INFO ] Extracting disk image from OVF archive (could take a
>> few minutes depending on archive size)
>> *21:35:04* Build timed out (after 360 minutes). Marking the build as
>> failed.
>>
>> So it basically got stuck while extracting the OVF image.
>>
>> Some previous runs failed mostly on either:
>> a) broken ovirt-engine-appliance build
>> b) ovirt-engine-appliance missing from the yum repo
>>
>> We need to make sure that the process of building and publishing the
>> ovirt-engine-appliance works flawlessly e.g. build ovirt-engine, publish it
>> into the repo so that the build of the appliance can work, then publish it
>> to the repo as well.
>> This is extra important as the hosted-engine flow installation will
>> probably become the default one, and without synced ovirt appliance we
>> can't really test the changes in the engine.
>>
>> Thanks in advance,
>> Lev Veyde.
>>
>> --
>> *From: *"Eyal Edri" 
>> *To: *jenk...@jenkins.phx.ovirt.org
>> *Cc: *"infra" , "Lev Veyde" ,
>> sbona...@redhat.com
>> *Sent: *Tuesday, July 19, 2016 8:26:22 AM
>> *Subject: *Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32
>>
>>
>> Lev,  this test is a bit flaky going from stable to failure quite often,
>> can you check what is causing it?
>> On Jul 19, 2016 12:35 AM,  wrote:
>>
>>> See 
>>>
>>> Changes:
>>>
>>> [Lev Veyde] ovirt-system-tests: Add automation for
>>> he_iscsi_basic_suite_4.0
>>>
>>> [Sandro Bonazzola] vdsm: avoid fc24 out of master
>>>
>>> [Sandro Bonazzola] ovirt-engine: add 3.6.8 branch testing
>>>
>>> --
>>> [...truncated 620 lines...]
>>>
>>> WORKSPACE="$PWD"
>>> OVIRT_SUITE="$SUITE_TYPE_suite_$VERSION"
>>> TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
>>>
>>> rm -rf "$WORKSPACE/exported-artifacts"
>>> mkdir -p "$WORKSPACE/exported-artifacts"
>>>
>>> if [[ -d "$TESTS_LOGS" ]]; then
>>> mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
>>> fi
>>>
>>> [ovirt_4.0_he-system-tests] $ /bin/bash -xe
>>> /tmp/hudson1764906258788527221.sh
>>> + echo shell_scripts/system_tests.collect_logs.sh
>>> shell_scripts/system_tests.collect_logs.sh
>>> + VERSION=4.0
>>> + SUITE_TYPE=
>>> + WORKSPACE=
>>> + OVIRT_SUITE=4.0
>>> + TESTS_LOGS=<
>>> http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts
>>> >
>>> + rm -rf <
>>> http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts
>>> >
>>> + mkdir -p <
>>> http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts
>>> >
>>> + [[ -d <
>>> http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
>>> ]]
>>> POST BUILD TASK : SUCCESS
>>> END OF POST BUILD TASK : 0
>>> Match found for :.* : True
>>> Logical operation result is TRUE
>>> Running script  : #!/bin/bash -xe
>>> echo "shell-scripts/mock_cleanup.sh"
>>>
>>> shopt -s nullglob
>>>
>>>
>>> WORKSPACE="$PWD"
>>>
>>> # Make clear this is the cleanup, helps reading the jenkins logs
>>> cat << EOC
>>> ___
>>> ###
>>> # #
>>> #   CLEANUP   #
>>> # #
>>> ###
>>> EOC
>>>
>>>
>>> # Archive the logs, we want them anyway
>>> logs=(
>>> ./*log
>>> ./*/logs
>>> )
>>> if [[ "$logs" ]]; then
>>> tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
>>> rm -rf "${logs[@]}"
>>> fi
>>>
>>> # stop any processes running inside the chroot
>>> failed=false
>>> mock_confs=("$WORKSPACE"/*/mocker*)
>>> # Clean current jobs mockroot if any
>>> for mock_conf_file in "${mock_confs[@]}"; do
>>> [[ "$mock_conf_file" ]] || continue
>>> echo "Cleaning up mock $mock_conf"
>>> mock_root="${mock_conf_file##*/}"
>>> mock_root="${mock_root%.*}"
>>> my_mock="/usr/bin/mock"
>>> my_mock+=" --configdir=${mock_conf_file%/*}"
>>> my_mock+=" --root=${mock_root}"
>>> my_mock+=" --resultdir=$WORK

Re: ovirt-engine_master_coverity-analysis is broken

2016-07-19 Thread Sharon Naftaly
There was a problem on their side (a missing symlink).
They fixed it, but I still don't see the analysis results on the
scan.coverity.com website (though the job finished successfully), so I still
have the ticket open.

On Mon, Jul 18, 2016 at 11:42 AM, Sharon Naftaly 
wrote:

> Opened a ticket: Case# 00553847
> Will update here.
>
> On Mon, Jul 18, 2016 at 11:14 AM, Eyal Edri  wrote:
>
>> Please open a ticket to supp...@coverity.com  and ask what is the new
>> approach / url.
>> The coverity plugin isn't relevant for our usecase, its for enterprise
>> not community.
>>
>> On Mon, Jul 18, 2016 at 10:56 AM, Sharon Naftaly 
>> wrote:
>>
>>> Looks like Coverity changed their website recently. When trying to run
>>> the wget command it fails with a 404 Not Found page.
>>> Do you know who wrote the job's script? We may need to change it but in
>>> order to get to different parts of the website a user/password is needed.
>>> There is also a coverity plugin for Jenkins. Is there a reason we don't
>>> use it?
>>>
>>> On Mon, Jul 18, 2016 at 9:53 AM, Sharon Naftaly 
>>> wrote:
>>>
 I'll try to fix it asap.
 Sharon

 On Mon, Jul 18, 2016 at 8:59 AM, Sandro Bonazzola 
 wrote:

>
>
> On Sun, Jul 17, 2016 at 2:57 PM, Barak Korren 
> wrote:
>
>> Hi Sharon,
>>
>> The $subject job started failing and Eyal told me you worked on it
>> recently. I'm not sure your changed made it fail because I saw changes
>> by both you and Sandro.
>>
>>
> Looks like it started failing after:
> http://jenkins.ovirt.org/job/ovirt-engine_master_coverity-analysis/jobConfigHistory/showDiffFiles?timestamp1=2016-07-04_15-55-39&timestamp2=2016-07-12_12-17-29
>
>
>
>> Anyway the problem with the jobs seems to be that the jenkins repo URL
>> was set to:
>>
>> git:///jenkins.git
>>
>> Instead of:
>>
>> git://gerrit.ovirt.org/jenkins.git
>>
>> I fixed this manually but it is still failing. Now it seems that it is
>> not getting the right thing from:
>>
>> https://scan.coverity.com/download/linux-64
>>
>> I can't find any docs about this when looking quickly at coverity`s
>> website, so I'm hoping you know more about this.
>>
>>
>> --
>> Barak Korren
>> bkor...@redhat.com
>> RHEV-CI Team
>>
>
>
>
> --
> Sandro Bonazzola
> Better technology. Faster innovation. Powered by community
> collaboration.
> See how it works at redhat.com
>


>>>
>>
>>
>> --
>> Eyal Edri
>> Associate Manager
>> RHEV DevOps
>> EMEA ENG Virtualization R&D
>> Red Hat Israel
>>
>> phone: +972-9-7692018
>> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>>
>
>
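The failing download step in the thread above matches Coverity Scan's self-build flow, which fetches the analysis tool via a POST with a project token. A hedged sketch of the command (the token and project name are placeholders, and the `linux64` path — versus the 404ing `linux-64` — is an assumption about the rename that should be confirmed with Coverity support):

```shell
#!/bin/bash
# Sketch only: build the wget invocation Coverity Scan expects.
# TOKEN and the project name are placeholders, not real values.
build_coverity_fetch() {
    local token=$1 project=$2
    printf 'wget https://scan.coverity.com/download/linux64 --post-data "token=%s&project=%s" -O cov-analysis-linux64.tgz\n' \
        "$token" "$project"
}

# Print the command instead of running it, so nothing is downloaded here.
build_coverity_fetch "SECRET_TOKEN" "ovirt-engine"
```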


Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32

2016-07-19 Thread Sandro Bonazzola
On Tue, Jul 19, 2016 at 9:13 AM, Eyal Edri  wrote:

> Ryan/Tolik,
> Can you build appliance only from tested engine repo [1] ? lets see how it
> affects the stability, next step will be to publish tested appliance after
> it runs Lago verification.
>
> [1]
> http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested/
> (published only after ovirt-system-tests basic suite finish successfully)
>
>
Does the above repo take RPMs from the same jobs used by the nightly publisher?


Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32

2016-07-19 Thread Eyal Edri
Ryan/Tolik,
Can you build the appliance only from the tested engine repo [1]? Let's see how it
affects stability; the next step will be to publish a tested appliance after
it passes Lago verification.

[1]
http://resources.ovirt.org/repos/ovirt/experimental/master/latest.tested/
(published only after ovirt-system-tests basic suite finish successfully)


On Tue, Jul 19, 2016 at 10:10 AM, Lev Veyde  wrote:

> Hi Eyal,
>
> The last failed run failed on:
> *15:50:02* [ INFO ] Extracting disk image from OVF archive (could take a
> few minutes depending on archive size)
> *21:35:04* Build timed out (after 360 minutes). Marking the build as
> failed.
>
> So it basically got stuck while extracting the OVF image.
>
> Some previous runs failed mostly on either:
> a) broken ovirt-engine-appliance build
> b) ovirt-engine-appliance missing from the yum repo
>
> We need to make sure that the process of building and publishing the
> ovirt-engine-appliance works flawlessly e.g. build ovirt-engine, publish it
> into the repo so that the build of the appliance can work, then publish it
> to the repo as well.
> This is extra important as the hosted-engine flow installation will
> probably become the default one, and without synced ovirt appliance we
> can't really test the changes in the engine.
>
> Thanks in advance,
> Lev Veyde.
>
> --
> *From: *"Eyal Edri" 
> *To: *jenk...@jenkins.phx.ovirt.org
> *Cc: *"infra" , "Lev Veyde" ,
> sbona...@redhat.com
> *Sent: *Tuesday, July 19, 2016 8:26:22 AM
> *Subject: *Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32
>
>
> Lev, this test is a bit flaky, going from stable to failure quite often;
> can you check what is causing it?
> On Jul 19, 2016 12:35 AM,  wrote:
>
>> See 
>>
>> Changes:
>>
>> [Lev Veyde] ovirt-system-tests: Add automation for
>> he_iscsi_basic_suite_4.0
>>
>> [Sandro Bonazzola] vdsm: avoid fc24 out of master
>>
>> [Sandro Bonazzola] ovirt-engine: add 3.6.8 branch testing
>>
>> --
>> [...truncated 620 lines...]
>>
>> WORKSPACE="$PWD"
>> OVIRT_SUITE="${SUITE_TYPE}_suite_${VERSION}"
>> TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts"
>>
>> rm -rf "$WORKSPACE/exported-artifacts"
>> mkdir -p "$WORKSPACE/exported-artifacts"
>>
>> if [[ -d "$TESTS_LOGS" ]]; then
>> mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/"
>> fi
>>
>> [ovirt_4.0_he-system-tests] $ /bin/bash -xe
>> /tmp/hudson1764906258788527221.sh
>> + echo shell_scripts/system_tests.collect_logs.sh
>> shell_scripts/system_tests.collect_logs.sh
>> + VERSION=4.0
>> + SUITE_TYPE=
>> + WORKSPACE=
>> + OVIRT_SUITE=4.0
>> + TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
>> + rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts>
>> + mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts>
>> + [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
>> POST BUILD TASK : SUCCESS
>> END OF POST BUILD TASK : 0
>> Match found for :.* : True
>> Logical operation result is TRUE
>> Running script  : #!/bin/bash -xe
>> echo "shell-scripts/mock_cleanup.sh"
>>
>> shopt -s nullglob
>>
>>
>> WORKSPACE="$PWD"
>>
>> # Make clear this is the cleanup, helps reading the jenkins logs
>> cat << EOC
>> ___
>> ###
>> # #
>> #   CLEANUP   #
>> # #
>> ###
>> EOC
>>
>>
>> # Archive the logs, we want them anyway
>> logs=(
>> ./*log
>> ./*/logs
>> )
>> if [[ "$logs" ]]; then
>> tar cvzf exported-artifacts/logs.tgz "${logs[@]}"
>> rm -rf "${logs[@]}"
>> fi
>>
>> # stop any processes running inside the chroot
>> failed=false
>> mock_confs=("$WORKSPACE"/*/mocker*)
>> # Clean current jobs mockroot if any
>> for mock_conf_file in "${mock_confs[@]}"; do
>> [[ "$mock_conf_file" ]] || continue
>> echo "Cleaning up mock $mock_conf_file"
>> mock_root="${mock_conf_file##*/}"
>> mock_root="${mock_root%.*}"
>> my_mock="/usr/bin/mock"
>> my_mock+=" --configdir=${mock_conf_file%/*}"
>> my_mock+=" --root=${mock_root}"
>> my_mock+=" --resultdir=$WORKSPACE"
>>
>> # TODO: investigate why mock --clean fails to umount certain dirs sometimes,
>> # so we can use it instead of manually doing all this.
>> echo "Killing all mock orphan processes, if any."
>> $my_mock \
>> --orphanskill \
>> || {
>

Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32

2016-07-19 Thread Lev Veyde
Hi Eyal, 

The last failed run failed on: 
15:50:02 [ INFO ] Extracting disk image from OVF archive (could take a few 
minutes depending on archive size) 
21:35:04 Build timed out (after 360 minutes). Marking the build as failed. 

So it basically got stuck while extracting the OVF image. 
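One cheap guard against this failure mode (a sketch, not the current job code): bound the extraction step with coreutils `timeout`, so a hung extraction fails in minutes instead of consuming the whole 360-minute build budget. The 30m limit and the `tar` invocation in the comment are assumptions.

```shell
#!/bin/bash
# Sketch: run a potentially-hanging step under a hard time limit.
# coreutils `timeout` exits with status 124 when it kills the command.
extract_with_limit() {
    local limit="$1"; shift
    timeout "$limit" "$@"
    local status=$?
    if [ "$status" -eq 124 ]; then
        echo "step timed out after $limit: $*" >&2
    fi
    return "$status"
}

# Simulate a hung extraction with `sleep`; a real invocation might be:
#   extract_with_limit 30m tar -xvf ovirt-engine-appliance.ova
rc=0
extract_with_limit 1s sleep 10 || rc=$?
echo "exit code: $rc"   # prints "exit code: 124"
```

With such a wrapper, the job would fail fast with a clear "timed out" message in the log instead of being killed by the global Jenkins build timeout.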

Some previous runs failed mostly on either: 
a) broken ovirt-engine-appliance build 
b) ovirt-engine-appliance missing from the yum repo 

We need to make sure that the process of building and publishing the
ovirt-engine-appliance works flawlessly: e.g. build ovirt-engine, publish it
into the repo so that the appliance build can consume it, then publish the
appliance to the repo as well.
This is extra important as the hosted-engine installation flow will probably
become the default one, and without a synced oVirt appliance we can't really
test changes in the engine.

Thanks in advance, 
Lev Veyde. 
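The ordering Lev describes is the crux: each publish step must complete before the consumer that depends on it starts. A stubbed sketch of that pipeline (the function bodies are echo placeholders, not the real Jenkins jobs or repo tooling):

```shell
#!/bin/bash -e
# Stubbed sketch of the build/publish ordering described above; echo stands in
# for the real Jenkins jobs and repo tooling (e.g. createrepo + rsync).
build_engine()    { echo "build ovirt-engine RPMs"; }
publish()         { echo "publish $1 to the repo"; }
build_appliance() { echo "build ovirt-engine-appliance against the repo"; }

build_engine
publish ovirt-engine            # must land before the appliance build starts
build_appliance
publish ovirt-engine-appliance  # only now can hosted-engine flows consume it
```

Running the appliance build before the engine publish completes is exactly how failure mode (b) above, ovirt-engine-appliance missing from the yum repo, can arise.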

- Original Message -

From: "Eyal Edri"  
To: jenk...@jenkins.phx.ovirt.org 
Cc: "infra" , "Lev Veyde" , 
sbona...@redhat.com 
Sent: Tuesday, July 19, 2016 8:26:22 AM 
Subject: Re: Build failed in Jenkins: ovirt_4.0_he-system-tests #32 



Lev, this test is a bit flaky, going from stable to failure quite often;
can you check what is causing it?
On Jul 19, 2016 12:35 AM, < jenk...@jenkins.phx.ovirt.org > wrote: 


See < http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/changes > 

Changes: 

[Lev Veyde] ovirt-system-tests: Add automation for he_iscsi_basic_suite_4.0 

[Sandro Bonazzola] vdsm: avoid fc24 out of master 

[Sandro Bonazzola] ovirt-engine: add 3.6.8 branch testing 

-- 
[...truncated 620 lines...] 

WORKSPACE="$PWD" 
OVIRT_SUITE="${SUITE_TYPE}_suite_${VERSION}"
TESTS_LOGS="$WORKSPACE/ovirt-system-tests/exported-artifacts" 

rm -rf "$WORKSPACE/exported-artifacts" 
mkdir -p "$WORKSPACE/exported-artifacts" 

if [[ -d "$TESTS_LOGS" ]]; then 
mv "$TESTS_LOGS/"* "$WORKSPACE/exported-artifacts/" 
fi 

[ovirt_4.0_he-system-tests] $ /bin/bash -xe /tmp/hudson1764906258788527221.sh 
+ echo shell_scripts/system_tests.collect_logs.sh
shell_scripts/system_tests.collect_logs.sh
+ VERSION=4.0 
+ SUITE_TYPE= 
+ WORKSPACE=< http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ > 
+ OVIRT_SUITE=4.0 
+ TESTS_LOGS=<http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts>
+ rm -rf <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts>
+ mkdir -p <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/32/artifact/exported-artifacts>
+ [[ -d <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/exported-artifacts> ]]
POST BUILD TASK : SUCCESS 
END OF POST BUILD TASK : 0 
Match found for :.* : True 
Logical operation result is TRUE 
Running script : #!/bin/bash -xe 
echo "shell-scripts/mock_cleanup.sh" 

shopt -s nullglob 


WORKSPACE="$PWD" 

# Make clear this is the cleanup, helps reading the jenkins logs 
cat << EOC
+ cat 
___ 
### 
# # 
# CLEANUP # 
# # 
### 
+ logs=(./*log ./*/logs) 
+ [[ -n ./ovirt-system-tests/logs ]] 
+ tar cvzf exported-artifacts/logs.tgz ./ovirt-system-tests/logs 
./ovirt-system-tests/logs/ 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.he_basic_suite_4.0.sh/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.he_basic_suite_4.0.sh/he_basic_suite_4.0.sh.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.clean_rpmdb/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/root.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/build.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/state.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.install_packages/stdout_stderr.log
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/ 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/root.log 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/build.log 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/state.log 
./ovirt-system-tests/logs/mocker-fedora-23-x86_64.fc23.init/stdout_stderr.log 
+ rm -rf ./ovirt-system-tests/logs 
+ failed=false 
+ mock_confs=("$WORKSPACE"/*/mocker*) 
+ for mock_conf_file in '"${mock_confs[@]}"' 
+ [[ -n <http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/ws/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg> ]]
+ echo 'Cleaning up mock ' 
Cleaning up mock 
+ mock_root=mocker-fedora-23-x86_64.fc23.cfg 
+ mock_root=mocker-fedora-23-x86_64.fc23 
+ my_mock=
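The two `mock_root=` trace lines above come from a pair of bash parameter expansions in the cleanup script; a standalone sketch of that derivation (the sample path is illustrative):

```shell
#!/bin/bash
# Derive the mock root name from a mock config file path, as the cleanup
# script does: strip the leading directories, then the trailing extension.
mock_conf_file="/home/jenkins/workspace/ovirt-system-tests/mocker-fedora-23-x86_64.fc23.cfg"
mock_root="${mock_conf_file##*/}"  # drop everything up to the last '/'
mock_root="${mock_root%.*}"        # drop the trailing '.cfg'
echo "$mock_root"                  # prints "mocker-fedora-23-x86_64.fc23"
```

`##*/` removes the longest prefix matching `*/` (the directory part) and `%.*` removes the shortest suffix matching `.*` (the extension), so only the bare mock root name remains.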

[JIRA] (OVIRT-641) Build oVirt 4.0.2 RC1

2016-07-19 Thread sbonazzo (oVirt JIRA)
sbonazzo created OVIRT-641:
--

 Summary: Build oVirt 4.0.2 RC1
 Key: OVIRT-641
 URL: https://ovirt-jira.atlassian.net/browse/OVIRT-641
 Project: oVirt - virtualization made easy
  Issue Type: Task
  Components: Repositories Mgmt
Reporter: sbonazzo
Assignee: infra
Priority: Highest






--
This message was sent by Atlassian JIRA
(v1000.148.3#15)