[ovirt-devel] job ovirt-system-tests_manual

2017-10-30 Thread Shlomo Ben David
Hi All,

The 'ovirt-system-tests_manual' job currently runs with no limit on how
many builds are kept.
We want to limit build retention to 14 days.

If you are using this job and need to keep your build(s) for more than 14
days, please let us know.

Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA  | RHCE | RHCVA
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 29-10-2017 ] [ 098_ovirt_provider_ovn.test_ovn_provider_rest ]

2017-10-29 Thread Shlomo Ben David
Hi,

Link to suspected patches: https://gerrit.ovirt.org/#/c/83158/

Link to Job:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3525/console

Link to all logs:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3525/artifact/exported-artifacts/basic-suit-master-el7/

(Relevant) error snippet from the log:


  File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
  File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129,
in wrapped_test
test()
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in
wrapper
return func(get_test_prefix(), *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 78, in
wrapper
prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in
wrapper
return func(get_test_prefix(), *args, **kwargs)
  File
"/home/jenkins/workspace/ovirt-master_change-queue-tester/ovirt-system-tests/basic-suite-master/test-scenarios/098_ovirt_provider_ovn.py",
line 441, in test_ovn_provider_rest
_remove_iface_from_vm(api, VM0_NAME, IFACE_NAME)
  File
"/home/jenkins/workspace/ovirt-master_change-queue-tester/ovirt-system-tests/basic-suite-master/test-scenarios/098_ovirt_provider_ovn.py",
line 400, in _remove_iface_from_vm
nic_service.remove()
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line
27808, in remove
self._internal_remove(headers, query, wait)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 262,
in _internal_remove
return future.wait() if wait else future
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 53,
in wait
return self._code(response)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 259,
in callback
self._check_fault(response)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 123,
in _check_fault
self._raise_error(response, body)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 109,
in _raise_error
raise error
'Fault reason is "Operation Failed". Fault detail is "[Cannot remove
Interface. The VM Network Interface is plugged to a running VM.]".

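One way a test can avoid this fault is to unplug (deactivate) the NIC before
removing it. A minimal sketch with ovirtsdk4 (illustrative only; the helper
name mirrors the test above, `connection` is the v4 API connection the test
already has, and this is not necessarily how the suite was actually fixed):

def remove_iface_from_vm(connection, vm_name, iface_name):
    # Look up the VM and its NIC by name, unplug the NIC, then remove it.
    vms_service = connection.system_service().vms_service()
    vm = vms_service.list(search='name=%s' % vm_name)[0]
    nics_service = vms_service.vm_service(vm.id).nics_service()
    nic = next(n for n in nics_service.list() if n.name == iface_name)
    nic_service = nics_service.nic_service(nic.id)
    nic_service.deactivate()  # unplug from the running VM first
    nic_service.remove()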


Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA  | RHCE | RHCVA
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 26-10-2017 ] [ 004_basic_sanity.add_nic ]

2017-10-26 Thread Shlomo Ben David
Hi,

Link to suspected patches: https://gerrit.ovirt.org/#/c/71622/6

Link to Job:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3476/console

Link to all logs:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/3476/artifact/exported-artifacts/basic-suit-master-el7/

(Relevant) error snippet from the log:


Traceback (most recent call last):
  File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
    testMethod()
  File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 129, in wrapped_test
    test()
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
    return func(get_test_prefix(), *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 68, in wrapper
    return func(prefix.virt_env.engine_vm().get_api(), *args, **kwargs)
  File "/home/jenkins/workspace/ovirt-master_change-queue-tester/ovirt-system-tests/basic-suite-master/test-scenarios/004_basic_sanity.py", line 147, in add_nic
    api.vms.get(VM2_NAME).nics.add(nic_params)
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/brokers.py", line 33398, in add
    headers={"Correlation-Id":correlation_id, "Expect":expect}
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/proxy.py", line 79, in add
    return self.request('POST', url, body, headers, cls=cls)
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/proxy.py", line 122, in request
    persistent_auth=self.__persistent_auth
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/connectionspool.py", line 79, in do_request
    persistent_auth)
  File "/usr/lib/python2.7/site-packages/ovirtsdk/infrastructure/connectionspool.py", line 162, in __do_request
    raise errors.RequestError(response_code, response_reason, response_body)
RequestError: status: 400 reason: Bad Request detail:

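The request is rejected with an empty detail, so the request payload itself is
the usual suspect. For reference, a minimal sketch of what such an add_nic
call typically looks like with the old SDK v3 (illustrative only; the NIC and
network names are assumptions, not taken from the test):

from ovirtsdk.xml import params

def add_nic(api, vm_name, network_name='ovirtmgmt'):
    # Build the NIC parameters the SDK will POST; a malformed or missing
    # field here is a common cause of a bare "400 Bad Request".
    nic_params = params.NIC(
        name='eth0',
        interface='virtio',
        network=params.Network(name=network_name),
    )
    api.vms.get(vm_name).nics.add(nic_params)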



Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA  | RHCE | RHCVA
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 01-10-2017 ] [ change-queue-tester ]

2017-10-01 Thread Shlomo Ben David
Hi,

Link to suspected patches: https://gerrit.ovirt.org/#/c/82350/2
Link to Job: http://jenkins.ovirt.org/job/ovirt-master_change-queue-
tester/2949/

Link to all logs:
http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/2949/artifact/exported-artifacts/basic-suit-master-el7/

(Relevant) error snippet from the log:



lago.utils: ERROR: Error while running thread
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/lago/utils.py", line 58, in _ret_via_queue
    queue.put({'return': func()})
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 59, in wrapper
    return func(get_test_prefix(), *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 78, in wrapper
    prefix.virt_env.engine_vm().get_api(api_ver=4), *args, **kwargs
  File "/home/jenkins/workspace/ovirt-master_change-queue-tester@2/ovirt-system-tests/basic-suite-master/test-scenarios/004_basic_sanity.py", line 372, in live_storage_migration
    disk_service.get().status, types.DiskStatus.OK)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 258, in assert_equals_within_long
    func, value, LONG_TIMEOUT, allowed_exceptions=allowed_exceptions
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 237, in assert_equals_within
    '%s != %s after %s seconds' % (res, value, timeout)
AssertionError: False != ok after 600 seconds

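For readers unfamiliar with the helper: the 'False != ok after 600 seconds'
message comes from a polling assertion that kept re-reading the disk status
until LONG_TIMEOUT expired. A rough reconstruction of what such a helper does
(illustrative only, not the actual ovirtlago code):

import time

LONG_TIMEOUT = 600  # seconds, matching the failure above

def assert_equals_within(func, value, timeout, allowed_exceptions=()):
    # Re-evaluate func() until it equals `value` or the timeout expires.
    start = time.time()
    res = None
    while time.time() - start < timeout:
        try:
            res = func()
            if res == value:
                return
        except allowed_exceptions:
            pass
        time.sleep(3)
    raise AssertionError('%s != %s after %s seconds' % (res, value, timeout))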



Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA  | RHCE | RHCVA
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 03-07-2017 ] [ 006_migrations.migrate_vm ]

2017-07-03 Thread Shlomo Ben David
Hi,

Test failed: [ 006_migrations.migrate_vm ]
Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/7431/
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/7431/artifact/exported-artifacts/basic-suit-master-el7/test_logs/basic-suite-master/post-006_migrations.py/

Error snippet from the log:



 "Fault reason is "Operation Failed". Fault detail is "[Cannot migrate VM.
There is no host that satisfies current scheduling constraints. See below
for details:, The host lago-basic-suite-master-host0 did not satisfy
internal filter CPUOverloaded because its CPU is too loaded.]"





2017-07-02 16:43:22,829-04 INFO
 [org.ovirt.engine.core.bll.MigrateVmToServerCommand] (default task-27)
[87508047-fdc5-4a2f-9692-c83f7b55bbc2] Lock Acquired to object
'EngineLock:{exclusiveLocks='[2b34910d-cef2-44d6-a274-30e8473eb5d9=VM]',
sharedLocks=''}'
2017-07-02 16:43:22,833-04 DEBUG
[org.ovirt.engine.core.dal.dbbroker.PostgresDbEngineDialect$PostgresSimpleJdbcCall]
(default task-27) [87508047-fdc5-4a2f-9692-c83f7b55bbc2] Compiled stored
procedure. Call string is [{call getdiskvmelementspluggedtovm(?)}]
2017-07-02 16:43:22,833-04 DEBUG
[org.ovirt.engine.core.dal.dbbroker.PostgresDbEngineDialect$PostgresSimpleJdbcCall]
(default task-27) [87508047-fdc5-4a2f-9692-c83f7b55bbc2] SqlCall for
procedure [GetDiskVmElementsPluggedToVm] compiled
2017-07-02 16:43:22,843-04 DEBUG
[org.ovirt.engine.core.dal.dbbroker.PostgresDbEngineDialect$PostgresSimpleJdbcCall]
(default task-27) [87508047-fdc5-4a2f-9692-c83f7b55bbc2] Compiled stored
procedure. Call string is [{call getattacheddisksnapshotstovm(?, ?)}]
2017-07-02 16:43:22,843-04 DEBUG
[org.ovirt.engine.core.dal.dbbroker.PostgresDbEngineDialect$PostgresSimpleJdbcCall]
(default task-27) [87508047-fdc5-4a2f-9692-c83f7b55bbc2] SqlCall for
procedure [GetAttachedDiskSnapshotsToVm] compiled
2017-07-02 16:43:22,919-04 INFO
 [org.ovirt.engine.core.bll.scheduling.SchedulingManager] (default task-27)
[87508047-fdc5-4a2f-9692-c83f7b55bbc2] Candidate host
'lago-basic-suite-master-host0' ('46bdc63d-98f5-4eee-81aa-2fb88b8f7cbe')
was filtered out by 'VAR__FILTERTYPE__INTERNAL' filter 'CPUOverloaded'
(correlation id: null)
2017-07-02 16:43:22,920-04 WARN
 [org.ovirt.engine.core.bll.MigrateVmToServerCommand] (default task-27)
[87508047-fdc5-4a2f-9692-c83f7b55bbc2] Validation of action
'MigrateVmToServer' failed for user admin@internal-authz. Reasons:
VAR__ACTION__MIGRATE,VAR__TYPE__VM,SCHEDULING_ALL_HOSTS_FILTERED_OUT,VAR__FILTERTYPE__INTERNAL,$hostName
lago-basic-suite-master-host0,$filterName
CPUOverloaded,VAR__DETAIL__CPU_OVERLOADED,SCHEDULING_HOST_FILTERED_REASON_WITH_DETAIL
2017-07-02 16:43:22,920-04 INFO
 [org.ovirt.engine.core.bll.MigrateVmToServerCommand] (default task-27)
[87508047-fdc5-4a2f-9692-c83f7b55bbc2] Lock freed to object
'EngineLock:{exclusiveLocks='[2b34910d-cef2-44d6-a274-30e8473eb5d9=VM]',
sharedLocks=''}'
2017-07-02 16:43:22,929-04 DEBUG
[org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
(DefaultQuartzScheduler7) [] Rescheduling
DEFAULT.org.ovirt.engine.core.bll.ColdRebootAutoStartVmsRunner.startFailedAutoStartVms#-9223372036854775733
as there is no unfired trigger.
2017-07-02 16:43:22,932-04 ERROR
[org.ovirt.engine.api.restapi.resource.AbstractBackendResource] (default
task-27) [] Operation Failed: [Cannot migrate VM. There is no host that
satisfies current scheduling constraints. See below for details:, The host
lago-basic-suite-master-host0 did not satisfy internal filter CPUOverloaded
because its CPU is too loaded.]
2017-07-02 16:43:23,331-04 DEBUG
[org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
(DefaultQuartzScheduler2) [] Rescheduling
DEFAULT.org.ovirt.engine.core.bll.HaAutoStartVmsRunner.startFailedAutoStartVms#-9223372036854775793
as there is no unfired trigger.
2017-07-02 16:43:23,332-04 DEBUG
[org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
(DefaultQuartzScheduler2) [] Rescheduling
DEFAULT.org.ovirt.engine.core.bll.tasks.CommandCallbacksPoller.invokeCallbackMethods#-9223372036854775783
as there is no unfired trigger.

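The engine is behaving as designed here: the scheduler's CPUOverloaded filter
rejected the only candidate host. One test-side mitigation is to wait for the
host CPU to settle before requesting the migration; a minimal sketch using the
ovirtsdk4 statistics API (illustrative only; the statistic names, the
threshold and `connection` are assumptions, not taken from OST):

import time

def wait_for_host_cpu_idle(connection, host_name, threshold=10, timeout=300):
    # Poll the host's CPU statistics until user+system load drops below
    # the threshold, or give up after `timeout` seconds.
    hosts_service = connection.system_service().hosts_service()
    host = hosts_service.list(search='name=%s' % host_name)[0]
    stats_service = hosts_service.host_service(host.id).statistics_service()
    deadline = time.time() + timeout
    while time.time() < deadline:
        stats = {s.name: s.values[0].datum for s in stats_service.list()}
        load = stats.get('cpu.current.user', 0) + stats.get('cpu.current.system', 0)
        if load < threshold:
            return
        time.sleep(5)
    raise AssertionError(
        '%s CPU still overloaded after %s seconds' % (host_name, timeout))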




Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 14/05/2017 ] [ test-repo_ovirt_experimental_master ]

2017-05-14 Thread Shlomo Ben David
Hi,


Test failed: [ test-repo_ovirt_experimental_master ]

Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/6665/
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/6665/artifact/exported-artifacts/basic-suit-master-el7/test_logs/basic-suite-master/post-006_migrations.py/

Error snippet from the log:



2017-05-14 03:56:31,782-0400 ERROR (jsonrpc/3) [storage.TaskManager.Task]
(Task='661acb48-075b-494a-8fb4-64c04ed99bcb') Unexpected error (task:871)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 878,
in _run
return fn(*args, **kargs)
  File "/usr/lib/python2.7/site-packages/vdsm/logUtils.py", line 52, in
wrapper
res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 2172, in getAllTasksInfo
allTasksInfo = self._pool.getAllTasksInfo()
  File "/usr/lib/python2.7/site-packages/vdsm/storage/securable.py", line
77, in wrapper
raise SecureError("Secured object is not in safe state")
SecureError: Secured object is not in safe state
2017-05-14 03:56:31,785-0400 INFO  (jsonrpc/3) [storage.TaskManager.Task]
(Task='661acb48-075b-494a-8fb4-64c04ed99bcb') aborting: Task is aborted:
u'Secured object is not in safe state' - code 100 (task:1176)
2017-05-14 03:56:31,786-0400 ERROR (jsonrpc/3) [storage.Dispatcher] Secured
object is not in safe state (dispatcher:81)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/dispatcher.py", line
73, in wrapper
result = ctask.prepare(func, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 106,
in wrapper
return m(self, *a, **kw)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 1184,
in prepare
raise self.error
SecureError: Secured object is not in safe state
2017-05-14 03:56:31,786-0400 INFO  (jsonrpc/3) [jsonrpc.JsonRpcServer] RPC
call Host.getAllTasksInfo failed (error 654) in 0.00 seconds (__init__:577)
2017-05-14 03:56:31,796-0400 INFO  (jsonrpc/5) [dispatcher] Run and
protect: getAllTasksStatuses(spUUID=None, options=None) (logUtils:51)
2017-05-14 03:56:31,797-0400 ERROR (jsonrpc/5) [storage.TaskManager.Task]
(Task='568f515e-365b-47e4-8b2f-24e15deb6ae9') Unexpected error (task:871)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 878,
in _run
return fn(*args, **kargs)
  File "/usr/lib/python2.7/site-packages/vdsm/logUtils.py", line 52, in
wrapper
res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 2132, in getAllTasksStatuses
allTasksStatus = self._pool.getAllTasksStatuses()
  File "/usr/lib/python2.7/site-packages/vdsm/storage/securable.py", line
77, in wrapper
raise SecureError("Secured object is not in safe state")
SecureError: Secured object is not in safe state
2017-05-14 03:56:31,797-0400 INFO  (jsonrpc/5) [storage.TaskManager.Task]
(Task='568f515e-365b-47e4-8b2f-24e15deb6ae9') aborting: Task is aborted:
u'Secured object is not in safe state' - code 100 (task:1176)
2017-05-14 03:56:31,798-0400 ERROR (jsonrpc/5) [storage.Dispatcher] Secured
object is not in safe state (dispatcher:81)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/dispatcher.py", line
73, in wrapper
result = ctask.prepare(func, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 106,
in wrapper
return m(self, *a, **kw)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 1184,
in prepare
raise self.error
SecureError: Secured object is not in safe state

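For context, vdsm raises this SecureError when a storage-pool method that
requires the host to be in the secured (SPM) state is called while it is not.
A rough sketch of the wrapper pattern behind it (an illustrative
reconstruction based only on the traceback above, not vdsm's actual code):

class SecureError(Exception):
    pass

def secured_method(method):
    # Allow the wrapped pool method to run only while the object is "safe".
    def wrapper(self, *args, **kwargs):
        if not getattr(self, '_secured', False):
            raise SecureError("Secured object is not in safe state")
        return method(self, *args, **kwargs)
    return wrapper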


Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt 4.1 ] [ 24-03-2017 ] [ 002_bootstrap.list_glance_images ]

2017-03-26 Thread Shlomo Ben David
Hi,

Test failed: [ 002_bootstrap.list_glance_images ]
Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1063
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1063/artifact/exported-artifacts/basic-suit-4.1-el7/test_logs/basic-suite-4.1/post-002_bootstrap.py/

Error snippet from the log:


2017-03-24 08:04:46,983-04 ERROR
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand]
(org.ovirt.thread.pool-7-thread-1) [2b2b4693] Command
'PollVDSCommand(HostName = lago-basic-suite-4-1-host1,
VdsIdVDSCommandParametersBase:{runAsync='true',
hostId='746e816f-6e21-4185-9d50-3e90ebefb187'})' execution failed:
VDSGenericException: VDSNetworkException: Timeout during rpc call
2017-03-24 08:04:46,983-04 DEBUG
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand]
(org.ovirt.thread.pool-7-thread-1) [2b2b4693] Exception:
org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
VDSGenericException: VDSNetworkException: Timeout during rpc call
at
org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.get(FutureVDSCommand.java:73)
[vdsbroker.jar:]
at
org.ovirt.engine.core.bll.network.host.HostSetupNetworkPoller.getValue(HostSetupNetworkPoller.java:56)
[bll.jar:]
at
org.ovirt.engine.core.bll.network.host.HostSetupNetworkPoller.poll(HostSetupNetworkPoller.java:41)
[bll.jar:]
at
org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand.invokeSetupNetworksCommand(HostSetupNetworksCommand.java:426)
[bll.jar:]
at
org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand.executeCommand(HostSetupNetworksCommand.java:287)
[bll.jar:]
at
org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1251)
[bll.jar:]
at
org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1391)
[bll.jar:]
at
org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:2055)
[bll.jar:]
at
org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:164)
[utils.jar:]
at
org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:103)
[utils.jar:]
at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1451)
[bll.jar:]
at
org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:397)
[bll.jar:]
at
org.ovirt.engine.core.bll.executor.DefaultBackendActionExecutor.execute(DefaultBackendActionExecutor.java:13)
[bll.jar:]
at org.ovirt.engine.core.bll.Backend.runAction(Backend.java:511) [bll.jar:]
at org.ovirt.engine.core.bll.Backend.runActionImpl(Backend.java:493)
[bll.jar:]
at org.ovirt.engine.core.bll.Backend.runInternalAction(Backend.java:697)
[bll.jar:]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[rt.jar:1.8.0_121]
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[rt.jar:1.8.0_121]
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[rt.jar:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_121]
at
org.jboss.as.ee.component.ManagedReferenceMethodInterceptor.processInvocation(ManagedReferenceMethodInterceptor.java:52)
at
org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
at
org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:437)
at
org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.delegateInterception(Jsr299BindingsInterceptor.java:70)
[wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
at
org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.doMethodInterception(Jsr299BindingsInterceptor.java:80)
[wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
at
org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.processInvocation(Jsr299BindingsInterceptor.java:93)
[wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
at
org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.processInvocation(UserInterceptorFactory.java:63)
at
org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
at
org.jboss.as.ejb3.component.invocationmetrics.ExecutionTimeInterceptor.processInvocation(ExecutionTimeInterceptor.java:43)
[wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
at
org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
at
org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:437)
at
org.jboss.weld.ejb.AbstractEJBRequestScopeActivationInterceptor.aroundInvoke(AbstractEJBRequestScopeActivationInterceptor.java:64)
[weld-core-impl-2.3.5.Final.jar:2.3.5.Final]
at
org.jboss.as.weld.ejb.EjbRequestScopeActivationInterceptor.processInvocation(EjbRequestScopeActivationInterceptor.java:83)
[wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
at
org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:340)
at
org.jboss.as.ee.concurrent.ConcurrentContextInterceptor.processInvocation(ConcurrentContextInterceptor.java:45)
[wildfly-ee-10.1.0.Final.jar:10.1.0.Final]
at

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 24-03-2017 ] [ 002_bootstrap.list_glance_images ]

2017-03-26 Thread Shlomo Ben David
Hi,

Test failed: [ 002_bootstrap.list_glance_images ]
Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5997
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5997/artifact/exported-artifacts/basic-suit-master-el7/test_logs/basic-suite-master/post-002_bootstrap.py/

Error snippet from the log:


2017-03-23 20:19:39,699-04 ERROR
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand]
(org.ovirt.thread.pool-7-thread-2) [796c74] Command
'PollVDSCommand(HostName = lago-basic-suite-master-host1,
VdsIdVDSCommandParametersBase:{runAsync='true',
hostId='e8fa4803-7372-41d9-b5a5-848d08773479'})'
execution failed: VDSGenericException: VDSNetworkException: Timeout during
rpc call
2017-03-23 20:19:39,699-04 DEBUG
[org.ovirt.engine.core.vdsbroker.vdsbroker.PollVDSCommand]
(org.ovirt.thread.pool-7-thread-2) [796c74] Exception:
org.ovirt.engine.core.vdsbroker.vdsbroker.VDSNetworkException:
VDSGenericException: VDSNetworkException: Timeout during rpc call
at 
org.ovirt.engine.core.vdsbroker.vdsbroker.FutureVDSCommand.get(FutureVDSCommand.java:74)
[vdsbroker.jar:]
at org.ovirt.engine.core.bll.network.host.HostSetupNetworkPoller.getValue(
HostSetupNetworkPoller.java:56) [bll.jar:]
at org.ovirt.engine.core.bll.network.host.HostSetupNetworkPoller.poll(
HostSetupNetworkPoller.java:41) [bll.jar:]
at org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand.
invokeSetupNetworksCommand(HostSetupNetworksCommand.java:422) [bll.jar:]
at org.ovirt.engine.core.bll.network.host.HostSetupNetworksCommand.
executeCommand(HostSetupNetworksCommand.java:290) [bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1253)
[bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1393)
[bll.jar:]
at org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:2057)
[bll.jar:]
at org.ovirt.engine.core.utils.transaction.TransactionSupport.
executeInSuppressed(TransactionSupport.java:164) [utils.jar:]
at org.ovirt.engine.core.utils.transaction.TransactionSupport.
executeInScope(TransactionSupport.java:103) [utils.jar:]
at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1453)
[bll.jar:]
at org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:397)
[bll.jar:]
at org.ovirt.engine.core.bll.executor.DefaultBackendActionExecutor.execute(
DefaultBackendActionExecutor.java:13) [bll.jar:]
at org.ovirt.engine.core.bll.Backend.runAction(Backend.java:496) [bll.jar:]
at org.ovirt.engine.core.bll.Backend.runActionImpl(Backend.java:478)
[bll.jar:]
at org.ovirt.engine.core.bll.Backend.runInternalAction(Backend.java:684)
[bll.jar:]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[rt.jar:1.8.0_121]
at sun.reflect.NativeMethodAccessorImpl.invoke(
NativeMethodAccessorImpl.java:62) [rt.jar:1.8.0_121]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) [rt.jar:1.8.0_121]
at org.jboss.as.ee.component.ManagedReferenceMethodIntercep
tor.processInvocation(ManagedReferenceMethodInterceptor.java:52)
at org.jboss.invocation.InterceptorContext.proceed(
InterceptorContext.java:340)
at org.jboss.invocation.InterceptorContext$Invocation.
proceed(InterceptorContext.java:437)
at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.delegateInterception(
Jsr299BindingsInterceptor.java:70) [wildfly-weld-10.1.0.Final.
jar:10.1.0.Final]
at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.doMethodInterception(
Jsr299BindingsInterceptor.java:80) [wildfly-weld-10.1.0.Final.
jar:10.1.0.Final]
at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.processInvocation(
Jsr299BindingsInterceptor.java:93) [wildfly-weld-10.1.0.Final.
jar:10.1.0.Final]
at org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.
processInvocation(UserInterceptorFactory.java:63)
at org.jboss.invocation.InterceptorContext.proceed(
InterceptorContext.java:340)
at org.jboss.as.ejb3.component.invocationmetrics.ExecutionTimeInterceptor.
processInvocation(ExecutionTimeInterceptor.java:43)
[wildfly-ejb3-10.1.0.Final.jar:10.1.0.Final]
at org.jboss.invocation.InterceptorContext.proceed(
InterceptorContext.java:340)
at org.jboss.invocation.InterceptorContext$Invocation.
proceed(InterceptorContext.java:437)
at org.jboss.weld.ejb.AbstractEJBRequestScopeActivat
ionInterceptor.aroundInvoke(AbstractEJBRequestScopeActivationInterceptor.java:64)
[weld-core-impl-2.3.5.Final.jar:2.3.5.Final]
at org.jboss.as.weld.ejb.EjbRequestScopeActivationInter
ceptor.processInvocation(EjbRequestScopeActivationInterceptor.java:83)
[wildfly-weld-10.1.0.Final.jar:10.1.0.Final]
at org.jboss.invocation.InterceptorContext.proceed(
InterceptorContext.java:340)
at org.jboss.as.ee.concurrent.ConcurrentContextInterceptor.
processInvocation(ConcurrentContextInterceptor.java:45)

[ovirt-devel] [ OST Failure Report ] [ oVirt 4.1 ] [ 23-03-2017 ] [ 002_bootstrap.add_secondary_storage_domains ]

2017-03-23 Thread Shlomo Ben David
Hi,


Test failed: [ 002_bootstrap.add_secondary_storage_domains ]
Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1054
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1054/artifact/exported-artifacts/basic-suit-4.1-el7/test_logs/basic-suite-4.1/post-002_bootstrap.py/

Error snippet from the log:



2017-03-23 07:23:38,867-0400 ERROR (jsonrpc/2) [storage.TaskManager.Task]
(Task='8347d74c-92fe-4371-bc84-1314a43a2971') Unexpected error (task:870)
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/task.py", line 877, in _run
return fn(*args, **kargs)
  File "/usr/lib/python2.7/site-packages/vdsm/logUtils.py", line 52, in
wrapper
res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 1159, in attachStorageDomain
pool.attachSD(sdUUID)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/securable.py", line
79, in wrapper
return method(self, *args, **kwargs)
  File "/usr/share/vdsm/storage/sp.py", line 924, in attachSD
dom = sdCache.produce(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 112, in produce
domain.getRealDomain()
  File "/usr/share/vdsm/storage/sdc.py", line 53, in getRealDomain
return self._cache._realProduce(self._sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 136, in _realProduce
domain = self._findDomain(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 153, in _findDomain
return findMethod(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 178, in _findUnfetchedDomain
raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist:
(u'127fbe66-204c-4c6d-b521-d0f431af2b6c',)




Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 21-03-2017 ] [test-repo_ovirt_experimental_master]

2017-03-21 Thread Shlomo Ben David
Hi,


Test failed: [ test-repo_ovirt_experimental_master ]

Link to suspected patches: N/A

Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5945

Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5945/artifact/exported-artifacts/basic-suit-master-el7/test_logs/basic-suite-master/post-002_bootstrap.py/


Error snippet from the log:



2017-03-21 11:55:15,975-0400 ERROR (jsonrpc/7) [storage.TaskManager.Task]
(Task='02a8b5a3-ff82-4a07-bc8a-b3a756630e8c') Unexpected error (task:871)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 878,
in _run
return fn(*args, **kargs)
  File "/usr/lib/python2.7/site-packages/vdsm/logUtils.py", line 52, in
wrapper
res = f(*args, **kwargs)
  File "/usr/share/vdsm/storage/hsm.py", line 1158, in attachStorageDomain
pool.attachSD(sdUUID)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/securable.py", line
79, in wrapper
return method(self, *args, **kwargs)
  File "/usr/share/vdsm/storage/sp.py", line 930, in attachSD
dom = sdCache.produce(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 112, in produce
domain.getRealDomain()
  File "/usr/share/vdsm/storage/sdc.py", line 53, in getRealDomain
return self._cache._realProduce(self._sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 136, in _realProduce
domain = self._findDomain(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 153, in _findDomain
return findMethod(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 178, in _findUnfetchedDomain
raise se.StorageDomainDoesNotExist(sdUUID)
StorageDomainDoesNotExist: Storage domain does not exist:
(u'a8739fae-284f-4640-800d-016c084de7e6',)




Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt 4.1 ] [ 21-03-2017 ] [ test-repo_ovirt_experimental_4.1 ]

2017-03-21 Thread Shlomo Ben David
Hi,

Test failed: [ test-repo_ovirt_experimental_4.1 ]
Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1018
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1018/artifact/exported-artifacts/basic-suit-4.1-el7/test_logs/basic-suite-4.1/post-004_basic_sanity.py/

Error snippet from the log:


ifup/VLAN100_Network::DEBUG::2017-03-21
06:12:03,550::commands::93::root::(execCmd) FAILED:  = 'Running scope
as unit
979f3d61-c1f9-49c2-b168-799b882f64d5.scope.\n/etc/sysconfig/network-scripts/ifup-eth:
line 297: 30633 Terminated  /sbin/dhclient ${DHCLIENTARGS}
${DEVICE}\nCannot find device "VLAN100_Network"\nDevice "VLAN100_Network"
does not exist.\nDevice "VLAN100_Network" does not exist.\nDevice
...
...
...
"VLAN100_Network" does not exist.\n';  = 1
ifup/VLAN100_Network::ERROR::2017-03-21
06:12:03,551::utils::371::root::(wrapper) Unhandled exception
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/utils.py", line 368, in
wrapper
return f(*a, **kw)
  File "/usr/lib/python2.7/site-packages/vdsm/concurrent.py", line 180, in
run
return func(*args, **kwargs)
  File
"/usr/lib/python2.7/site-packages/vdsm/network/configurators/ifcfg.py",
line 924, in _exec_ifup
_exec_ifup_by_name(iface.name, cgroup)
  File
"/usr/lib/python2.7/site-packages/vdsm/network/configurators/ifcfg.py",
line 910, in _exec_ifup_by_name
raise ConfigNetworkError(ERR_FAILED_IFUP, out[-1] if out else '')
ConfigNetworkError: (29, 'Determining IPv6 information for
VLAN100_Network... failed.')




Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt 4.1 && oVirt master ] [ 20-03-2017 ] [ 004_basic_sanity.snapshots_merge ]

2017-03-20 Thread Shlomo Ben David
Hi,


Test failed: [ 004_basic_sanity.snapshots_merge ]

Link to suspected patches: N/A
Link to Job:

   1. http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1007
   2. http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5918


Link to all logs:

   1.
   
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_4.1/1007/artifact/exported-artifacts/basic-suit-4.1-el7/test_logs/basic-suite-4.1/post-004_basic_sanity.py/
   2.
   
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5918/artifact/exported-artifacts/basic-suit-master-el7/test_logs/basic-suite-master/post-004_basic_sanity.py/


Error snippet from the log:



ovirtlago.testlib: ERROR: Unhandled exception in  at
0x446a758>
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 217,
in assert_equals_within
res = func()
  File
"/home/jenkins/workspace/test-repo_ovirt_experimental_4.1/ovirt-system-tests/basic-suite-4.1/test-scenarios/004_basic_sanity.py",
line 449, in 
api.vms.get(VM0_NAME).disks.get(disk_name).status.state == 'ok'
AttributeError: 'NoneType' object has no attribute 'state'
lago.utils: ERROR: Error while running thread
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/lago/utils.py", line 57, in
_ret_via_queue
queue.put({'return': func()})
  File
"/home/jenkins/workspace/test-repo_ovirt_experimental_4.1/ovirt-system-tests/basic-suite-4.1/test-scenarios/004_basic_sanity.py",
line 448, in snapshot_live_merge
lambda:
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 264,
in assert_true_within_long
assert_equals_within_long(func, True, allowed_exceptions)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 251,
in assert_equals_within_long
func, value, LONG_TIMEOUT, allowed_exceptions=allowed_exceptions
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line 217,
in assert_equals_within
res = func()
  File
"/home/jenkins/workspace/test-repo_ovirt_experimental_4.1/ovirt-system-tests/basic-suite-4.1/test-scenarios/004_basic_sanity.py",
line 449, in 
api.vms.get(VM0_NAME).disks.get(disk_name).status.state == 'ok'
AttributeError: 'NoneType' object has no attribute 'state'

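The AttributeError means the polled expression dereferenced a disk status that
was momentarily missing during the merge, so the polling loop died instead of
retrying. A None-tolerant variant of the condition (illustrative sketch only;
api, VM0_NAME and disk_name are the names used in the test above):

def disk_is_ok(api, vm_name, disk_name):
    # Treat a transiently missing disk or status as "not ok yet" instead of
    # raising AttributeError inside the polling loop.
    disk = api.vms.get(vm_name).disks.get(disk_name)
    return (disk is not None
            and disk.status is not None
            and disk.status.state == 'ok')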



Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] [ OST Failure Report ] [ oVirt master ] [ 20-03-2017 ] [007_sd_reattach.reattach_storage_domain]

2017-03-20 Thread Shlomo Ben David
Hi,


Test failed: [ 007_sd_reattach.reattach_storage_domain ]

Link to suspected patches: N/A
Link to Job:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5915/consoleFull
Link to all logs:
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/5915/artifact/exported-artifacts/basic-suit-master-el7/test_logs/basic-suite-master/post-007_sd_reattach.py/

Error snippet from the log:



12:01:51 [basic_suit_el7] @ Run test: 007_sd_reattach.py: ERROR (in 0:00:44)
12:01:51 [basic_suit_el7] Error occured, aborting
12:01:51 [basic_suit_el7] Traceback (most recent call last):
12:01:51 [basic_suit_el7]   File
"/usr/lib/python2.7/site-packages/ovirtlago/cmd.py", line 267, in do_run
12:01:51 [basic_suit_el7] self.cli_plugins[args.ovirtverb].do_run(args)
12:01:51 [basic_suit_el7]   File
"/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 184, in do_run
12:01:51 [basic_suit_el7] self._do_run(**vars(args))
12:01:51 [basic_suit_el7]   File
"/usr/lib/python2.7/site-packages/lago/utils.py", line 495, in wrapper
12:01:51 [basic_suit_el7] return func(*args, **kwargs)
12:01:51 [basic_suit_el7]   File
"/usr/lib/python2.7/site-packages/lago/utils.py", line 506, in wrapper
12:01:51 [basic_suit_el7] return func(*args, prefix=prefix, **kwargs)
12:01:51 [basic_suit_el7]   File
"/usr/lib/python2.7/site-packages/ovirtlago/cmd.py", line 96, in
do_ovirt_runtest
12:01:51 [basic_suit_el7] raise RuntimeError('Some tests failed')
12:01:51 [basic_suit_el7] RuntimeError: Some tests failed



Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Fwd: Verify Grade (-1) for check_product and check_target_milestone hooks

2017-02-05 Thread Shlomo Ben David
Hi All,

Today I'm going to enable the Verify (-1) grade for the following hooks:


   1. *check_product* - returns Verify (-1) if the patch's project is not the
   same as the bug's product.
   2. *check_target_milestone* - returns Verify (-1) if the patch branch's
   major version is not the same as the bug's target milestone major version
   (see the sketch below).

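A rough sketch of the two checks (illustrative only; the real hooks read the
patch's project and branch from Gerrit and the bug's product and target
milestone from Bugzilla):

def check_product(patch_project, bug_product):
    # Verify -1 when the patch's project differs from the bug's product.
    if patch_project != bug_product:
        return -1, 'project %s does not match bug product %s' % (
            patch_project, bug_product)
    return 1, 'product matches'

def check_target_milestone(branch_major, milestone_major):
    # Verify -1 when the branch major version differs from the bug's
    # target milestone major version.
    if branch_major != milestone_major:
        return -1, 'branch %s does not match target milestone %s' % (
            branch_major, milestone_major)
    return 1, 'target milestone matches'
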
Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Verify Grade (-1) for check_product and check_target_milestone hooks

2017-02-05 Thread Shlomo Ben David
Hi Tal,

Today I'm going to enable the Verify (-1) grade for the following hooks:


   1. *check_product* - returns Verify (-1) if the patch's project is not the
   same as the bug's product.
   2. *check_target_milestone* - returns Verify (-1) if the patch branch's
   major version is not the same as the bug's target milestone major version.

Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
RHCSA | RHCVA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] job test-repo_ovirt_experimental_4.0 (3718) failed

2017-01-23 Thread Shlomo Ben David
Hi,

Job [1] failed with the following errors (logs [2]):

{"jsonrpc": "2.0", "id": "669e1306-3206-4a64-a33f-d18176531ff8",
"error": {"message": "Storage domain does not exist:
(u'ab6c9588-d957-4be3-9862-d2596db463d9',)", "code": 358}}
2017-01-19 12:13:51,619 DEBUG
[org.ovirt.vdsm.jsonrpc.client.internal.ResponseWorker]
(ResponseWorker) [3a116b94] Message received: {"jsonrpc": "2.0", "id":
"669e1306-3206-4a64-a33f-d18176531ff8", "error": {"message": "Storage
domain does not exist: (u'ab6c9588-d957-4be3-9862-d2596db463d9',)",
"code": 358}}
2017-01-19 12:13:51,620 ERROR
[org.ovirt.engine.core.vdsbroker.irsbroker.AttachStorageDomainVDSCommand]
(default task-27) [6a6480a4] Failed in 'AttachStorageDomainVDS' method
2017-01-19 12:13:51,624 ERROR
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(default task-27) [6a6480a4] Correlation ID: null, Call Stack: null,
Custom Event ID: -1, Message: VDSM command failed: Storage domain does
not exist: (u'ab6c9588-d957-4be3-9862-d2596db463d9',)
2017-01-19 12:13:51,624 ERROR
[org.ovirt.engine.core.vdsbroker.irsbroker.AttachStorageDomainVDSCommand]
(default task-27) [6a6480a4] Command 'AttachStorageDomainVDSCommand(
AttachStorageDomainVDSCommandParameters:{runAsync='true',
storagePoolId='fc33da6d-5da7-4005-a693-f170437c176c',
ignoreFailoverLimit='false',
storageDomainId='ab6c9588-d957-4be3-9862-d2596db463d9'})' execution
failed: IRSGenericException: IRSErrorException: Failed to
AttachStorageDomainVDS, error = Storage domain does not exist:
(u'ab6c9588-d957-4be3-9862-d2596db463d9',), code = 358
2017-01-19 12:13:51,632 DEBUG
[org.ovirt.engine.core.utils.timer.FixedDelayJobListener]
(DefaultQuartzScheduler10) [] Rescheduling
DEFAULT.org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData.hostsStorageConnectionsAndPoolMetadataRefresh#-9223372036854775793
as there is no unfired trigger.
2017-01-19 12:13:51,624 DEBUG
[org.ovirt.engine.core.vdsbroker.irsbroker.AttachStorageDomainVDSCommand]
(default task-27) [6a6480a4] Exception:
org.ovirt.engine.core.vdsbroker.irsbroker.IrsOperationFailedNoFailoverException:
IRSGenericException: IRSErrorException: Failed to
AttachStorageDomainVDS, error = Storage domain does not exist:
(u'ab6c9588-d957-4be3-9862-d2596db463d9',), code = 358
at 
org.ovirt.engine.core.vdsbroker.irsbroker.AttachStorageDomainVDSCommand.createDefaultConcreteException(AttachStorageDomainVDSCommand.java:25)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase.createException(BrokerCommandBase.java:222)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.vdsbroker.BrokerCommandBase.proceedProxyReturnValue(BrokerCommandBase.java:192)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.irsbroker.AttachStorageDomainVDSCommand.executeIrsBrokerCommand(AttachStorageDomainVDSCommand.java:18)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand.lambda$executeVDSCommand$0(IrsBrokerCommand.java:161)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData.runInControlledConcurrency(IrsProxyData.java:248)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand.executeVDSCommand(IrsBrokerCommand.java:158)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.vdsbroker.VDSCommandBase.executeCommand(VDSCommandBase.java:73)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.dal.VdcCommandBase.execute(VdcCommandBase.java:33)
[dal.jar:]
at 
org.ovirt.engine.core.vdsbroker.ResourceManager.runVdsCommand(ResourceManager.java:451)
[vdsbroker.jar:]
at 
org.ovirt.engine.core.bll.VDSBrokerFrontendImpl.runVdsCommand(VDSBrokerFrontendImpl.java:33)
[bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.runVdsCommand(CommandBase.java:2171)
[bll.jar:]
at 
org.ovirt.engine.core.bll.storage.StorageHandlingCommandBase.runVdsCommand(StorageHandlingCommandBase.java:657)
[bll.jar:]
at 
org.ovirt.engine.core.bll.storage.domain.AttachStorageDomainToPoolCommand.executeCommand(AttachStorageDomainToPoolCommand.java:152)
[bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1305)
[bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1447)
[bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:2075)
[bll.jar:]
at 
org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:166)
[utils.jar:]
at 
org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:105)
[utils.jar:]
at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1490)
[bll.jar:]
at 
org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:398)
[bll.jar:]
at org.ovirt.engine.core.bll.Backend.runAction(Backend.java:493) 
[bll.jar:]
at 

[ovirt-devel] test-repo_ovirt_experimental_3.6/4457/ job fails

2016-12-11 Thread Shlomo Ben David
Hi,

The [1] job fails with the following errors:

12:34:20 Error while running thread
12:34:20 Traceback (most recent call last):
12:34:20   File "/usr/lib/python2.7/site-packages/lago/utils.py", line
55, in _ret_via_queue
12:34:20 queue.put({'return': func()})
12:34:20   File "/usr/lib/python2.7/site-packages/lago/prefix.py",
line 1108, in _collect_artifacts
12:34:20 vm.collect_artifacts(path)
12:34:20   File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py",
line 461, in collect_artifacts
12:34:20 ) for guest_path in self._artifact_paths()
12:34:20   File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py",
line 307, in extract_paths
12:34:20 return self.provider.extract_paths(paths, *args, **kwargs)
12:34:20   File "/usr/lib/python2.7/site-packages/lago/vm.py", line
196, in extract_paths
12:34:20 self._extract_paths_live(paths=paths)
12:34:20   File "/usr/lib/python2.7/site-packages/lago/vm.py", line
417, in _extract_paths_live
12:34:20 self._extract_paths_dead(paths=paths)
12:34:20   File "/usr/lib/python2.7/site-packages/lago/vm.py", line
432, in _extract_paths_dead
12:34:20 gfs_cli.launch()
12:34:20   File "/usr/lib64/python2.7/site-packages/guestfs.py", line
4731, in launch
12:34:20 r = libguestfsmod.launch (self._o)
12:34:20 RuntimeError: guestfs_launch failed.
12:34:20 This usually means the libguestfs appliance failed to start or crashed.
12:34:20 See http://libguestfs.org/guestfs-faq.1.html#debugging-libguestfs

12:34:20 or run 'libguestfs-test-tool' and post the *complete* output into a
bug report or message to the libguestfs mailing list.
12:34:32   # [Thread-1] lago-basic-suite-3-6-storage: Success (in 0:00:12)
12:34:32   # [Thread-3] lago-basic-suite-3-6-host1: Success (in 0:00:13)
12:34:32   # [Thread-2] lago-basic-suite-3-6-engine: Success (in 0:00:13)
12:34:32 @ Collect artifacts: ERROR (in 0:00:13)
12:34:32 Error occured, aborting
12:34:32 Traceback (most recent call last):
12:34:32   File "/usr/lib/python2.7/site-packages/ovirtlago/cmd.py",
line 264, in do_run
12:34:32 self.cli_plugins[args.ovirtverb].do_run(args)
12:34:32   File
"/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 184, in
do_run
12:34:32 self._do_run(**vars(args))
12:34:32   File "/usr/lib/python2.7/site-packages/lago/utils.py", line
489, in wrapper
12:34:32 return func(*args, **kwargs)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/utils.py", line
500, in wrapper
12:34:32 return func(*args, prefix=prefix, **kwargs)
12:34:32   File "/usr/lib/python2.7/site-packages/ovirtlago/cmd.py",
line 230, in do_ovirt_collect
12:34:32 prefix.collect_artifacts(output)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/log_utils.py",
line 621, in wrapper
12:34:32 return func(*args, **kwargs)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/prefix.py",
line 1112, in collect_artifacts
12:34:32 self.virt_env.get_vms().values(),
12:34:32   File "/usr/lib/python2.7/site-packages/lago/utils.py", line
97, in invoke_in_parallel
12:34:32 vt.join_all()
12:34:32   File "/usr/lib/python2.7/site-packages/lago/utils.py", line
55, in _ret_via_queue
12:34:32 queue.put({'return': func()})
12:34:32   File "/usr/lib/python2.7/site-packages/lago/prefix.py",
line 1108, in _collect_artifacts
12:34:32 vm.collect_artifacts(path)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py",
line 461, in collect_artifacts
12:34:32 ) for guest_path in self._artifact_paths()
12:34:32   File "/usr/lib/python2.7/site-packages/lago/plugins/vm.py",
line 307, in extract_paths
12:34:32 return self.provider.extract_paths(paths, *args, **kwargs)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/vm.py", line
196, in extract_paths
12:34:32 self._extract_paths_live(paths=paths)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/vm.py", line
417, in _extract_paths_live
12:34:32 self._extract_paths_dead(paths=paths)
12:34:32   File "/usr/lib/python2.7/site-packages/lago/vm.py", line
432, in _extract_paths_dead
12:34:32 gfs_cli.launch()
12:34:32   File "/usr/lib64/python2.7/site-packages/guestfs.py", line
4731, in launch
12:34:32 r = libguestfsmod.launch (self._o)
12:34:32 RuntimeError: guestfs_launch failed.
12:34:32 This usually means the libguestfs appliance failed to start or crashed.
12:34:32 See http://libguestfs.org/guestfs-faq.1.html#debugging-libguestfs

12:34:32 or run 'libguestfs-test-tool' and post the *complete* output into a
bug report or message to the libguestfs mailing list.


[1] -
http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_3.6/4457/consoleFull

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, 

[ovirt-devel] set_modified_hook (update)

2016-12-08 Thread Shlomo Ben David
Hi All,

This email is an update about the set_modified hook.

hook: set_modified
hook goal: when a patch is merged, change the bug status from POST to MODIFIED.


   - The current version of the hook doesn't check that all previous
   patches (on the external tracker) were closed before setting the bug status
   to MODIFIED.


   - I released patch [1], which fixes this (see the sketch below).
   - If you have any questions/issues about this or other hooks, please feel
   free to send an email to me / infra.ovirt.org.

[1] - https://gerrit.ovirt.org/#/c/67512/1

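A rough sketch of the updated behaviour (illustrative only; see patch [1] for
the real implementation, and `bug` here is assumed to expose its status and
its external tracker entries):

def maybe_set_modified(bug):
    # Move POST -> MODIFIED only once every patch tracked on the bug's
    # external tracker has been merged or abandoned.
    if bug.status != 'POST':
        return
    if all(tracker.status in ('MERGED', 'ABANDONED')
           for tracker in bug.external_trackers):
        bug.status = 'MODIFIED'
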
Thanks for your cooperation,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] ovirt-node_ovirt-3.6_create-iso-el7_merged job failed

2016-12-07 Thread Shlomo Ben David
Hi,

The [1] job failed with the following error:

*10:21:33 Package qemu-kvm-tools is obsoleted by qemu-kvm-tools-ev,
trying to install 10:qemu-kvm-tools-ev-2.6.0-27.1.el7.x86_64 instead
10:21:48 Error creating Live CD : Failed to build transaction :
10:qemu-kvm-ev-2.6.0-27.1.el7.x86_64 requires usbredir >= 0.7.1
10:21:48 10:qemu-kvm-ev-2.6.0-27.1.el7.x86_64 requires seavgabios-bin >= 1.9.1-4
10:21:48 10:qemu-kvm-ev-2.6.0-27.1.el7.x86_64 requires ipxe-roms-qemu
>= 20160127-4
10:21:48 10:qemu-kvm-ev-2.6.0-27.1.el7.x86_64 requires libusbx >= 1.0.19
10:21:48 ERROR: ISO build failed.*


[1] -
http://jenkins.ovirt.org/job/ovirt-node_ovirt-3.6_create-iso-el7_merged/171/console

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] ovirt_4.0_he-system-tests job failed

2016-12-07 Thread Shlomo Ben David
Hi,

The [1] job failed with the following error:

2016-12-07 02:40:43,805::ssh.py::ssh::96::lago.ssh::DEBUG::Command
8b25fff2 on lago-he-basic-suite-4-0-host0  errors:
 Error: Package:
ovirt-hosted-engine-setup-2.0.4.2-0.0.master.20161201083322.gita375924.el7.centos.noarch
(alocalsync)
   Requires: ovirt-engine-sdk-python >= 3.6.3.0


[1] - http://jenkins.ovirt.org/job/ovirt_4.0_he-system-tests/584/console


Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] test-repo_ovirt_experimental_master job fails

2016-12-04 Thread Shlomo Ben David
Hi All,

Job [1] has been failing since yesterday (Dec 3, 2016, 6:29 PM) with the
following error:

15:35:52 @ Run test: 004_basic_sanity.py: ERROR (in 0:04:44)
15:35:52 Error occured, aborting
15:35:52 Traceback (most recent call last):
15:35:52   File "/usr/lib/python2.7/site-packages/ovirtlago/cmd.py", line 264, in do_run
15:35:52     self.cli_plugins[args.ovirtverb].do_run(args)
15:35:52   File "/usr/lib/python2.7/site-packages/lago/plugins/cli.py", line 184, in do_run
15:35:52     self._do_run(**vars(args))
15:35:52   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 489, in wrapper
15:35:52     return func(*args, **kwargs)
15:35:52   File "/usr/lib/python2.7/site-packages/lago/utils.py", line 500, in wrapper
15:35:52     return func(*args, prefix=prefix, **kwargs)
15:35:52   File "/usr/lib/python2.7/site-packages/ovirtlago/cmd.py", line 105, in do_ovirt_runtest
15:35:52     raise RuntimeError('Some tests failed')
15:35:52 RuntimeError: Some tests failed



From the test results:



Traceback (most recent call last):
  File "/usr/lib64/python2.7/unittest/case.py", line 369, in run
testMethod()
  File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line
138, in wrapped_test
return test()
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line
59, in wrapper
return func(get_test_prefix(), *args, **kwargs)
  File 
"/home/jenkins/workspace/test-repo_ovirt_experimental_master/ovirt-system-tests/basic-suite-master/test-scenarios/004_basic_sanity.py",
line 339, in vm_run
lambda: api.vms.get(VM0_NAME).status.state == 'up',
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line
231, in assert_true_within_short
allowed_exceptions=allowed_exceptions,
  File "/usr/lib/python2.7/site-packages/ovirtlago/testlib.py", line
223, in assert_true_within
raise AssertionError('Timed out after %s seconds' % timeout)
AssertionError: Timed out after 180 seconds


Could you please advise who should handle this issue?


[1] - http://jenkins.ovirt.org/job/test-repo_ovirt_experimental_master/


Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Updating email configuration

2016-10-31 Thread Shlomo Ben David
Hi All,

Due to some restrictions on our email server, we need to update its
configuration to increase security.
To apply the new configuration, I will reboot the server today, 31/10/2016, at
20:00 IST.

*Estimated downtime: ~5 minutes*

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] [vdsm] branch ovirt-4.0.5 created

2016-10-30 Thread Shlomo Ben David
Hi Eyal,

Most of the hooks have been updated not to use the STABLE_BRANCHES parameter,
but a few hooks still use it, such as the
'patchset-created.warn_if_not_merged_to_previous_branch' hook.

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1

On Sun, Oct 30, 2016 at 5:25 PM, Eyal Edri  wrote:

>
>
> On Mon, Oct 10, 2016 at 6:24 PM, Francesco Romani 
> wrote:
>
>> - Original Message -
>> > From: "Dan Kenigsberg" 
>> > To: "Francesco Romani" 
>> > Cc: "Nir Soffer" , devel@ovirt.org
>> > Sent: Monday, October 10, 2016 5:11:26 PM
>> > Subject: Re: [vdsm] branch ovirt-4.0.5 created
>> >
>> > On Mon, Oct 10, 2016 at 10:30:49AM -0400, Francesco Romani wrote:
>> > > Hi everyone,
>> > >
>> > > this time I choose to create the ovirt-4.0.5 branch.
>> > > I already merged some patches for 4.0.6.
>> > >
>> > > Unfortunately I branched a bit too early (from last tag :))
>> > >
>> > > So patches
>> > > https://gerrit.ovirt.org/#/c/65303/1
>> > > https://gerrit.ovirt.org/#/c/65304/1
>> > > https://gerrit.ovirt.org/#/c/65305/1
>> > >
>> > > Should be trivially mergeable - the only thing changed from ovirt-4.0
>> > > counterpart
>> > > is the change-id. Please have a quick look just to doublecheck.
>> >
>> > Change-Id should be the same for a master patch and all of its backport.
>> > It seems that it was NOT changed, at least for
>> > https://gerrit.ovirt.org/#/q/I5cea6ec71c913d74d95317ff7318259d64b40969
>> > which is a GOOD thing.
>>
>> Yes, sorry, indeed it is (and indeed it should not change).
>>
>> > I think we want to enable CI on the new 4.0.5 branch, right? Otherwise
>> > we'd need to fake the CI+1 flag until 4.0.5 is shipped.
>>
>> We should, but it is not urgently needed - just regular priority.
>> For the aforementioned first three patches especially I'm just overly
>> cautious.
>>
>>
> Was CI enabled for 4.0.5 branch?
> Adding infra as well.
>
> Shlomi, did we already enable the regex for stable branches, so we don't
> need to manually update conf files?
>
>
>
>> --
>> Francesco Romani
>> Red Hat Engineering Virtualization R & D
>> Phone: 8261328
>> IRC: fromani
>> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>>
>>
>
>
> --
> Eyal Edri
> Associate Manager
> RHV DevOps
> EMEA ENG Virtualization R
> Red Hat Israel
>
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Fwd: Updating permissions on gerrit.ovirt.org

2016-09-25 Thread Shlomo Ben David
Hi All,

Today I will perform some permission changes on the ovirt-engine project.
If you have any permission issues with the ovirt-engine project, please let
me or in...@ovirt.org know and we'll handle it ASAP.

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Updating permissions on gerrit.ovirt.org

2016-09-20 Thread Shlomo Ben David
Hi All,

In the upcoming days I'm going to update permissions on the gerrit.ovirt.org
server.
The changes shouldn't affect the current state, but if you encounter any
permission issues or other issues related to the gerrit.ovirt.org server,
please let me or in...@ovirt.org know and we'll handle it ASAP.

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] I have this issue when trying to connect to gerrit

2016-09-06 Thread Shlomo Ben David
Hi Maokexu,

It seems that there is some iptables/firewall issue on your machine.
Please make sure that port 29418 (TCP) is open.
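
If it helps, here is a minimal connectivity check (just a sketch using the
Python standard library, not part of our tooling) that tries to open the
Gerrit SSH port and reports the result:

#!/usr/bin/env python
# Minimal connectivity check for Gerrit's SSH port (sketch only).
# Tries to open a TCP connection to gerrit.ovirt.org:29418 and reports
# whether it succeeded before the timeout expired.
import socket

HOST = 'gerrit.ovirt.org'
PORT = 29418      # Gerrit SSH port
TIMEOUT = 10      # seconds

try:
    sock = socket.create_connection((HOST, PORT), timeout=TIMEOUT)
except (socket.timeout, socket.error) as err:
    print('Cannot reach %s:%d - %s (check your local iptables/firewall)'
          % (HOST, PORT, err))
else:
    print('%s:%d is reachable' % (HOST, PORT))
    sock.close()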

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1

On Tue, Sep 6, 2016 at 10:52 AM, Eyal Edri  wrote:

> Adding infra.
>
> On Tue, Sep 6, 2016 at 10:01 AM, maok...@126.com  wrote:
>
>> hi,all:
>> I have this issue when trying to connect to gerrit:
>> [maokx@maokx ~]$ ssh -vv gerrit.ovirt.org
>> OpenSSH_6.6.1, OpenSSL 1.0.1e-fips 11 Feb 2013
>> debug1: Reading configuration data /home/maokx/.ssh/config
>> debug1: /home/maokx/.ssh/config line 1: Applying options for
>> gerrit.ovirt.org
>> debug1: Reading configuration data /etc/ssh/ssh_config
>> debug1: /etc/ssh/ssh_config line 56: Applying options for *
>> debug2: ssh_connect: needpriv 0
>> debug1: Connecting to gerrit.ovirt.org [107.22.212.69] port 29418.
>> debug1: connect to address 107.22.212.69 port 29418: Connection timed out
>> ssh: connect to host gerrit.ovirt.org port 29418: Connection timed out
>>
>> [root@maokx ~]# ping gerrit.ovirt.org
>> PING gerrit.ovirt.org (107.22.212.69) 56(84) bytes of data.
>> 64 bytes from gerrit.ovirt.org (107.22.212.69): icmp_seq=1 ttl=25 time=272 ms
>> 64 bytes from gerrit.ovirt.org (107.22.212.69): icmp_seq=2 ttl=25 time=285 ms
>> 64 bytes from gerrit.ovirt.org (107.22.212.69): icmp_seq=3 ttl=25 time=275 ms
>> 64 bytes from gerrit.ovirt.org (107.22.212.69): icmp_seq=4 ttl=25 time=270 ms
>> ^C
>> --- gerrit.ovirt.org ping statistics ---
>> 4 packets transmitted, 4 received, 0% packet loss, time 3006ms
>> rtt min/avg/max/mdev = 270.741/275.835/285.228/5.694 ms
>> [root@maokx ~]# telnet gerrit.ovirt.org 29418
>> Trying 107.22.212.69...
>> telnet: connect to address 107.22.212.69: Connection timed out
>> [root@maokx ~]#
>>
>> --
>> maok...@126.com
>>
>> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>
>
>
> --
> Eyal Edri
> Associate Manager
> RHV DevOps
> EMEA ENG Virtualization R&D
> Red Hat Israel
>
> phone: +972-9-7692018
> irc: eedri (on #tlv #rhev-dev #rhev-integ)
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] ovirt_master_system-tests job fail

2016-09-05 Thread Shlomo Ben David
Hi,

Job: ovirt_master_system-tests [1] is failing with the following error:

-------------------- >> end captured logging << --------------------

The failure appeared after job #472 on Sep 2, 2016 7:31 AM.

[1] -
http://jenkins.ovirt.org/view/oVirt%20system%20tests/job/ovirt_master_system-tests/

I need your assistance to solve this issue :)

Thanks in advance,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Update gerrit plugins (oauth, avatars-gravatar)

2016-09-04 Thread Shlomo Ben David
Hi all,

The update completed successfully.
The server is up and running with the new plugins.

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1

On Sun, Sep 4, 2016 at 7:41 PM, Shlomo Ben David <sbend...@redhat.com>
wrote:

> Hi,
>
> Today, 04/09/2016, at 23:00 I'm planning to update the gerrit-oauth plugin
> from v0.3 ==> v2.11.3
>
> In addition I will add the new avatars-gravatar plugin (v2.11).
>
> Update duration: ~10 min
> During the update, the gerrit.ovirt.org server won't be available.
>
> An update email will be sent when done.
>
> Best Regards,
>
> Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
> RHCSA | RHCE
> IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)
>
> OPEN SOURCE - 1 4 011 && 011 4 1
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] Update gerrit plugins (oauth, avatars-gravatar)

2016-09-04 Thread Shlomo Ben David
Hi,

Today, 04/09/2016, at 23:00 I'm planning to update the gerrit-oauth plugin
from v0.3 ==> v2.11.3

In addition I will add the new avatars-gravatar plugin (v2.11).

Update duration: ~10 min
During the update, the gerrit.ovirt.org server won't be available.

An update email will be sent when done.

Best Regards,

Shlomi Ben-David | DevOps Engineer | Red Hat ISRAEL
RHCSA | RHCE
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Alternatives to automatically move bugs to MODIFIED

2016-08-17 Thread Shlomo Ben David
Hi All,

I'm currently working to solve this issue.
I will update you when done :)
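
For reference, the check the bot has to perform is roughly the following.
This is only a sketch built on the public Gerrit ssh query interface and
python-bugzilla, not the actual hook code, and it assumes you are already
authenticated to Bugzilla (e.g. with an API key):

#!/usr/bin/env python
# Sketch only (not the actual hook): move a Bugzilla bug from POST to
# MODIFIED only when every Gerrit change that references it is merged.
# Assumes an existing Bugzilla session/API key; error handling omitted.
import json
import subprocess

import bugzilla   # python-bugzilla

GERRIT_QUERY = ['ssh', '-p', '29418', 'gerrit.ovirt.org',
                'gerrit', 'query', '--format=JSON']
BUGZILLA_URL = 'https://bugzilla.redhat.com'


def changes_for_bug(bug_id):
    """Return all Gerrit changes whose tracking id matches the bug."""
    out = subprocess.check_output(GERRIT_QUERY + ['tr:%s' % bug_id])
    rows = [json.loads(line) for line in out.splitlines() if line.strip()]
    # The last row emitted by 'gerrit query' is a stats record, not a change.
    return [row for row in rows if row.get('type') != 'stats']


def all_merged(bug_id):
    changes = changes_for_bug(bug_id)
    return bool(changes) and all(c.get('status') == 'MERGED' for c in changes)


def maybe_move_to_modified(bug_id):
    """Only move the bug when every linked patch is already merged."""
    if not all_merged(bug_id):
        return False
    bz = bugzilla.Bugzilla(BUGZILLA_URL)
    bz.update_bugs([bug_id], bz.build_update(status='MODIFIED'))
    return True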

Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
Red Hat Certified System Administrator | Red Hat Certified Engineer
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1

On Wed, Aug 17, 2016 at 10:10 AM, Yedidyah Bar David 
wrote:

> Hi all,
>
> We currently have a bot that automatically moves bugs from POST to MODIFIED
> if all linked patches on gerrit are merged.
>
> It happened to me personally several times that this was a wrong thing to
> do,
> either because a new patch was still needed but not pushed yet, or because
> an existing patch should have been back-ported to another branch and wasn't
> yet. Since I usually pay more attention to my bug in POST, I sometimes
> missed
> this and handled the missing patches (backports, usually) later than I
> could
> if left on POST.
>
> I have a feeling I am not the only one. So I suggest to stop doing this.
>
> I can think of several alternatives:
>
> 1. Do nothing. I think that's reasonable - I think most people pay more
> attention to POST bugs anyway.
>
> 2. Set needinfo on bug owner.
>
> 3. Send some alert email to relevant people (bug owner, existing patches
> owners,
> perhaps others - e.g. reviewers of existing patches, perhaps those
> that actually reviewed, etc.). Need to think how to make it not too
> annoying for others but
> still effective also if owner is on long PTO or something like that. New
> flag
> doesn't have to be very specific - can be called something like 'attention
> needed' or something like that.
>
> 4. Add a new flag for that and set it. This will allow easier
> filtering/reporting.
>
> What do you think?
> --
> Didi
> ___
> Devel mailing list
> Devel@ovirt.org
> http://lists.ovirt.org/mailman/listinfo/devel
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] gerrit.ovirt.org migration

2016-07-10 Thread Shlomo Ben David
Hi All,

The gerrit.ovirt.org migration process completed successfully.
The server is up and running.

Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1

On Sun, Jul 10, 2016 at 5:35 PM, Shlomo Ben David <sbend...@redhat.com>
wrote:

> Hi All,
>
> Today we are planning to migrate the gerrit.ovirt.org server to a new,
> bigger instance on Amazon to improve performance.
>
>- The migration will start at 18:00 IDT
>    - Estimated migration end time: 19:00 IDT.
>
>
>- During the migration the server will not be available (you won't be
>able to send patches or review code)
>
>
>    - The server will be down for about an hour, but if we're able to
>    restore it earlier we'll let you know.
>
>
>- An email will be sent at the end of the migration process.
>
>
> Best Regards,
>
> Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
> Phone: +972-54-8008858
> IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)
>
> OPEN SOURCE - 1 4 011 && 011 4 1
>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] gerrit.ovirt.org migration

2016-07-10 Thread Shlomo Ben David
Hi All,

Today we are planning to migrate the gerrit.ovirt.org server to a new, bigger
instance on Amazon to improve performance.

   - The migration will start at 18:00 IDT
   - Estimated migration end time: 19:00 IDT.


   - During the migration the server will not be available (you won't be
   able to send patches or review code)


   - The server will be down for about an hour, but if we're able to
   restore it earlier we'll let you know.


   - An email will be sent at the end of the migration process.


Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
Phone: +972-54-8008858
IRC: shlomibendavid (on #rhev-integ, #rhev-dev, #rhev-ci)

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

Re: [ovirt-devel] Undelivered mail warnings from Gerrit

2016-06-15 Thread Shlomo Ben David
Hi,

I've replaced the email address for vdsm patches
from: vdsm-patc...@fedorahosted.org
to: vdsm-patc...@lists.fedorahosted.org

Best Regards,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
Phone: +972-54-8008858
IRC: shlomibendavid

OPEN SOURCE - 1 4 011 && 011 4 1

On Wed, Jun 15, 2016 at 12:11 PM, Eyal Edri  wrote:

> We found the problem.
> Fedora changed their domain name from fedorahosted.org to
> lists.fedorahosted.org.
>
> We are working now to change it in gerrit db.
> On Jun 15, 2016 12:02 PM, "Ramesh Nachimuthu"  wrote:
>
>>
>>
>>
>>
>> - Original Message -
>> > From: "Adam Litke" 
>> > To: "Tomáš Golembiovský" 
>> > Cc: devel@ovirt.org
>> > Sent: Wednesday, June 15, 2016 1:05:13 AM
>> > Subject: Re: [ovirt-devel] Undelivered mail warnings from Gerrit
>> >
>> > On 02/06/16 11:39 +0200, Tomáš Golembiovský wrote:
>> > >Hi,
>> > >
>> > >for the last two weeks I've been getting lots of warnings about undelivered
>> > >mail from Gerrit. The important thing in the message being:
>> > >
>> > >The original message was received at Wed, 1 Jun 2016 14:57:54 -0400
>> > >from gerrit.ovirt.org [127.0.0.1]
>> > >
>> > >- Transcript of session follows -
>> > >... Deferred: Connection timed out with hosted-lists01.fedoraproject.org.
>> > >Warning: message still undelivered after 4 hours
>> > >Will keep trying until message is 5 days old
>> > >
>> > >
>> > >Anyone else experiencing the same problem? Is this being worked on?
>> >
>> > It's affecting me quite severely also.
>> >
>>
>> I am also facing this issue many times a day.
>>
>> Regards,
>> Ramesh
>> > --
>> > Adam Litke
>> > ___
>> > Devel mailing list
>> > Devel@ovirt.org
>> > http://lists.ovirt.org/mailman/listinfo/devel
>> ___
>> Devel mailing list
>> Devel@ovirt.org
>> http://lists.ovirt.org/mailman/listinfo/devel
>>
>>
>>
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel

[ovirt-devel] gerrit.ovirt.org projects

2016-05-17 Thread Shlomo Ben David
Hi,

I have a question about the projects on gerrit.ovirt.org.
While cleaning up gerrit.ovirt.org and updating the replication process to
GitHub, I've noticed that the projects below haven't been updated in more
than ~6 months (a rough sketch of how such a staleness check can be done is
included after the list below).

Do you know which of the projects below are still relevant for replication
(sync to GitHub)?

ovirt-register
test
chrooter
gluster-nagios-monitoring
ovirt-container-node
ovirt-container-engine
ovirt-engine-sdk-js
vdsm-arch-dependencies
ovirt-node-dbus-backend
samples-portals
jasperreports-server-rpm
ovirt-node-tests
ovirt-engine-sdk-tests
Node
ovirt-tools-common-python
mediawiki-example
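
As mentioned above, here is a rough sketch of how such a staleness check can
be done with the stock Gerrit REST API over HTTPS. This is not the actual
cleanup script, and project names containing special characters would need
URL-encoding:

#!/usr/bin/env python
# Rough sketch (not the actual cleanup script): list the Gerrit projects
# and, for each one, fetch its most recently updated change to see when
# the project was last touched.
import json
import requests

GERRIT = 'https://gerrit.ovirt.org'
XSSI_PREFIX = ")]}'"          # Gerrit prepends this to every JSON response


def gerrit_get(path):
    """GET a Gerrit REST endpoint and strip the XSSI protection prefix."""
    resp = requests.get(GERRIT + path)
    resp.raise_for_status()
    body = resp.text
    if body.startswith(XSSI_PREFIX):
        body = body[len(XSSI_PREFIX):]
    return json.loads(body)


def last_update(project):
    """Return the 'updated' timestamp of the newest change, or None."""
    changes = gerrit_get('/changes/?q=project:%s&n=1' % project)
    return changes[0]['updated'] if changes else None


if __name__ == '__main__':
    for name in sorted(gerrit_get('/projects/')):
        print('%-40s %s' % (name, last_update(name) or 'no changes'))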


Thanks in advance,

Shlomi Ben-David | Software Engineer | Red Hat ISRAEL
Phone: +972-54-8008858
IRC: sbendavi

OPEN SOURCE - 1 4 011 && 011 4 1
___
Devel mailing list
Devel@ovirt.org
http://lists.ovirt.org/mailman/listinfo/devel