See <http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/1/>

------------------------------------------
[...truncated 5712 lines...]
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (default) @ cloud-developer ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> exec-maven-plugin:1.2.1:java (create-schema-simulator) @ cloud-developer >>>
[INFO] 
[INFO] <<< exec-maven-plugin:1.2.1:java (create-schema-simulator) @ cloud-developer <<<
[INFO] 
[INFO] --- exec-maven-plugin:1.2.1:java (create-schema-simulator) @ cloud-developer ---
log4j:WARN No appenders could be found for logger (org.springframework.core.env.StandardEnvironment).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
========> WARNING: Provided file does not exist: http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/developer/../utils/conf/db.properties.override
========> Initializing database=simulator with host=localhost port=3306 username=cloud password=cloud
============> Running query: drop database if exists `simulator`
============> Running query: create database `simulator`
============> Running query: GRANT ALL ON simulator.* to 'cloud'@`localhost` identified by 'cloud'
============> Running query: GRANT ALL ON simulator.* to 'cloud'@`%` identified by 'cloud'
========> Processing SQL file at http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/developer/target/db/create-schema-simulator.sql
========> Processing SQL file at http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/developer/target/db/templates.simulator.sql
========> Processing SQL file at http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/developer/target/db/hypervisor_capabilities.simulator.sql
========> Processing upgrade: com.cloud.upgrade.DatabaseUpgradeChecker
[INFO] 
[INFO] --- maven-site-plugin:3.1:attach-descriptor (attach-descriptor) @ cloud-developer ---
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ cloud-developer ---
[INFO] Installing http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/developer/pom.xml to /var/lib/jenkins/.m2/repository/org/apache/cloudstack/cloud-developer/4.3.2-SNAPSHOT/cloud-developer-4.3.2-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16.861s
[INFO] Finished at: Mon Nov 24 04:31:14 EST 2014
[INFO] Final Memory: 31M/152M
[INFO] ------------------------------------------------------------------------
[simulator-singlerun-4.3] $ /bin/bash -x /tmp/hudson1066162609611836507.sh
+ jps -l
+ grep -q Launcher
+ rm -f xunit.xml
+ echo ''
+ rm -rf /tmp/MarvinLogs
+ echo Check for initialization of the management server
Check for initialization of the management server
+ COUNTER=0
+ SERVER_PID=30211
+ '[' 0 -lt 44 ']'
+ grep -q 'Management server node 127.0.0.1 is up' jetty-output.out
+ mvn -P systemvm,simulator -pl :cloud-client-ui jetty:run
+ sleep 5
+ COUNTER=1
[...identical grep/sleep iterations elided for COUNTER=1 through COUNTER=25...]
+ '[' 26 -lt 44 ']'
+ grep -q 'Management server node 127.0.0.1 is up' jetty-output.out
+ break
+ grep -q 'Management server node 127.0.0.1 is up' jetty-output.out
+ echo Started OK pid 30211
Started OK pid 30211
+ sleep 20
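The shell trace above polls jetty-output.out for the "is up" marker every 5 seconds, for at most 44 attempts (about 220 s), and here succeeds on attempt 26. A minimal Python sketch of the same readiness poll, assuming only what the trace shows (the function name `wait_for_server` is illustrative, not part of the actual job script):

```python
# Sketch of the readiness poll performed by the Jenkins job script:
# scan a log file for a marker string every `delay` seconds, giving up
# after `attempts` tries, mirroring the COUNTER/grep/sleep loop above.
import time

def wait_for_server(logfile, marker="Management server node 127.0.0.1 is up",
                    attempts=44, delay=5):
    """Return True once `marker` appears in `logfile`, False on timeout."""
    for _ in range(attempts):
        try:
            with open(logfile) as f:
                if marker in f.read():
                    return True
        except IOError:
            pass  # the log file may not exist yet while jetty starts
        time.sleep(delay)
    return False
```

Rereading the whole file each attempt is wasteful for large logs but matches what repeated `grep -q` does in the trace.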
+ export PYTHONPATH=http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/tools/marvin
+ PYTHONPATH=http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/tools/marvin
+ python2.6 tools/marvin/marvin/deployDataCenter.py -i setup/dev/advanced.cfg
/usr/lib/python2.6/site-packages/pycrypto-2.6-py2.6-linux-x86_64.egg/Crypto/Util/number.py:57: PowmInsecureWarning: Not using mpz_powm_sec.  You should rebuild using libgmp >= 5 to avoid timing attack vulnerability.
  _warn("Not using mpz_powm_sec.  You should rebuild using libgmp >= 5 to avoid timing attack vulnerability.", PowmInsecureWarning)
+ sleep 60
+ /usr/local/bin/nosetests-2.7 -v --with-marvin --marvin-config=setup/dev/advanced.cfg --with-xunit --xunit-file=xunit.xml -a tags=advanced,required_hardware=false -w test/integration/smoke

 Exception Occurred Under __deployDC : Execute cmd: createzone failed, due to: errorCode: 431, errorText:A zone with that name already exists. Please specify a unique zone name.
Traceback (most recent call last):
  File "/usr/local/bin/nosetests-2.7", line 9, in <module>
    load_entry_point('nose==1.3.3', 'console_scripts', 'nosetests-2.7')()
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/core.py", line 121, in __init__
    **extra_args)
  File "/usr/local/lib/python2.7/unittest/main.py", line 95, in __init__
    self.runTests()
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/core.py", line 207, in runTests
    result = self.testRunner.run(self.test)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/core.py", line 62, in run
    test(result)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/suite.py", line 176, in __call__
    return self.run(*arg, **kw)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/suite.py", line 223, in run
    test(orig)
[...the two suite.py frames above (lines 176 and 223) repeat six more times as nested test suites unwind...]
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/case.py", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/case.py", line 138, in run
    result.addError(self, err)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/proxy.py", line 124, in addError
    plugin_handled = plugins.handleError(self.test, err)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/plugins/manager.py", line 99, in __call__
    return self.call(*arg, **kw)
  File "/usr/local/lib/python2.7/site-packages/nose-1.3.3-py2.7.egg/nose/plugins/manager.py", line 167, in simple
    result = meth(*arg, **kw)
  File "http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/tools/marvin/marvin/marvinPlugin.py", line 155, in handleError
    self.tcRunLogger.fatal("%s: %s: %s" %
AttributeError: 'NoneType' object has no attribute 'fatal'
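The AttributeError masks the real failure: marvinPlugin's `tcRunLogger` is still `None` because datacenter deployment failed before the test-run logger was set up, so the `handleError` hook crashes while trying to report the error. A defensive sketch of the idea (illustrative only; `safe_fatal` is a hypothetical helper, not the actual Marvin fix):

```python
# Sketch: fall back to a default logger when the test-case logger was
# never initialized, so error reporting itself cannot raise.
import logging

def safe_fatal(tc_run_logger, message):
    """Log a fatal message, using a stdlib logger if `tc_run_logger`
    is None (as happens when deployment fails before test setup)."""
    logger = tc_run_logger if tc_run_logger is not None else logging.getLogger("marvin")
    logger.fatal(message)  # Logger.fatal is an alias of Logger.critical
```

With such a guard the build would still fail, but on the underlying "zone already exists" deployment error rather than on the plugin's own crash.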
+ mvn -P systemvm,simulator -pl :cloud-client-ui jetty:stop
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache CloudStack Client UI 4.3.2-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-jetty-plugin:6.1.26:stop (default-cli) @ cloud-client-ui ---
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.417s
[INFO] Finished at: Mon Nov 24 04:37:45 EST 2014
[INFO] Final Memory: 19M/115M
[INFO] ------------------------------------------------------------------------
+ sleep 10
+ kill -KILL 30211
/tmp/hudson1066162609611836507.sh: line 44: kill: (30211) - No such process
[locks-and-latches] Releasing all the locks
[locks-and-latches] All the locks released
[xUnit] [INFO] - Starting to record.
[xUnit] [INFO] - Processing JUnit
[xUnit] [INFO] - [JUnit] - 1 test report file(s) were found with the pattern 'xunit.xml' relative to 'http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/' for the testing framework 'JUnit'.
[xUnit] [ERROR] - The result file 'http://jenkins.buildacloud.org/job/simulator-singlerun-4.3/ws/xunit.xml' for the metric 'JUnit' is not valid. The result file has been skipped.
[xUnit] [INFO] - Failing BUILD because 'set build failed if errors' option is activated.
[xUnit] [INFO] - There are errors when processing test results.
[xUnit] [INFO] - Skipping tests recording.
[xUnit] [INFO] - Stop build.
Build step 'Publish xUnit test result report' changed build result to FAILURE