Giuseppe Lavagetto has submitted this change and it was merged. ( https://gerrit.wikimedia.org/r/352595 )

Change subject: Fix data structure for jobs
......................................................................


Fix data structure for jobs

Endpoints are not unique keys, so track the spawned checks in a list of dicts
instead of a dict keyed by endpoint (see the sketch below the file summary).

Change-Id: I413d67e1fcdf63a9e1b81510d674ea3bf6e3f2e3
---
M debian/changelog
M servicechecker/swagger.py
2 files changed, 10 insertions(+), 8 deletions(-)
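
The point about non-unique endpoints is easiest to see with a tiny,
hypothetical example (the endpoint tuples below are invented for
illustration and are not servicechecker data): a dict keyed by endpoint
silently collapses duplicate endpoints, while the list of dicts this change
switches to keeps every check.

    # Hypothetical endpoint tuples: the same path checked two ways.
    endpoints = [('/v1/page', {'title': 'page (GET)'}),
                 ('/v1/page', {'title': 'page (POST)'})]

    as_dict = {ep: data for ep, data in endpoints}
    as_list = [{'ep': ep, 'data': data} for ep, data in endpoints]

    print(len(as_dict))  # 1 -- the second '/v1/page' check is lost
    print(len(as_list))  # 2 -- both checks are preserved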

Approvals:
  Giuseppe Lavagetto: Looks good to me, approved
  jenkins-bot: Verified
  Volans: Looks good to me, but someone else must approve



diff --git a/debian/changelog b/debian/changelog
index d81e08b..9cbdf07 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,4 +1,4 @@
-service-checker (0.1.0) UNRELEASED; urgency=medium
+service-checker (0.1.0) jessie-wikimedia; urgency=medium
 
   * Asynchronous url fetching using gevent
   * Added ability to report timing data to statsd
diff --git a/servicechecker/swagger.py b/servicechecker/swagger.py
index e9312c0..9cb0d6c 100755
--- a/servicechecker/swagger.py
+++ b/servicechecker/swagger.py
@@ -142,12 +142,14 @@
         status = 'OK'
         idx = self.nagios_codes.index(status)
         # Spawn the downloaders
-        checks = {ep: {'data': data, 'job': gevent.spawn(self._check_endpoint, ep, data)}
-                  for ep, data in self.get_endpoints()}
-        gevent.joinall([v['job'] for v in checks.values()], self.nrpe_timeout - 1)
+        checks = [{'ep': ep, 'data': data, 'job': gevent.spawn(self._check_endpoint, ep, data)}
+                  for ep, data in self.get_endpoints()]
+        gevent.joinall([v['job'] for v in checks], self.nrpe_timeout - 2)
 
-        for endpoint, v in checks.items():
+        for v in checks:
+            endpoint = v['ep']
             data = v['data']
+            title = data.get('title', "test for {}".format(endpoint))
             job = v['job']
             # Endpoint fetching failed or timed out.
             if not job.successful():
@@ -158,7 +160,7 @@
                 else:
                     res.append(
                         '{ep} ({title}) timed out before a response was received'.format(
-                            ep=endpoint, title=data.get('title', 'no title')
+                            ep=endpoint, title=title,
                         )
                     )
             else:
@@ -167,7 +169,7 @@
                 if ep_status != 'OK':
                     res.append(
                         "{ep} ({title}) is {status}: {message}".format(
-                            ep=endpoint, title=data.get('title', 'no title'), status=ep_status,
+                            ep=endpoint, title=title, status=ep_status,
                             message=msg
                         )
                     )
@@ -244,7 +246,7 @@
         """
         try:
             url = self.tpl_url.realize(self.url_parameters)
-            label = url.replace(self.base_url, '', 1).replace('/', '_')
+            label = url.replace(self.base_url, '', 1).replace('.', '_').replace('/', '_')
             with time_to_statsd(label):
                 r = fetch_url(
                     client,

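For readers outside the codebase, here is a self-contained sketch of the
pattern the patched loop now follows: spawn one greenlet per check, keep the
checks in a list of dicts, join them with headroom before the NRPE timeout,
and derive a statsd-safe metric label. The names check_endpoint, statsd_label
and run_checks are illustrative stand-ins rather than servicechecker's API,
and the endpoint data is made up.

    import gevent

    def check_endpoint(ep, data):
        # Stand-in for servicechecker's real _check_endpoint(); it would
        # fetch the URL and return a (status, message) pair.
        return 'OK', 'fetched {}'.format(ep)

    def statsd_label(url, base_url):
        # Mirrors the patched label logic: strip the base URL, then replace
        # '.' and '/' so statsd does not treat them as metric separators.
        return url.replace(base_url, '', 1).replace('.', '_').replace('/', '_')

    def run_checks(endpoints, nrpe_timeout=10):
        # One dict per check, so the same endpoint may appear more than once.
        checks = [{'ep': ep, 'data': data,
                   'job': gevent.spawn(check_endpoint, ep, data)}
                  for ep, data in endpoints]
        # Leave headroom before the NRPE timeout, as the patch does.
        gevent.joinall([c['job'] for c in checks], nrpe_timeout - 2)
        results = []
        for c in checks:
            title = c['data'].get('title', 'test for {}'.format(c['ep']))
            if not c['job'].successful():
                results.append('{} ({}) timed out or failed'.format(c['ep'], title))
            else:
                status, msg = c['job'].get()
                results.append('{} ({}) is {}: {}'.format(c['ep'], title, status, msg))
        return results

    if __name__ == '__main__':
        for line in run_checks([('/v1/page', {'title': 'page (GET)'}),
                                ('/v1/page', {})]):
            print(line)
        print(statsd_label('http://localhost/api/v1.2/page',
                           'http://localhost'))  # prints: _api_v1_2_page

The exact headroom (nrpe_timeout - 2 here, matching the patch) is an
assumption about how much time the reporting loop needs before NRPE itself
gives up.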
-- 
To view, visit https://gerrit.wikimedia.org/r/352595
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: merged
Gerrit-Change-Id: I413d67e1fcdf63a9e1b81510d674ea3bf6e3f2e3
Gerrit-PatchSet: 2
Gerrit-Project: operations/software/service-checker
Gerrit-Branch: master
Gerrit-Owner: Giuseppe Lavagetto <glavage...@wikimedia.org>
Gerrit-Reviewer: Giuseppe Lavagetto <glavage...@wikimedia.org>
Gerrit-Reviewer: Volans <rcocci...@wikimedia.org>
Gerrit-Reviewer: jenkins-bot <>
