Xqt has uploaded a new change for review.

  https://gerrit.wikimedia.org/r/129227

Change subject: (bug 64186) Decrease exhausting memory usage, speed up processing
......................................................................

(bug 64186) Decrease exhausting memory usage, speed up processing

update and synchronize from/with core

- delete the local list once we are done with it
- decrease processing time: list.remove() consumes a lot of time
  because removing an item requires a linear search through the
  sequence. It is faster to simply skip the parent category
  inside the loop.
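
A minimal standalone sketch of the two patterns (hypothetical data, not
taken from the patch): copying the set and calling list.remove() versus
skipping the parent while iterating.

```python
# Hypothetical illustration of the change described above: skipping an
# element inside the loop avoids the linear scan that list.remove()
# performs, and needs no throwaway copy of the set.
supercats = ['A', 'B', 'parent', 'C']
parent = 'parent'

# Old pattern: copy the set to a list, then remove the parent.
names_via_remove = list(supercats)
names_via_remove.remove(parent)  # linear search through the list

# New pattern: skip the parent while iterating; no copy, no removal.
names_via_skip = [cat for cat in supercats if cat != parent]

assert names_via_remove == names_via_skip == ['A', 'B', 'C']
```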

Change-Id: Ic7f8f3d759bf5e9d9895c38a01d9d128775eadd9
---
M scripts/category.py
1 file changed, 14 insertions(+), 20 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/pywikibot/core 
refs/changes/27/129227/1

diff --git a/scripts/category.py b/scripts/category.py
index e12fd6c..c3f423f 100755
--- a/scripts/category.py
+++ b/scripts/category.py
@@ -879,38 +879,32 @@
         result = u'#' * currentDepth + ' '
         result += cat.title(asLink=True, textlink=True, withNamespace=False)
         result += ' (%d)' % len(self.catDB.getArticles(cat))
-        # We will remove an element of supercats, but need the original set
-        # later, so we create a list from the catDB.getSupercats(cat) set
-        supercats = list(self.catDB.getSupercats(cat))
+        if currentDepth < self.maxDepth / 2:
+            # noisy dots
+            pywikibot.output('.', newline=False)
         # Find out which other cats are supercats of the current cat
-        try:
-            supercats.remove(parent)
-        except:
-            pass
-        if supercats:
-            if currentDepth < self.maxDepth / 2:
-                # noisy dots
-                pywikibot.output('.', newline=False)
-            supercat_names = []
-            for i, cat in enumerate(supercats):
-                # create a list of wiki links to the supercategories
+        supercat_names = []
+        for cat in self.catDB.getSupercats(cat):
+            # create a list of wiki links to the supercategories
+            if cat != parent:
                 supercat_names.append(cat.title(asLink=True,
                                                 textlink=True,
                                                 withNamespace=False))
-                # print this list, separated with commas, using translations
-                # given in also_in_cats
+        if supercat_names:
+            # print this list, separated with commas, using translations
+            # given in also_in_cats
             result += ' ' + i18n.twtranslate(self.site, 'category-also-in',
                                              {'alsocat': ', '.join(
                                                  supercat_names)})
+        del supercat_names
         result += '\n'
         if currentDepth < self.maxDepth:
             for subcat in self.catDB.getSubcats(cat):
                 # recurse into subdirectories
                 result += self.treeview(subcat, currentDepth + 1, parent=cat)
-        else:
-            if self.catDB.getSubcats(cat):
-                # show that there are more categories beyond the depth limit
-                result += '#' * (currentDepth + 1) + ' [...]\n'
+        elif self.catDB.getSubcats(cat):
+            # show that there are more categories beyond the depth limit
+            result += '#' * (currentDepth + 1) + ' [...]\n'
         return result
 
     def run(self):

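One caveat with the new loop as diffed: it reuses `cat` as the
iteration variable, shadowing the category passed into treeview(), so
the later getSubcats(cat) call reads the last supercat instead. A
self-contained sketch (hypothetical `supercat_links` helper, not in the
patch) of the same skip-the-parent pattern with a distinct name:

```python
# Hypothetical helper sketching the patched loop with a distinct loop
# variable ("supercat") so the enclosing "cat" is left untouched.
def supercat_links(supercats, parent):
    """Collect titles of all supercategories except the parent."""
    names = []
    for supercat in supercats:
        if supercat != parent:
            names.append(supercat)
    return names

assert supercat_links(['X', 'Y', 'Z'], 'Y') == ['X', 'Z']
```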
-- 
To view, visit https://gerrit.wikimedia.org/r/129227
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: Ic7f8f3d759bf5e9d9895c38a01d9d128775eadd9
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Xqt <i...@gno.de>

_______________________________________________
MediaWiki-commits mailing list
MediaWiki-commits@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-commits
