Hello community,

here is the log from the commit of package python-cachey for openSUSE:Factory checked in at 2020-03-31 17:14:12
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-cachey (Old)
 and      /work/SRC/openSUSE:Factory/.python-cachey.new.3160 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-cachey"

Tue Mar 31 17:14:12 2020 rev:2 rq:789775 version:0.2.1

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-cachey/python-cachey.changes      2019-01-25 22:44:38.807133426 +0100
+++ /work/SRC/openSUSE:Factory/.python-cachey.new.3160/python-cachey.changes    2020-03-31 17:14:25.879596585 +0200
@@ -1,0 +2,11 @@
+Mon Mar 30 09:17:57 UTC 2020 - Marketa Calabkova <mcalabk...@suse.com>
+
+- Update to 0.2.1
+  * Change links to blaze org
+  * Fix Cache.clear()
+  * Add a resize method.
+  * make default data dict inside __init__
+  * test against 3.7 and 3.8, drop 2.7
+- Drop upstreamed patch fix_cache_clear.patch
+
+-------------------------------------------------------------------
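The two behavioral additions in the changelog above (the injectable cache data dict and the resize method) can be illustrated with a small self-contained sketch. The `ToyCache` class below is a hypothetical stand-in, not the real `cachey.Cache`: it mimics only the interface visible in the cache.py diff later in this message, and it evicts in simple FIFO order rather than by cachey's cost-based scoring.

```python
# Hypothetical toy stand-in for cachey.Cache, illustrating only the two
# 0.2.1 additions from the changelog: the cache_data parameter and resize().
# Real cachey scores entries by cost; this sketch just evicts in FIFO order.
class ToyCache:
    def __init__(self, available_bytes, cache_data=None):
        # 0.2.1: callers may supply their own dict-like store
        self.data = cache_data if cache_data is not None else dict()
        self.available_bytes = available_bytes
        self.nbytes = {}
        self.total_bytes = 0

    def put(self, key, value, nbytes=1):
        self.data[key] = value
        self.nbytes[key] = nbytes
        self.total_bytes += nbytes
        self.shrink()

    def retire(self, key):
        del self.data[key]
        self.total_bytes -= self.nbytes.pop(key)

    def shrink(self):
        # Evict until we fit the byte budget (FIFO here, score-based in cachey)
        while self.total_bytes > self.available_bytes and self.data:
            self.retire(next(iter(self.data)))

    def resize(self, available_bytes):
        # 0.2.1 addition: set a new budget, then shrink into it
        self.available_bytes = available_bytes
        self.shrink()

my_store = {}
c = ToyCache(available_bytes=3, cache_data=my_store)
for k in 'abc':
    c.put(k, k.upper())
assert set(my_store) == {'a', 'b', 'c'}   # writes land in the supplied dict
c.resize(1)                               # shrinking the budget evicts entries
assert set(my_store) == {'c'}
```

Supplying `cache_data` is useful when the backing store should outlive the cache object or be shared (e.g. a persistent mapping); `resize` lets callers tighten or relax the budget without rebuilding the cache.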

Old:
----
  cachey-0.1.1.tar.gz
  fix_cache_clear.patch

New:
----
  cachey-0.2.1.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-cachey.spec ++++++
--- /var/tmp/diff_new_pack.G195S1/_old  2020-03-31 17:14:27.123597375 +0200
+++ /var/tmp/diff_new_pack.G195S1/_new  2020-03-31 17:14:27.127597377 +0200
@@ -1,7 +1,7 @@
 #
 # spec file for package python-cachey
 #
-# Copyright (c) 2019 SUSE LINUX GmbH, Nuernberg, Germany.
+# Copyright (c) 2020 SUSE LLC
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -16,17 +16,16 @@
 #
 
 
+%define skip_python2 1
 %{?!python_module:%define python_module() python-%{**} python3-%{**}}
 Name:           python-cachey
-Version:        0.1.1
+Version:        0.2.1
 Release:        0
 Summary:        A Python cache mindful of computation/storage costs
 License:        BSD-3-Clause
 Group:          Development/Languages/Python
 URL:            http://github.com/mrocklin/cachey/
Source:         https://files.pythonhosted.org/packages/source/c/cachey/cachey-%{version}.tar.gz
-# PATCH-FIX-UPSTREAM fix_cache_clear.patch - fix unit test error
-Patch0:         fix_cache_clear.patch
 BuildRequires:  %{python_module HeapDict}
 BuildRequires:  %{python_module pytest}
 BuildRequires:  %{python_module setuptools}
@@ -54,7 +53,6 @@
 
 %prep
 %setup -q -n cachey-%{version}
-%patch0 -p1
 
 %build
 %python_build

++++++ cachey-0.1.1.tar.gz -> cachey-0.2.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/PKG-INFO new/cachey-0.2.1/PKG-INFO
--- old/cachey-0.1.1/PKG-INFO   2015-08-14 20:44:29.000000000 +0200
+++ new/cachey-0.2.1/PKG-INFO   2020-03-11 16:33:42.000000000 +0100
@@ -1,10 +1,10 @@
-Metadata-Version: 1.0
+Metadata-Version: 2.1
 Name: cachey
-Version: 0.1.1
+Version: 0.2.1
 Summary: Caching mindful of computation/storage costs
-Home-page: http://github.com/mrocklin/cachey/
-Author: Matthew Rocklin
-Author-email: mrock...@gmail.com
+Home-page: http://github.com/dask/cachey/
+Maintainer: Matthew Rocklin
+Maintainer-email: mrock...@gmail.com
 License: BSD
 Description: Caching for Analytic Computations
         ---------------------------------
@@ -12,8 +12,8 @@
         Humans repeat stuff.  Caching helps.
         
         Normal caching policies like LRU aren't well suited for analytic computations
-        where both the cost of recomputation and the cost of storge routinely vary by
-        one milllion or more.  Consider the following computations
+        where both the cost of recomputation and the cost of storage routinely vary by
+        one million or more.  Consider the following computations
         
         ```python
         # Want this
@@ -64,3 +64,15 @@
         Cachey is new and not robust.
         
 Platform: UNKNOWN
+Classifier: Development Status :: 4 - Beta
+Classifier: Intended Audience :: Developers
+Classifier: Intended Audience :: Science/Research
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Topic :: Scientific/Engineering
+Requires-Python: >=3.6
+Description-Content-Type: text/markdown
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/README.md new/cachey-0.2.1/README.md
--- old/cachey-0.1.1/README.md  2015-08-03 02:51:53.000000000 +0200
+++ new/cachey-0.2.1/README.md  2020-03-11 16:28:41.000000000 +0100
@@ -4,8 +4,8 @@
 Humans repeat stuff.  Caching helps.
 
 Normal caching policies like LRU aren't well suited for analytic computations
-where both the cost of recomputation and the cost of storge routinely vary by
-one milllion or more.  Consider the following computations
+where both the cost of recomputation and the cost of storage routinely vary by
+one million or more.  Consider the following computations
 
 ```python
 # Want this
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/cachey/__init__.py new/cachey-0.2.1/cachey/__init__.py
--- old/cachey-0.1.1/cachey/__init__.py 2015-08-14 20:42:49.000000000 +0200
+++ new/cachey-0.2.1/cachey/__init__.py 2020-03-11 16:33:15.000000000 +0100
@@ -2,4 +2,4 @@
 from .cache import Cache
 from .nbytes import nbytes
 
-__version__ = '0.1.1'
+__version__ = '0.2.1'
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/cachey/cache.py new/cachey-0.2.1/cachey/cache.py
--- old/cachey-0.1.1/cachey/cache.py    2015-08-11 18:24:49.000000000 +0200
+++ new/cachey-0.2.1/cachey/cache.py    2020-03-11 16:28:41.000000000 +0100
@@ -8,7 +8,7 @@
 
 
 def memo_key(args, kwargs):
-    result = (args, frozenset(kwargs.items()))
+    result = (args, frozenset(list(kwargs.items())))
     try:
         hash(result)
     except TypeError:
@@ -41,6 +41,9 @@
         Function to compute the number of bytes of an input.
     cost:  function  (defaults to cost())
         Determine cost from nbytes and time
+    cache_data : MutableMapping (defaults to dict())
+        Dict-like object to use for cache
+
 
     Example
     -------
@@ -57,7 +60,8 @@
     >>> memo_inc = c.memoize(inc)  # Memoize functions
     """
     def __init__(self, available_bytes, limit=0, scorer=None, halflife=1000,
-                 nbytes=nbytes, cost=cost, hit=None, miss=None):
+                 nbytes=nbytes, cost=cost, hit=None, miss=None,
+                 cache_data=None):
         if scorer is None:
             scorer = Scorer(halflife)
         self.scorer = scorer
@@ -68,7 +72,7 @@
         self.hit = hit
         self.miss = miss
 
-        self.data = dict()
+        self.data = cache_data if cache_data is not None else dict()
         self.heap = heapdict()
         self.nbytes = dict()
         self.total_bytes = 0
@@ -123,9 +127,21 @@
         self.total_bytes -= self.nbytes.pop(key)
 
     def _shrink_one(self):
-        key, score = self.heap.popitem()
+        try:
+            key, score = self.heap.popitem()
+        except IndexError:
+            return
         self.retire(key)
-
+        
+        
+    def resize(self, available_bytes):
+        """ Resize the cache. 
+            
+            Will fit the cache into available_bytes by calling `shrink()`.
+        """
+        self.available_bytes = available_bytes
+        self.shrink()
+        
     def shrink(self):
         """ Retire keys from the cache until we're under bytes budget
 
@@ -142,10 +158,10 @@
         return key in self.data
 
     def clear(self):
-        while self:
+        while self.data:
             self._shrink_one()
 
-    def __nonzero__(self):
+    def __bool__(self):
         return not not self.data
 
     def memoize(self, func, key=memo_key):
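The `clear()` and `__bool__` changes in the cache.py diff above go together: `__nonzero__` was the Python 2 truth protocol and is simply ignored on Python 3, so an object defining only `__nonzero__` falls back to default truthiness (always true). The old `while self:` loop in `clear()` therefore could not terminate by emptiness on Python 3, which is why 0.2.1 switches to `__bool__` and `while self.data:`. A minimal demonstration of the underlying language behavior:

```python
# Python 3 never calls __nonzero__ (the Python 2 truth protocol), so an
# object defining only __nonzero__ stays truthy even when "empty".
# Defining __bool__ instead is honored -- the 0.2.1 fix in cachey.
class Py2Style:
    def __nonzero__(self):
        return False  # ignored on Python 3

class Py3Style:
    def __bool__(self):
        return False  # honored on Python 3

assert bool(Py2Style()) is True   # default object truthiness wins
assert bool(Py3Style()) is False
```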
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/cachey/tests/test_cache.py new/cachey-0.2.1/cachey/tests/test_cache.py
--- old/cachey-0.1.1/cachey/tests/test_cache.py 2015-08-11 17:54:42.000000000 +0200
+++ new/cachey-0.2.1/cachey/tests/test_cache.py 2020-03-11 16:28:41.000000000 +0100
@@ -24,6 +24,40 @@
     assert not c.heap
 
 
+def test_cache_data_dict():
+
+    my_dict = {}
+    c = Cache(available_bytes=nbytes(1) * 3, cache_data=my_dict)
+    c.put('x', 1, 10)
+    assert c.get('x') == 1
+    assert my_dict['x'] == 1
+    c.clear()
+    assert 'x' not in c
+
+
+def test_cache_resize():
+    c = Cache(available_bytes=nbytes(1) * 3)
+
+    c.put('x', 1, 10)
+    assert c.get('x') == 1
+    assert 'x' in c
+
+    c.put('a', 1, 10)
+    c.put('b', 1, 10)
+    c.put('c', 1, 10)
+    assert set(c.data) == set('xbc')
+    c.put('d', 1, 10)
+    assert set(c.data) == set('xcd')
+
+    # resize will shrink
+    c.resize(available_bytes=nbytes(1) * 1)
+
+    assert set(c.data) == set('x')
+
+    c.resize(available_bytes=nbytes(1) * 10)
+
+    assert set(c.data) == set('x')
+
 def test_cache_scores_update():
     c = Cache(available_bytes=nbytes(1) * 2)
     c.put('x', 1, 1)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/cachey.egg-info/PKG-INFO new/cachey-0.2.1/cachey.egg-info/PKG-INFO
--- old/cachey-0.1.1/cachey.egg-info/PKG-INFO   2015-08-14 20:44:25.000000000 +0200
+++ new/cachey-0.2.1/cachey.egg-info/PKG-INFO   2020-03-11 16:33:42.000000000 +0100
@@ -1,10 +1,10 @@
-Metadata-Version: 1.0
+Metadata-Version: 2.1
 Name: cachey
-Version: 0.1.1
+Version: 0.2.1
 Summary: Caching mindful of computation/storage costs
-Home-page: http://github.com/mrocklin/cachey/
-Author: Matthew Rocklin
-Author-email: mrock...@gmail.com
+Home-page: http://github.com/dask/cachey/
+Maintainer: Matthew Rocklin
+Maintainer-email: mrock...@gmail.com
 License: BSD
 Description: Caching for Analytic Computations
         ---------------------------------
@@ -12,8 +12,8 @@
         Humans repeat stuff.  Caching helps.
         
         Normal caching policies like LRU aren't well suited for analytic computations
-        where both the cost of recomputation and the cost of storge routinely vary by
-        one milllion or more.  Consider the following computations
+        where both the cost of recomputation and the cost of storage routinely vary by
+        one million or more.  Consider the following computations
         
         ```python
         # Want this
@@ -64,3 +64,15 @@
         Cachey is new and not robust.
         
 Platform: UNKNOWN
+Classifier: Development Status :: 4 - Beta
+Classifier: Intended Audience :: Developers
+Classifier: Intended Audience :: Science/Research
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Topic :: Scientific/Engineering
+Requires-Python: >=3.6
+Description-Content-Type: text/markdown
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/setup.cfg new/cachey-0.2.1/setup.cfg
--- old/cachey-0.1.1/setup.cfg  2015-08-14 20:44:29.000000000 +0200
+++ new/cachey-0.2.1/setup.cfg  2020-03-11 16:33:42.000000000 +0100
@@ -1,5 +1,4 @@
 [egg_info]
 tag_build = 
 tag_date = 0
-tag_svn_revision = 0
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/cachey-0.1.1/setup.py new/cachey-0.2.1/setup.py
--- old/cachey-0.1.1/setup.py   2015-08-14 20:42:47.000000000 +0200
+++ new/cachey-0.2.1/setup.py   2020-03-11 16:33:04.000000000 +0100
@@ -3,16 +3,32 @@
 from os.path import exists
 from setuptools import setup
 
+
+
 setup(name='cachey',
-      version='0.1.1',
+      version='0.2.1',
       description='Caching mindful of computation/storage costs',
-      url='http://github.com/mrocklin/cachey/',
+      classifiers=[
+        "Development Status :: 4 - Beta",
+        "Intended Audience :: Developers",
+        "Intended Audience :: Science/Research",
+        "License :: OSI Approved :: BSD License",
+        "Operating System :: OS Independent",
+        "Programming Language :: Python :: 3",
+        "Programming Language :: Python :: 3.6",
+        "Programming Language :: Python :: 3.7",
+        "Programming Language :: Python :: 3.8",
+        "Topic :: Scientific/Engineering",
+    ],
+      url='http://github.com/dask/cachey/',
       maintainer='Matthew Rocklin',
       maintainer_email='mrock...@gmail.com',
       license='BSD',
       keywords='',
       packages=['cachey'],
+      python_requires='>=3.6',
       install_requires=list(open('requirements.txt').read().strip().split('\n')),
       long_description=(open('README.md').read() if exists('README.md')
                         else ''),
+      long_description_content_type='text/markdown',
       zip_safe=False)
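One aside on the `memo_key` hunk in the cache.py diff above: building the keyword part of the memoization key from a `frozenset` of the kwargs items makes lookups insensitive to keyword order, because equal frozensets compare and hash equally. A quick illustration of that property (the key tuples here mirror the shape `(args, frozenset(kwargs.items()))` shown in the diff):

```python
# frozenset of kwargs items (as cachey's memo_key builds it) gives an
# order-insensitive, hashable key: the same keywords in any order
# produce an equal key with an equal hash.
key1 = ((), frozenset({'x': 1, 'y': 2}.items()))
key2 = ((), frozenset({'y': 2, 'x': 1}.items()))
assert key1 == key2
assert hash(key1) == hash(key2)
```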

