Script 'mail_helper' called by obssrc
Hello community,
here is the log from the commit of package python-cotengra for openSUSE:Factory
checked in at 2026-03-27 16:50:32
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-cotengra (Old)
and /work/SRC/openSUSE:Factory/.python-cotengra.new.8177 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "python-cotengra"
Fri Mar 27 16:50:32 2026 rev:4 rq:1343077 version:0.7.5
Changes:
--------
--- /work/SRC/openSUSE:Factory/python-cotengra/python-cotengra.changes	2024-05-29 19:36:06.165033723 +0200
+++ /work/SRC/openSUSE:Factory/.python-cotengra.new.8177/python-cotengra.changes	2026-03-27 16:53:06.000471558 +0100
@@ -1,0 +2,52 @@
+Fri Mar 27 01:33:48 UTC 2026 - Steve Kowalik <[email protected]>
+
+- Update to 0.7.5:
+ ## Enhancements
+ * Add optimize="edgesort" (also aliased to optimize="ncon"), which performs
+ a contraction by contracting edges in sorted order, and thus can be
+ entirely specified by the graph labelling.
+ * Add edge_path_to_ssa and edge_path_to_linear for converting edge paths to
+ SSA and linear paths respectively.
+ * ContractionTree.from_path: allow an edge_path argument.
+ * Allow manual path specification as an edge path.
+ * ReusableHyperOptimizer and DiskDict: allow splitting the key into a
+ subdirectory structure for better performance.
+ * High-level interface functions accept the strip_exponent kwarg, which
+ eagerly strips a scaling exponent (log10) as the contraction proceeds,
+ avoiding issues with very large or very small numeric values.
+ * Add cmaes as an optlib method, and use it by default for the 'auto'
+ preset if available, since it has less overhead than optuna.
+ * Add HyperOptimizer.plot_parameters_parallel for plotting the sampled
+ parameter space of a hyper optimizer method in parallel coordinates.
+ * Add ncon interface.
+ * Add utils.save_to_json and utils.load_from_json for saving and loading
+ contractions to/from json.
+ * Add examples/benchmarks with various json benchmark contractions.
+ * Add utils.networkx_graph_to_equation for converting a networkx graph to
+ cotengra style inputs, output and size_dict.
+ * Add "max" as a valid minimize option for optimize_optimal (also added to
+ cotengrust), which minimizes the single most expensive contraction (i.e.
+ the cost scaling).
+ * Add RandomOptimizer, a fully random optimizer for testing and
+ initialization purposes. It can be used with optimize="random" but is not
+ recommended for actual optimization.
+ * Add PathOptimizer to top-level namespace.
+ * ContractionTreeCompressed.from_path: add the autocomplete option.
+ * Add option overwrite="improved" to reusable hyper optimizers, which
+ always searches but only overwrites if the new tree is better, allowing
+ easy incremental refining of a collection of trees.
+ * einsum via bmm (implementation="cotengra") avoids using einsum for
+ transposing inputs.
+ ## Bug fixes
+ * ContractionTree.print_contractions: fix the show_brackets option, and
+ show preprocessing steps with the original input indices.
+ * Fix and add an edge case test for optimize=().
+ * When contracting with slices and strip_exponent enabled, each slice
+ result is returned with its exponent separately, rather than matching
+ the first; these are now combined in gather_slices.
+ * Fix HyperGraph.plot when nodes are not labelled as consecutive integers.
+ * Fix ContractionTreeCompressed.windowed_reconfigure not propagating the
+ default objective.
+ * Fix kahypar path optimization when no edges are present.
+
+-------------------------------------------------------------------
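Several of the entries above deal with "edge paths", i.e. contraction orders given as a sequence of indices to sum. As a rough sketch of how an edge path maps onto an SSA-style path (a hypothetical illustration only, not cotengra's edge_path_to_ssa; the function name here is invented), each edge gathers every term that still carries it:

```python
def edge_path_to_ssa_sketch(inputs, edge_path):
    """Convert an ordered list of edges into an SSA-style contraction path.

    Illustrative sketch only: assumes every index is eventually summed
    and ignores any size- or cost-based considerations.
    """
    # live terms: ssa id -> set of indices still carried by that tensor
    terms = {i: set(ix) for i, ix in enumerate(inputs)}
    next_id = len(inputs)
    ssa_path = []
    for edge in edge_path:
        # all live terms that currently share this edge
        who = sorted(k for k, ix in terms.items() if edge in ix)
        if len(who) < 2:
            continue  # edge already absorbed by an earlier contraction
        merged = set()
        for k in who:
            merged |= terms.pop(k)
        merged.discard(edge)  # the contracted edge is summed away
        terms[next_id] = merged  # the new intermediate gets a fresh ssa id
        ssa_path.append(tuple(who))
        next_id += 1
    return ssa_path

# matrix chain A_ab B_bc C_cd, contracting edges in sorted order ('b', 'c')
print(edge_path_to_ssa_sketch([("a", "b"), ("b", "c"), ("c", "d")], ["b", "c"]))
# [(0, 1), (2, 3)]
```

Contracting edges in sorted order, as the "edgesort" preset described above does, makes the whole path a pure function of the graph labelling.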
Old:
----
cotengra-0.6.2.tar.gz
New:
----
cotengra-0.7.5.tar.gz
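The strip_exponent behaviour mentioned in the changelog can be pictured with a toy scalar version: keep an O(1) mantissa and accumulate a log10 exponent as values are multiplied. This is a sketch of the idea only, not cotengra's implementation; the function name is invented:

```python
import math

def scaled_product(values):
    """Multiply many numbers while stripping out a log10 exponent, so the
    running mantissa stays O(1) even when the true product would overflow
    or underflow a float. Sketch of the idea only.
    """
    mantissa, exponent = 1.0, 0.0
    for v in values:
        mantissa *= v
        if mantissa != 0.0:
            # renormalize: pull the magnitude out into the exponent
            shift = math.floor(math.log10(abs(mantissa)))
            mantissa /= 10.0 ** shift
            exponent += shift
    return mantissa, exponent  # true product = mantissa * 10**exponent

# 500 factors of 1e10 would overflow a plain float product;
# in stripped form the result stays finite
m, e = scaled_product([1e10] * 500)
print(m, e)
```

The same trick applied per contraction step is what keeps intermediate tensors well scaled.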
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Other differences:
------------------
++++++ python-cotengra.spec ++++++
--- /var/tmp/diff_new_pack.NVtW9M/_old 2026-03-27 16:53:06.516493159 +0100
+++ /var/tmp/diff_new_pack.NVtW9M/_new 2026-03-27 16:53:06.516493159 +0100
@@ -1,7 +1,7 @@
#
# spec file for package python-cotengra
#
-# Copyright (c) 2024 SUSE LLC
+# Copyright (c) 2026 SUSE LLC and contributors
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
@@ -17,16 +17,16 @@
Name: python-cotengra
-Version: 0.6.2
+Version: 0.7.5
Release: 0
Summary: Hyper optimized contraction trees for large tensor networks and einsums
License: Apache-2.0
URL: https://github.com/jcmgray/cotengra
Source: https://files.pythonhosted.org/packages/source/c/cotengra/cotengra-%{version}.tar.gz
+BuildRequires: %{python_module base >= 3.8}
+BuildRequires: %{python_module hatch_vcs}
+BuildRequires: %{python_module hatchling}
BuildRequires: %{python_module pip}
-BuildRequires: %{python_module setuptools >= 45}
-BuildRequires: %{python_module setuptools_scm >= 6.2}
-BuildRequires: %{python_module wheel}
BuildRequires: fdupes
BuildRequires: python-rpm-macros
Requires: python-autoray
++++++ cotengra-0.6.2.tar.gz -> cotengra-0.7.5.tar.gz ++++++
++++ 117053 lines of diff (skipped)
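The gather_slices fix noted in the changelog combines per-slice results that each carry their own stripped exponent. A hypothetical sketch of such a combination step (an illustration, not cotengra's gather_slices): rescale every mantissa to the largest exponent, sum, then re-strip the exponent of the total:

```python
import math

def gather_scaled(slices):
    """Sum (mantissa, exponent) pairs, each representing
    mantissa * 10**exponent. Sketch only: rescaling to the largest
    exponent keeps every intermediate term <= 1 in magnitude relative
    to the dominant slice.
    """
    e_max = max(e for _, e in slices)
    # bring every mantissa onto the common scale 10**e_max, then sum
    total = sum(m * 10.0 ** (e - e_max) for m, e in slices)
    if total == 0.0:
        return 0.0, 0.0
    # re-strip the exponent of the combined result
    shift = math.floor(math.log10(abs(total)))
    return total / 10.0 ** shift, e_max + shift

# 2e3 + 3e3 + 5e2 = 5.5e3, kept in stripped (mantissa, exponent) form
print(gather_scaled([(2.0, 3.0), (3.0, 3.0), (5.0, 2.0)]))
```

Combining in the scaled domain is what avoids the mismatch the fix describes, where each slice's exponent was returned separately rather than forced to match the first.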