Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package ollama for openSUSE:Factory checked in at 2026-01-29 17:46:03
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ollama (Old)
 and      /work/SRC/openSUSE:Factory/.ollama.new.1995 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "ollama"

Thu Jan 29 17:46:03 2026 rev:55 rq:1329736 version:0.15.2

Changes:
--------
--- /work/SRC/openSUSE:Factory/ollama/ollama.changes    2026-01-22 15:17:37.154829089 +0100
+++ /work/SRC/openSUSE:Factory/.ollama.new.1995/ollama.changes  2026-01-29 17:48:58.163583841 +0100
@@ -1,0 +2,42 @@
+Wed Jan 28 10:35:02 UTC 2026 - Christian Goll <[email protected]>
+
+- Updated to version 0.15.2:
+  * New ollama launch clawdbot command for launching Clawdbot
+    using Ollama models
+- Updated to version 0.15.1:
+  * GLM-4.7-Flash performance and correctness improvements, fixing
+    repetitive answers and improving tool calling quality
+  * Fixed performance issues on arm64
+  * Fixed issue where ollama launch would not detect claude and would
+    incorrectly update opencode configurations
+- Updated to version 0.15.0:
+  * New command: ollama launch
+    A new ollama launch command to use Ollama's models with Claude
+    Code, Codex, OpenCode, and Droid without separate configuration.
+  * Fixed issue where creating multi-line strings with """ would not
+    work when using ollama run
+  * Ctrl+J and Shift+Enter now work for inserting newlines in ollama run
+  * Reduced memory usage for GLM-4.7-Flash models
+- Updated to version 0.14.3:
+  Image generation:
+  * Z-Image Turbo: 6 billion parameter text-to-image model from
+    Alibaba’s Tongyi Lab. It generates high-quality photorealistic
+    images.
+  * Flux.2 Klein: Black Forest Labs’ fastest image-generation models
+    to date.
+  New models:
+  * GLM-4.7-Flash: As the strongest model in the 30B
+    class, GLM-4.7-Flash offers a new option for lightweight
+    deployment that balances performance and efficiency.
+  * LFM2.5-1.2B-Thinking: LFM2.5 is a new family of hybrid models
+    designed for on-device deployment.
+  * Fixed issue where Ollama's macOS app would interrupt system shutdown
+  * Fixed ollama create and ollama show commands for experimental models
+  * The /api/generate API can now be used for image generation
+  * Fixed minor issues in Nemotron-3-Nano tool parsing
+  * Fixed issue where removing an image generation model would cause
+    it to first load
+  * Fixed issue where ollama rm would only stop the first model in
+    the list if it were running
+
+-------------------------------------------------------------------

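The changelog above introduces the new ollama launch subcommand. Below is a minimal sketch of driving it from Python; only the "ollama launch clawdbot" invocation quoted in the 0.15.2 entry is taken from the changelog, and any additional arguments or flags are assumptions that should be checked against the command's own help output.

    import subprocess

    # Launch Clawdbot backed by Ollama models, using the invocation quoted
    # in the 0.15.2 changelog entry above. check=True raises
    # CalledProcessError if ollama exits with a non-zero status.
    subprocess.run(["ollama", "launch", "clawdbot"], check=True)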
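
The 0.14.3 entry above also notes that the /api/generate API can now be used for image generation. The sketch below calls that endpoint on a local Ollama server (default port 11434); the model tag and the field that carries image data in the response are assumptions, so the official API documentation remains authoritative.

    import json
    import urllib.request

    payload = {
        # Hypothetical tag for the Z-Image Turbo model mentioned above.
        "model": "z-image-turbo",
        "prompt": "a photorealistic mountain lake at sunrise",
        # Request a single JSON object instead of a streamed response.
        "stream": False,
    }

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())

    # For text models the generated text is in body["response"]; how image
    # data is encoded for image models is not specified here, so inspect
    # the returned keys.
    print(list(body.keys()))
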
Old:
----
  ollama-0.14.2.tar.gz

New:
----
  ollama-0.15.2.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ ollama.spec ++++++
--- /var/tmp/diff_new_pack.egqJdi/_old  2026-01-29 17:48:59.299632506 +0100
+++ /var/tmp/diff_new_pack.egqJdi/_new  2026-01-29 17:48:59.307632849 +0100
@@ -35,7 +35,7 @@
 %define cuda_version %{cuda_version_major}-%{cuda_version_minor}
 
 Name:           ollama
-Version:        0.14.2
+Version:        0.15.2
 Release:        0
 Summary:        Tool for running AI models on-premise
 License:        MIT

++++++ build.specials.obscpio ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/.gitignore new/.gitignore
--- old/.gitignore      1970-01-01 01:00:00.000000000 +0100
+++ new/.gitignore      2025-11-11 11:44:10.000000000 +0100
@@ -0,0 +1,4 @@
+*.obscpio
+*.osc
+_build.*
+.pbuild

++++++ ollama-0.14.2.tar.gz -> ollama-0.15.2.tar.gz ++++++
/work/SRC/openSUSE:Factory/ollama/ollama-0.14.2.tar.gz /work/SRC/openSUSE:Factory/.ollama.new.1995/ollama-0.15.2.tar.gz differ: char 13, line 1

++++++ vendor.tar.zstd ++++++
Binary files /var/tmp/diff_new_pack.egqJdi/_old and /var/tmp/diff_new_pack.egqJdi/_new differ
