Script 'mail_helper' called by obssrc

Hello community,

here is the log from the commit of package ollama for openSUSE:Factory checked in at 2025-11-24 14:10:47
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/ollama (Old)
 and      /work/SRC/openSUSE:Factory/.ollama.new.14147 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "ollama"

Mon Nov 24 14:10:47 2025 rev:50 rq:1319255 version:0.13.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/ollama/ollama.changes	2025-11-11 19:23:21.844567657 +0100
+++ /work/SRC/openSUSE:Factory/.ollama.new.14147/ollama.changes	2025-11-24 14:13:21.971811774 +0100
@@ -1,0 +2,33 @@
+Sat Nov 22 04:14:47 UTC 2025 - Glen Masgai <[email protected]>
+
+- Update to version 0.13.0
+  * New models: DeepSeek-OCR, Cogito-V2.1
+  * DeepSeek-V3.1 architecture is now supported in Ollama's engine
+  * Fixed performance issues that arose in Ollama 0.12.11 on CUDA
+  * Fixed issue where Linux install packages were missing required
+    Vulkan libraries
+  * Improved CPU and memory detection while in containers/cgroups
+  * Improved VRAM information detection for AMD GPUs
+  * Improved KV cache performance to no longer require
+    defragmentation
+
+- Update to version 0.12.11
+  * Ollama's API and the OpenAI-compatible API now support
+    logprobs (see https://cookbook.openai.com/examples/using_logprobs
+    and https://github.com/ollama/ollama/releases/tag/v0.12.11)
+  * Ollama's new app now supports WebP images
+  * Improved rendering performance in Ollama's new app, especially
+    when rendering code
+  * The "required" field in tool definitions will now be omitted if
+    not specified
+  * Fixed issue where "tool_call_id" would be omitted when using
+    the OpenAI-compatible API
+  * Fixed issue where ollama create would import data from both
+    consolidated.safetensors and other safetensor files
+  * Ollama will now prefer dedicated GPUs over iGPUs when
+    scheduling models
+  * Vulkan can now be enabled by setting OLLAMA_VULKAN=1.
+    For example: OLLAMA_VULKAN=1 ollama serve
+
+-------------------------------------------------------------------
@@ -27,0 +61 @@
+

Old:
----
  ollama-0.12.10.tar.gz

New:
----
  ollama-0.13.0.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ ollama.spec ++++++
--- /var/tmp/diff_new_pack.sfcZBK/_old	2025-11-24 14:13:24.611922983 +0100
+++ /var/tmp/diff_new_pack.sfcZBK/_new	2025-11-24 14:13:24.631923825 +0100
@@ -35,7 +35,7 @@
 %define cuda_version %{cuda_version_major}-%{cuda_version_minor}
 
 Name:           ollama
-Version:        0.12.10
+Version:        0.13.0
 Release:        0
 Summary:        Tool for running AI models on-premise
 License:        MIT

++++++ build.specials.obscpio ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/.gitignore new/.gitignore
--- old/.gitignore	1970-01-01 01:00:00.000000000 +0100
+++ new/.gitignore	2025-11-11 11:44:10.000000000 +0100
@@ -0,0 +1,4 @@
+*.obscpio
+*.osc
+_build.*
+.pbuild

++++++ ollama-0.12.10.tar.gz -> ollama-0.13.0.tar.gz ++++++
/work/SRC/openSUSE:Factory/ollama/ollama-0.12.10.tar.gz /work/SRC/openSUSE:Factory/.ollama.new.14147/ollama-0.13.0.tar.gz differ: char 12, line 1
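As a sketch of the logprobs support mentioned in the 0.12.11 changelog entry above: Ollama's OpenAI-compatible endpoint (by default at http://localhost:11434/v1/chat/completions) should accept OpenAI-style `logprobs`/`top_logprobs` fields in the request body. The model name "llama3.2" and the parameter values below are assumptions for illustration, not something stated in this commit log.

```python
import json

# Hypothetical request body for POST http://localhost:11434/v1/chat/completions.
# Field names follow OpenAI's chat-completions schema, which the
# OpenAI-compatible API mirrors; "llama3.2" is an assumed model name.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Say hello"}],
    "logprobs": True,    # request per-token log probabilities
    "top_logprobs": 3,   # also return the 3 most likely alternatives per token
}

body = json.dumps(payload)
print(body)
```

Sending this body with any HTTP client (e.g. curl -d @- against the endpoint) should yield a response whose choices carry a `logprobs` object, per the linked cookbook example.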
