Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package llamacpp for openSUSE:Factory checked in at 2025-05-20 12:19:52
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/llamacpp (Old)
 and      /work/SRC/openSUSE:Factory/.llamacpp.new.30101 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "llamacpp"

Tue May 20 12:19:52 2025 rev:8 rq:1278459 version:5426

Changes:
--------
--- /work/SRC/openSUSE:Factory/llamacpp/llamacpp.changes        2025-05-09 18:54:00.451691188 +0200
+++ /work/SRC/openSUSE:Factory/.llamacpp.new.30101/llamacpp.changes     2025-05-20 12:19:58.581773369 +0200
@@ -1,0 +2,24 @@
+Mon May 19 20:03:14 UTC 2025 - Eyad Issa <eyadlore...@gmail.com>
+
+- Update to 5426:
+  * print hint when loading a model when no backends are loaded
+  * vulkan: use scalar FA rather than coopmat2 when N==1
+  * mtmd : add vision support for llama 4
+  * Full changelog:
+    https://github.com/ggml-org/llama.cpp/compare/b5402...b5426
+
+-------------------------------------------------------------------
+Fri May 16 14:17:52 UTC 2025 - Robert Munteanu <romb...@apache.org>
+
+- Update to 5402
+  * removed llava subpackage (#13460)
+  * Full changelog:
+    https://github.com/ggml-org/llama.cpp/compare/b5158...b5321
+
+-------------------------------------------------------------------
+Fri May  9 21:15:27 UTC 2025 - Eyad Issa <eyadlore...@gmail.com>
+
+- Update to version 5332:
+  * server : vision support via libmtmd
+
+-------------------------------------------------------------------
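
The "server : vision support via libmtmd" and "mtmd : add vision support for llama 4" entries above concern llama-server's multimodal path. Below is a minimal sketch of exercising it, assuming a local llama-server started with a model plus an --mmproj projector file (for example: llama-server -m model.gguf --mmproj mmproj.gguf --port 8080, where the paths and port are placeholders), and assuming the OpenAI-compatible /v1/chat/completions endpoint accepts OpenAI-style image_url content parts; describe_image and the file names are illustrative only.

import base64
import json
import urllib.request

SERVER = "http://127.0.0.1:8080"  # assumed local llama-server instance

def describe_image(image_path, prompt):
    # Encode the image as a data URI, the form used by OpenAI-style
    # chat-completion payloads for inline images.
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("ascii")
    payload = {
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url",
                     "image_url": {"url": "data:image/jpeg;base64," + b64}},
                ],
            }
        ],
    }
    req = urllib.request.Request(
        SERVER + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(describe_image("photo.jpg", "What is in this picture?"))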

Old:
----
  _servicedata
  llamacpp-5321.tar.gz

New:
----
  llamacpp-5426.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ llamacpp.spec ++++++
--- /var/tmp/diff_new_pack.zuPe0L/_old  2025-05-20 12:19:59.273802405 +0200
+++ /var/tmp/diff_new_pack.zuPe0L/_new  2025-05-20 12:19:59.273802405 +0200
@@ -17,7 +17,7 @@
 
 
 Name:           llamacpp
-Version:        5321
+Version:        5426
 Release:        0
 Summary:        llama-cli tool to run inference using the llama.cpp library
 License:        MIT
@@ -160,8 +160,6 @@
 %install
 %cmake_install
 
-# used for shader compilation only
-rm %{buildroot}%{_bindir}/vulkan-shaders-gen
 # dev scripts
 rm %{buildroot}%{_bindir}/convert_hf_to_gguf.py
 
@@ -210,7 +208,3 @@
 %license LICENSE
 %{_libdir}/libmtmd_shared.so
 
-%files -n libllava
-%license LICENSE
-%{_libdir}/libllava_shared.so
-
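
For context on the %files change above: libllava_shared.so is no longer shipped and the llava subpackage is gone, while libmtmd_shared.so remains. A minimal sketch for checking which of the two helper libraries resolves on an installed system, assuming the sonames shown in the spec and the default ld.so search path:

import ctypes

# Library names taken from the spec diff above; lookup assumes the
# default dynamic-linker search path on the installed system.
for name in ("libmtmd_shared.so", "libllava_shared.so"):
    try:
        ctypes.CDLL(name)
        print(name + ": present")
    except OSError:
        print(name + ": not found")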

++++++ llamacpp-5321.tar.gz -> llamacpp-5426.tar.gz ++++++
/work/SRC/openSUSE:Factory/llamacpp/llamacpp-5321.tar.gz /work/SRC/openSUSE:Factory/.llamacpp.new.30101/llamacpp-5426.tar.gz differ: char 13, line 1
