https://git.altlinux.org/tasks/410885/logs/events.1.1.log
https://packages.altlinux.org/tasks/410885

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp     6:36     -    5:18

2026-Mar-11 05:18:35 :: test-only task #410885 for sisyphus started by vt:
#100 build 8253-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Mar-11 05:18:33
2026-Mar-11 05:18:37 :: [aarch64] #100 llama.cpp.git 8253-alt1: build start
2026-Mar-11 05:18:37 :: [i586] #100 llama.cpp.git 8253-alt1: build start
2026-Mar-11 05:18:37 :: [x86_64] #100 llama.cpp.git 8253-alt1: build start
2026-Mar-11 05:18:44 :: [i586] #100 llama.cpp.git 8253-alt1: build SKIPPED
build/100/x86_64/log:[00:02:22] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:22] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Mar-11 05:23:55 :: [x86_64] #100 llama.cpp.git 8253-alt1: build OK
2026-Mar-11 05:25:13 :: [aarch64] #100 llama.cpp.git 8253-alt1: build OK
2026-Mar-11 05:25:21 :: 100: build check OK
2026-Mar-11 05:25:22 :: build check OK
2026-Mar-11 05:25:37 :: #100: llama.cpp.git 8253-alt1: version check OK
2026-Mar-11 05:25:37 :: build version check OK
--- llama.cpp-cpu-8253-alt1.x86_64.rpm.share    2026-03-11 05:25:41.335027171 +0000
+++ llama.cpp-cpu-8253-alt1.aarch64.rpm.share   2026-03-11 05:25:42.867040192 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text
 /usr/share/doc/llama.cpp/docs  40755   directory
warning (#100): non-identical /usr/share part
2026-Mar-11 05:25:59 :: noarch check OK
2026-Mar-11 05:26:01 :: plan: src +1 -1 =21708, aarch64 +8 -8 =38520, x86_64 +10 -10 =39535
#100 llama.cpp 8192-alt1 -> 1:8253-alt1
 Wed Mar 11 2026 Vitaly Chikunov <vt@altlinux> 1:8253-alt1
 - Update to b8253 (2026-03-09).
2026-Mar-11 05:26:52 :: patched apt indices
2026-Mar-11 05:27:02 :: created next repo
2026-Mar-11 05:27:14 :: duplicate provides check OK
2026-Mar-11 05:28:03 :: dependencies check OK
2026-Mar-11 05:28:46 :: [x86_64 aarch64] ELF symbols check OK
2026-Mar-11 05:28:57 :: [x86_64] #100 libllama: install check OK
2026-Mar-11 05:29:03 :: [x86_64] #100 libllama-debuginfo: install check OK
2026-Mar-11 05:29:06 :: [aarch64] #100 libllama: install check OK
        x86_64: libllama-devel=1:8253-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-11 05:29:07 :: [x86_64] #100 libllama-devel: install check OK
2026-Mar-11 05:29:16 :: [aarch64] #100 libllama-debuginfo: install check OK
2026-Mar-11 05:29:25 :: [x86_64] #100 llama.cpp: install check OK
        aarch64: libllama-devel=1:8253-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-11 05:29:26 :: [aarch64] #100 libllama-devel: install check OK
2026-Mar-11 05:29:31 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Mar-11 05:29:37 :: [aarch64] #100 llama.cpp: install check OK
2026-Mar-11 05:29:40 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-11 05:29:48 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Mar-11 05:29:58 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Mar-11 05:30:04 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-11 05:30:16 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Mar-11 05:30:16 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Mar-11 05:30:22 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Mar-11 05:30:29 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-11 05:30:30 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-11 05:30:48 :: [x86_64-i586] generated apt indices
2026-Mar-11 05:30:48 :: [x86_64-i586] created next repo
2026-Mar-11 05:31:00 :: [x86_64-i586] dependencies check OK
2026-Mar-11 05:31:00 :: gears inheritance check OK
2026-Mar-11 05:31:01 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Mar-11 05:31:01 :: acl check OK
2026-Mar-11 05:31:15 :: created contents_index files
2026-Mar-11 05:31:25 :: created hash files: aarch64 src x86_64
2026-Mar-11 05:31:28 :: task #410885 for sisyphus TESTED
_______________________________________________
Sisyphus-incominger mailing list
[email protected]
https://lists.altlinux.org/mailman/listinfo/sisyphus-incominger