https://git.altlinux.org/tasks/410887/logs/events.1.1.log
https://packages.altlinux.org/tasks/410887

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp     7:03     -    5:18

2026-Mar-11 05:55:40 :: test-only task #410887 for sisyphus started by vt:
#100 build 8260-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Mar-11 05:55:38
2026-Mar-11 05:55:41 :: [aarch64] #100 llama.cpp.git 8260-alt1: build start
2026-Mar-11 05:55:41 :: [x86_64] #100 llama.cpp.git 8260-alt1: build start
2026-Mar-11 05:55:41 :: [i586] #100 llama.cpp.git 8260-alt1: build start
2026-Mar-11 05:55:48 :: [i586] #100 llama.cpp.git 8260-alt1: build SKIPPED
build/100/x86_64/log:[00:02:22] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:22] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Mar-11 06:00:59 :: [x86_64] #100 llama.cpp.git 8260-alt1: build OK
2026-Mar-11 06:02:44 :: [aarch64] #100 llama.cpp.git 8260-alt1: build OK
2026-Mar-11 06:02:52 :: 100: build check OK
2026-Mar-11 06:02:53 :: build check OK
2026-Mar-11 06:03:08 :: #100: llama.cpp.git 8260-alt1: version check OK
2026-Mar-11 06:03:08 :: build version check OK
--- llama.cpp-cpu-8260-alt1.x86_64.rpm.share    2026-03-11 06:03:11.869059412 +0000
+++ llama.cpp-cpu-8260-alt1.aarch64.rpm.share   2026-03-11 06:03:13.260071193 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text
 /usr/share/doc/llama.cpp/docs  40755   directory
warning (#100): non-identical /usr/share part
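For context (an illustrative sketch, not girar's actual implementation): the warning above arises because the per-arch /usr/share file listings are diffed, and any difference is reported. The file-list contents below are mocked up from the diff shown in the log; here build-options.txt is generated per arch, so its contents (and thus its file(1) classification) differ between x86_64 and aarch64.

```shell
# Fake two per-arch /usr/share listings (contents copied from the diff above).
printf '%s\n' \
  '/usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines' \
  '/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text, with very long lines' \
  > x86_64.share
printf '%s\n' \
  '/usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines' \
  '/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text' \
  > aarch64.share
# A non-empty diff corresponds to the "non-identical /usr/share part" warning.
diff -u x86_64.share aarch64.share || echo 'warning: non-identical /usr/share part'
```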
2026-Mar-11 06:03:31 :: noarch check OK
2026-Mar-11 06:03:33 :: plan: src +1 -1 =21708, aarch64 +8 -8 =38520, x86_64 +10 -10 =39535
#100 llama.cpp 8192-alt1 -> 1:8260-alt1
 Wed Mar 11 2026 Vitaly Chikunov <vt@altlinux> 1:8260-alt1
 - Update to b8260 (2026-03-09).
2026-Mar-11 06:04:25 :: patched apt indices
2026-Mar-11 06:04:37 :: created next repo
2026-Mar-11 06:04:49 :: duplicate provides check OK
2026-Mar-11 06:05:34 :: dependencies check OK
2026-Mar-11 06:06:17 :: [x86_64 aarch64] ELF symbols check OK
2026-Mar-11 06:06:28 :: [x86_64] #100 libllama: install check OK
2026-Mar-11 06:06:33 :: [x86_64] #100 libllama-debuginfo: install check OK
        x86_64: libllama-devel=1:8260-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-11 06:06:38 :: [x86_64] #100 libllama-devel: install check OK
2026-Mar-11 06:06:38 :: [aarch64] #100 libllama: install check OK
2026-Mar-11 06:06:50 :: [aarch64] #100 libllama-debuginfo: install check OK
2026-Mar-11 06:06:59 :: [x86_64] #100 llama.cpp: install check OK
        aarch64: libllama-devel=1:8260-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-11 06:07:01 :: [aarch64] #100 libllama-devel: install check OK
2026-Mar-11 06:07:09 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Mar-11 06:07:14 :: [aarch64] #100 llama.cpp: install check OK
2026-Mar-11 06:07:26 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Mar-11 06:07:26 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-11 06:07:43 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-11 06:07:56 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Mar-11 06:07:56 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Mar-11 06:08:11 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-11 06:08:28 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Mar-11 06:08:38 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Mar-11 06:08:48 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-11 06:09:07 :: [x86_64-i586] generated apt indices
2026-Mar-11 06:09:07 :: [x86_64-i586] created next repo
2026-Mar-11 06:09:20 :: [x86_64-i586] dependencies check OK
2026-Mar-11 06:09:20 :: gears inheritance check OK
2026-Mar-11 06:09:21 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Mar-11 06:09:21 :: acl check OK
2026-Mar-11 06:09:35 :: created contents_index files
2026-Mar-11 06:09:45 :: created hash files: aarch64 src x86_64
2026-Mar-11 06:09:49 :: task #410887 for sisyphus TESTED
_______________________________________________
Sisyphus-incominger mailing list
[email protected]
https://lists.altlinux.org/mailman/listinfo/sisyphus-incominger