https://git.altlinux.org/tasks/archive/done/_400/409907/logs/events.2.1.log
https://packages.altlinux.org/tasks/409907

2026-Mar-03 23:30:55 :: task #409907 for sisyphus resumed by vt:
#100 build 8192-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Mar-03 22:36:39
2026-Mar-03 23:30:56 :: [i586] #100 llama.cpp.git 8192-alt1: build start
2026-Mar-03 23:30:56 :: [aarch64] #100 llama.cpp.git 8192-alt1: build start
2026-Mar-03 23:30:56 :: [x86_64] #100 llama.cpp.git 8192-alt1: build start
2026-Mar-03 23:31:03 :: [i586] #100 llama.cpp.git 8192-alt1: build SKIPPED
build/100/x86_64/log:[00:02:14] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:14] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Mar-03 23:31:11 :: [x86_64] #100 llama.cpp.git 8192-alt1: build OK (cached)
2026-Mar-03 23:31:12 :: [aarch64] #100 llama.cpp.git 8192-alt1: build OK (cached)
2026-Mar-03 23:31:12 :: 100: build check OK (cached)
2026-Mar-03 23:31:13 :: build check OK
2026-Mar-03 23:31:25 :: #100: llama.cpp.git 8192-alt1: version check OK
2026-Mar-03 23:31:26 :: build version check OK
--- llama.cpp-cpu-8192-alt1.x86_64.rpm.share    2026-03-03 23:31:29.149058513 +0000
+++ llama.cpp-cpu-8192-alt1.aarch64.rpm.share   2026-03-03 23:31:30.208068112 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text
 /usr/share/doc/llama.cpp/docs  40755   directory
warning (#100): non-identical /usr/share part
2026-Mar-03 23:31:43 :: noarch check OK
2026-Mar-03 23:31:45 :: plan: src +1 -1 =21664, aarch64 +8 -8 =38437, x86_64 +10 -10 =39454
#100 llama.cpp 8018-alt1 -> 1:8192-alt1
 Tue Mar 03 2026 Vitaly Chikunov <vt@altlinux> 1:8192-alt1
 - Update to b8192 (2026-03-03).
2026-Mar-03 23:32:26 :: patched apt indices
2026-Mar-03 23:32:34 :: created next repo
2026-Mar-03 23:32:45 :: duplicate provides check OK
2026-Mar-03 23:33:23 :: dependencies check OK
2026-Mar-03 23:33:55 :: [x86_64 aarch64] ELF symbols check OK
2026-Mar-03 23:34:04 :: [x86_64] #100 libllama: install check OK (cached)
2026-Mar-03 23:34:07 :: [x86_64] #100 libllama-debuginfo: install check OK (cached)
        x86_64: libllama-devel=1:8192-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-03 23:34:10 :: [x86_64] #100 libllama-devel: install check OK (cached)
2026-Mar-03 23:34:11 :: [aarch64] #100 libllama: install check OK (cached)
2026-Mar-03 23:34:13 :: [x86_64] #100 llama.cpp: install check OK (cached)
2026-Mar-03 23:34:16 :: [x86_64] #100 llama.cpp-cpu: install check OK (cached)
2026-Mar-03 23:34:17 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
2026-Mar-03 23:34:19 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2026-Mar-03 23:34:23 :: [x86_64] #100 llama.cpp-cuda: install check OK (cached)
        aarch64: libllama-devel=1:8192-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-03 23:34:23 :: [aarch64] #100 libllama-devel: install check OK (cached)
2026-Mar-03 23:34:27 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK (cached)
2026-Mar-03 23:34:30 :: [aarch64] #100 llama.cpp: install check OK (cached)
2026-Mar-03 23:34:30 :: [x86_64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Mar-03 23:34:33 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Mar-03 23:34:36 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2026-Mar-03 23:34:42 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2026-Mar-03 23:34:48 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Mar-03 23:34:55 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Mar-03 23:35:12 :: [x86_64-i586] generated apt indices
2026-Mar-03 23:35:12 :: [x86_64-i586] created next repo
2026-Mar-03 23:35:22 :: [x86_64-i586] dependencies check OK
2026-Mar-03 23:35:23 :: gears inheritance check OK
2026-Mar-03 23:35:23 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Mar-03 23:35:24 :: acl check OK
2026-Mar-03 23:35:36 :: created contents_index files
2026-Mar-03 23:35:44 :: created hash files: aarch64 src x86_64
2026-Mar-03 23:35:46 :: task #409907 for sisyphus TESTED
2026-Mar-03 23:35:47 :: task is ready for commit
2026-Mar-03 23:35:52 :: repo clone OK
2026-Mar-03 23:35:52 :: packages update OK
2026-Mar-03 23:35:58 :: [x86_64 aarch64] update OK
2026-Mar-03 23:35:58 :: repo update OK
2026-Mar-03 23:36:09 :: repo save OK
2026-Mar-03 23:36:09 :: src index update OK
2026-Mar-03 23:36:11 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2026-Mar-03 23:36:30 :: gears update OK
2026-Mar-03 23:36:30 :: task #409907 for sisyphus DONE
_______________________________________________
Sisyphus-incominger mailing list
[email protected]
https://lists.altlinux.org/mailman/listinfo/sisyphus-incominger