https://git.altlinux.org/tasks/350238/logs/events.5.1.log

subtask  name       aarch64  i586  ppc64le  x86_64
   #500  llama.cpp     6:20     -        -    3:19

2024-Jun-03 11:42:30 :: test-only task #350238 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 build 0-alt1 from /people/vt/packages/tinyllamas.git fetched at 2024-Jun-03 10:21:51
#400 removed
#500 build 3072-alt1.20240603 from /people/vt/packages/llama.cpp.git fetched at 2024-Jun-03 11:42:28
2024-Jun-03 11:42:30 :: created build repo
2024-Jun-03 11:42:32 :: [i586] #300 tinyllamas.git 0-alt1: build start
2024-Jun-03 11:42:32 :: [aarch64] #300 tinyllamas.git 0-alt1: build start
2024-Jun-03 11:42:32 :: [ppc64le] #300 tinyllamas.git 0-alt1: build start
2024-Jun-03 11:42:32 :: [x86_64] #300 tinyllamas.git 0-alt1: build start
2024-Jun-03 11:42:49 :: [i586] #300 tinyllamas.git 0-alt1: build OK (cached)
2024-Jun-03 11:42:50 :: [i586] #500 llama.cpp.git 3072-alt1.20240603: build start
2024-Jun-03 11:42:50 :: [x86_64] #300 tinyllamas.git 0-alt1: build OK (cached)
2024-Jun-03 11:42:50 :: [x86_64] #500 llama.cpp.git 3072-alt1.20240603: build start
2024-Jun-03 11:43:03 :: [i586] #500 llama.cpp.git 3072-alt1.20240603: build SKIPPED
2024-Jun-03 11:43:05 :: [aarch64] #300 tinyllamas.git 0-alt1: build OK (cached)
2024-Jun-03 11:43:06 :: [aarch64] #500 llama.cpp.git 3072-alt1.20240603: build start
2024-Jun-03 11:43:13 :: [ppc64le] #300 tinyllamas.git 0-alt1: build OK (cached)
2024-Jun-03 11:43:13 :: [ppc64le] #500 llama.cpp.git 3072-alt1.20240603: build start
2024-Jun-03 11:43:38 :: [ppc64le] #500 llama.cpp.git 3072-alt1.20240603: build SKIPPED
2024-Jun-03 11:46:09 :: [x86_64] #500 llama.cpp.git 3072-alt1.20240603: build OK
2024-Jun-03 11:49:26 :: [aarch64] #500 llama.cpp.git 3072-alt1.20240603: build OK
2024-Jun-03 11:49:35 :: #300: tinyllamas.git 0-alt1: build check OK
2024-Jun-03 11:49:45 :: #500: llama.cpp.git 3072-alt1.20240603: build check OK
2024-Jun-03 11:49:47 :: build check OK
2024-Jun-03 11:50:12 :: noarch check OK
2024-Jun-03 11:50:14 :: plan: src +2 -1 =19361, aarch64 +2 -2 =32841, noarch +1 -0 =20160, x86_64 +2 -2 =33683
#500 llama.cpp 20240527-alt1 -> 1:3072-alt1.20240603
 Mon Jun 03 2024 Vitaly Chikunov <vt@altlinux> 1:3072-alt1.20240603
 - Update to b3072 (2024-06-03).
 - The version scheme now matches the upstream build number more closely,
   instead of using the commit date.
 - Build with libcurl and OpenBLAS support.
2024-Jun-03 11:50:56 :: patched apt indices
2024-Jun-03 11:51:07 :: created next repo
2024-Jun-03 11:51:17 :: duplicate provides check OK
2024-Jun-03 11:51:56 :: dependencies check OK
2024-Jun-03 11:52:23 :: [x86_64 aarch64] ELF symbols check OK
2024-Jun-03 11:52:34 :: [i586] #300 tinyllamas-gguf: install check OK (cached)
        x86_64: llama.cpp=1:3072-alt1.20240603 post-install unowned files:
 /usr/lib/systemd/system/llama.service
2024-Jun-03 11:52:41 :: [x86_64] #500 llama.cpp: install check OK
2024-Jun-03 11:52:42 :: [ppc64le] #300 tinyllamas-gguf: install check OK (cached)
        aarch64: llama.cpp=1:3072-alt1.20240603 post-install unowned files:
 /usr/lib/systemd/system/llama.service
2024-Jun-03 11:52:50 :: [aarch64] #500 llama.cpp: install check OK
2024-Jun-03 11:53:02 :: [x86_64] #500 llama.cpp-debuginfo: install check OK
2024-Jun-03 11:53:07 :: [x86_64] #300 tinyllamas-gguf: install check OK (cached)
2024-Jun-03 11:53:23 :: [aarch64] #500 llama.cpp-debuginfo: install check OK
2024-Jun-03 11:53:31 :: [aarch64] #300 tinyllamas-gguf: install check OK (cached)
2024-Jun-03 11:53:48 :: [x86_64-i586] generated apt indices
2024-Jun-03 11:53:48 :: [x86_64-i586] created next repo
2024-Jun-03 11:53:59 :: [x86_64-i586] dependencies check OK
2024-Jun-03 11:54:01 :: gears inheritance check OK
2024-Jun-03 11:54:01 :: srpm inheritance check OK
girar-check-perms: access to tinyllamas-gguf ALLOWED for vt: project `tinyllamas-gguf' is not listed in the acl file for repository `sisyphus', and the policy for such projects in `sisyphus' is to allow
check-subtask-perms: #300: tinyllamas-gguf: allowed for vt
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #500: llama.cpp: allowed for vt
2024-Jun-03 11:54:02 :: acl check OK
2024-Jun-03 11:54:23 :: created contents_index files
2024-Jun-03 11:54:36 :: created hash files: aarch64 noarch src x86_64
2024-Jun-03 11:54:40 :: task #350238 for sisyphus TESTED
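
Note on the "post-install unowned files" warnings above: /usr/lib/systemd/system/llama.service is installed by llama.cpp but not claimed by any package. A minimal sketch of how the unit file could be listed in the package's %files section (the surrounding spec context is an assumption; only the file path comes from this log):

    %files
    # hypothetical excerpt: declare the systemd unit so rpm owns it
    /usr/lib/systemd/system/llama.service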