fix: use llama-cpp-python 0.3.7 #125
Annotations
2 errors
Build image (./chat/base/Containerfile, ai-lab-playground-chat, amd64, arm64)
Error: buildah exited with code 1
Trying to pull registry.access.redhat.com/ubi9-minimal:9.5-1739420147...
Getting image source signatures
Copying blob sha256:5025173ec0b35686a33458b367c2a6e898c824f57a07925c25d26a0cfb5f2e50
Copying blob sha256:3333307dcd2e4279579646a05a5f99082a61a20906175240445b0e15f73b6d6e
Copying config sha256:da8c0ca2c3f40c149598f1a3b5f3987911fb20187b079c677d3ed0aad2e431d0
Writing manifest to image destination
(microdnf:2): librhsm-WARNING **: 17:37:49.886: Found 0 entitlement certificates
(microdnf:2): librhsm-WARNING **: 17:37:49.889: Found 0 entitlement certificates
Created symlink /etc/systemd/system/sockets.target.wants/dbus.socket → /usr/lib/systemd/system/dbus.socket.
Created symlink /etc/systemd/user/sockets.target.wants/dbus.socket → /usr/lib/systemd/user/dbus.socket.
Created symlink /etc/systemd/system/dbus.service → /usr/lib/systemd/system/dbus-broker.service.
Created symlink /etc/systemd/user/dbus.service → /usr/lib/systemd/user/dbus-broker.service.
(microdnf:1): librhsm-WARNING **: 17:38:13.332: Found 0 entitlement certificates
(microdnf:1): librhsm-WARNING **: 17:38:13.335: Found 0 entitlement certificates
ERROR: Command errored out with exit status 1:
command: /usr/bin/python3 /usr/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmpb9ilp3t3
cwd: /tmp/pip-install-gy8o7nll/llama-cpp-python_a484dc9772b743429b53443580e13b6b
Complete output (40 lines):
*** scikit-build-core 0.10.7 using CMake 3.31.4 (wheel)
*** Configuring CMake...
loading initial cache file /tmp/tmpdy1ivoj_/build/CMakeInit.txt
-- The C compiler identification is GNU 11.5.0
-- The CXX compiler identification is GNU 11.5.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Could NOT find Git (missing: GIT_EXECUTABLE)
CMake Warning at vendor/llama.cpp/cmake/build-info.cmake:14 (message):
Git not found. Build info will not be accurate.
Call Stack (most recent call first):
vendor/llama.cpp/CMakeLists.txt:85 (include)
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- Including CPU backend
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- x86 detected
-- Adding CPU backend variant ggml-cpu: -march=native
CMake Error at vendor/llama.cpp/ggml/CMakeLists.txt:277 (find_program):
Could not find GIT_EXE using the following names: git, git.exe
-- Configuring incomplete, errors occurred!
*** CMake configuration failed
----------------------------------------
ERROR: Failed building wheel for llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
Error: building at STEP "RUN pip install --no-cache-dir --upgrade -r requirements.txt": while running runtime: exit status 1
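The root cause of this first failure is visible in the log above: `vendor/llama.cpp/ggml/CMakeLists.txt:277` calls `find_program` for `git`/`git.exe`, and the `ubi9-minimal` base image ships without git, so CMake configuration aborts before compilation even starts ("Could not find GIT_EXE"). A minimal sketch of a fix, assuming the Containerfile uses `microdnf` (as the `librhsm` warnings suggest) and installs Python dependencies from `requirements.txt` — the exact base tag and package list in the real `./chat/base/Containerfile` may differ; the essential change is installing `git` before the `pip install` step:

```dockerfile
FROM registry.access.redhat.com/ubi9-minimal:9.5-1739420147

# git is required by llama.cpp's CMake scripts (build-info.cmake and
# ggml/CMakeLists.txt); without it the llama-cpp-python wheel build
# fails at the CMake configure stage.
RUN microdnf install -y git gcc gcc-c++ cmake python3 python3-pip && \
    microdnf clean all

COPY requirements.txt .
RUN pip install --no-cache-dir --upgrade -r requirements.txt
```

Note that the second (arm64 Vulkan) build below does find git (`Found Git: /usr/bin/git`), so this change applies to the minimal base image only.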
Build image (./chat/vulkan/arm64/Containerfile, ai-lab-playground-chat-vulkan, arm64)
Error: buildah exited with code 1
Trying to pull registry.access.redhat.com/ubi9/python-311:9.5-1737537151...
Getting image source signatures
Copying blob sha256:17f7af7a37d4b6da17d2725f33537953d09fe9cf30df676b1d1dd561e35971ab
Copying blob sha256:5dd38a596bb444ae49b733ab06ccea100da012ff5a59aae67fa5c6d5d2599092
Copying blob sha256:455f1f328896ce3196f2e69e7166cf25f08552be4e0acb025a4b1e8f457e0dea
Copying blob sha256:3fc3ce98128d7b8cf9476e1485ed58dcd7cf72dc07ea0597e8a8e09412ce530c
Copying blob sha256:f999dbdae714d45c5dfdb9663ce8a4dacdeb39839c23fbdee19edd1ce2645e53
Copying blob sha256:6fa5b5438884b2920c9635bf7c0f41f9fc4c743676d5f55d28833d60574e270d
Copying blob sha256:9941b1ecd2639396c7340d40066bada86011af8281035a17ec0f2aee4a5705af
Copying blob sha256:809859c1ad72a0ead98feee12de858779e889215d07bab07e5586db9654c4345
Copying config sha256:2d596928f7c79fd66ef821e65fd2be42396cdf134517bd796313ad86d53ff652
Writing manifest to image destination
Enabling a Copr repository. Please note that this repository is not part
of the main distribution, and quality may vary.
The Fedora Project does not exercise any power over the contents of
this repository beyond the rules outlined in the Copr FAQ at
<https://docs.pagure.org/copr.copr/user_documentation.html#what-i-can-build-in-copr>,
and packages are not held to any quality or security level.
Please do not file bug reports about these packages in Fedora
Bugzilla. In case of problems, contact the owner of this repository.
Importing GPG key 0x7BA6947F:
Userid : "slp_mesa-krunkit (None) <slp#[email protected]>"
Fingerprint: C962 C887 AE35 6588 B601 6773 6E54 C94F 7BA6 947F
From : https://download.copr.fedorainfracloud.org/results/slp/mesa-krunkit/pubkey.gpg
Enabling a Copr repository. Please note that this repository is not part
of the main distribution, and quality may vary.
The Fedora Project does not exercise any power over the contents of
this repository beyond the rules outlined in the Copr FAQ at
<https://docs.pagure.org/copr.copr/user_documentation.html#what-i-can-build-in-copr>,
and packages are not held to any quality or security level.
Please do not file bug reports about these packages in Fedora
Bugzilla. In case of problems, contact the owner of this repository.
Importing GPG key 0x1274A9E9:
Userid : "jeffmaury_shaderc (None) <jeffmaury#[email protected]>"
Fingerprint: 0039 3D1A 5866 7855 3C65 64EC D813 4F39 1274 A9E9
From : https://download.copr.fedorainfracloud.org/results/jeffmaury/shaderc/pubkey.gpg
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [96 lines of output]
*** scikit-build-core 0.10.7 using CMake 3.26.5 (wheel)
*** Configuring CMake...
loading initial cache file /tmp/tmpgm04yxso/build/CMakeInit.txt
-- The C compiler identification is GNU 11.5.0
-- The CXX compiler identification is GNU 11.5.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found Git: /usr/bin/git (found version "2.43.5")
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: aarch64
-- Including CPU backend
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- ARM detected
-- Performing Test GGML_COMPILER_SUPPORTS_FP16_FORMAT_I3
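The second (arm64 Vulkan) log is truncated mid-configure, during the `GGML_COMPILER_SUPPORTS_FP16_FORMAT_I3` test, so its actual failure point is not visible here. To capture the full 96-line CMake output, the wheel build can be reproduced outside buildah; a sketch, assuming network access to PyPI (`--no-binary` forces the same from-source build path that failed in CI):

```shell
# Rebuild the llama-cpp-python 0.3.7 wheel from source with full logs.
# --verbose surfaces the complete CMake configure/compile output that
# the CI annotation truncates.
pip install --no-cache-dir --verbose --no-binary llama-cpp-python \
    llama-cpp-python==0.3.7
```

llama-cpp-python also forwards CMake options through the `CMAKE_ARGS` environment variable (e.g. to toggle backend flags), which can help bisect a configure-stage failure once the full output is in hand.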