fix: use llama-cpp-python 0.3.7 #125
Annotations
1 error
Build Playground Image
Error: buildah exited with code 1
Trying to pull registry.access.redhat.com/ubi9-minimal:9.5-1739420147...
Getting image source signatures
Copying blob sha256:5025173ec0b35686a33458b367c2a6e898c824f57a07925c25d26a0cfb5f2e50
Copying blob sha256:3333307dcd2e4279579646a05a5f99082a61a20906175240445b0e15f73b6d6e
Copying config sha256:da8c0ca2c3f40c149598f1a3b5f3987911fb20187b079c677d3ed0aad2e431d0
Writing manifest to image destination
(microdnf:2): librhsm-WARNING **: 17:37:49.886: Found 0 entitlement certificates
(microdnf:2): librhsm-WARNING **: 17:37:49.889: Found 0 entitlement certificates
Created symlink /etc/systemd/system/sockets.target.wants/dbus.socket → /usr/lib/systemd/system/dbus.socket.
Created symlink /etc/systemd/user/sockets.target.wants/dbus.socket → /usr/lib/systemd/user/dbus.socket.
Created symlink /etc/systemd/system/dbus.service → /usr/lib/systemd/system/dbus-broker.service.
Created symlink /etc/systemd/user/dbus.service → /usr/lib/systemd/user/dbus-broker.service.
(microdnf:1): librhsm-WARNING **: 17:38:13.332: Found 0 entitlement certificates
(microdnf:1): librhsm-WARNING **: 17:38:13.335: Found 0 entitlement certificates
ERROR: Command errored out with exit status 1:
command: /usr/bin/python3 /usr/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /tmp/tmpb9ilp3t3
cwd: /tmp/pip-install-gy8o7nll/llama-cpp-python_a484dc9772b743429b53443580e13b6b
Complete output (40 lines):
*** scikit-build-core 0.10.7 using CMake 3.31.4 (wheel)
*** Configuring CMake...
loading initial cache file /tmp/tmpdy1ivoj_/build/CMakeInit.txt
-- The C compiler identification is GNU 11.5.0
-- The CXX compiler identification is GNU 11.5.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Could NOT find Git (missing: GIT_EXECUTABLE)
CMake Warning at vendor/llama.cpp/cmake/build-info.cmake:14 (message):
Git not found. Build info will not be accurate.
Call Stack (most recent call first):
vendor/llama.cpp/CMakeLists.txt:85 (include)
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Warning: ccache not found - consider installing it for faster compilation or disable this warning with GGML_CCACHE=OFF
-- CMAKE_SYSTEM_PROCESSOR: x86_64
-- Including CPU backend
-- Found OpenMP_C: -fopenmp (found version "4.5")
-- Found OpenMP_CXX: -fopenmp (found version "4.5")
-- Found OpenMP: TRUE (found version "4.5")
-- x86 detected
-- Adding CPU backend variant ggml-cpu: -march=native
CMake Error at vendor/llama.cpp/ggml/CMakeLists.txt:277 (find_program):
Could not find GIT_EXE using the following names: git, git.exe
-- Configuring incomplete, errors occurred!
*** CMake configuration failed
----------------------------------------
ERROR: Failed building wheel for llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
Error: building at STEP "RUN pip install --no-cache-dir --upgrade -r requirements.txt": while running runtime: exit status 1
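The failure is not specific to llama-cpp-python 0.3.7 itself: the wheel build dies because `ubi9-minimal` ships without git, and llama.cpp's CMake (`vendor/llama.cpp/ggml/CMakeLists.txt:277`) treats a missing `git`/`git.exe` as a hard configure error, not just the earlier build-info warning. A minimal sketch of one possible fix is below; it assumes the image is built from a Containerfile based on the `ubi9-minimal` tag seen in the log, and the `requirements.txt` location is illustrative, not taken from the actual repo.

```dockerfile
# Sketch only: base tag copied from the log; layout of the real Containerfile is assumed.
FROM registry.access.redhat.com/ubi9-minimal:9.5-1739420147

# Install git before pip runs: the llama.cpp CMake configure step calls
# find_program for git and aborts the whole wheel build when it is absent.
# gcc/g++ and cmake are also needed to compile the native extension.
RUN microdnf install -y git-core gcc gcc-c++ cmake python3-pip && \
    microdnf clean all

COPY requirements.txt .
RUN pip install --no-cache-dir --upgrade -r requirements.txt
```

An alternative that avoids compiling entirely is to install a prebuilt `llama-cpp-python` wheel where one exists for the target platform, but for a source build in this image, adding `git-core` to the builder stage is the direct remedy for the `Could not find GIT_EXE` error.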