llama_cpp_for_radxa_dragon_.../.devops
Name                          Last commit                                                                               Date
nix                           llama : remove MPI backend (#7395)                                                        2024-05-20 01:17:03 +02:00
cloud-v-pipeline
full-cuda.Dockerfile
full-rocm.Dockerfile
full.Dockerfile
llama-cpp-clblast.srpm.spec
llama-cpp-cuda.srpm.spec
llama-cpp.srpm.spec
main-cuda.Dockerfile
main-intel.Dockerfile         build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964)   2024-04-29 17:02:45 +01:00
main-rocm.Dockerfile
main-vulkan.Dockerfile        build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964)   2024-04-29 17:02:45 +01:00
main.Dockerfile
server-cuda.Dockerfile
server-intel.Dockerfile       build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964)   2024-04-29 17:02:45 +01:00
server-rocm.Dockerfile
server-vulkan.Dockerfile      build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964)   2024-04-29 17:02:45 +01:00
server.Dockerfile
tools.sh