Commit history for gnu/packages/patches/llama-cpp-vulkan-optional.patch (GNU Guix)

2025-04-25  gnu: llama-cpp: Update to 0.0.0-b5013.  (Morgan Smith)

* gnu/packages/machine-learning.scm (llama-cpp): Update to 0.0.0-b5013.
[inputs]: Add curl, glslang, and python-gguf.
[native-inputs]: bash -> bash-minimal.
[source, homepage]: Update URL.
[python-scripts]: Rely on upstream to install them.  Delete phase.
[fix-tests]: Fix an additional test.
(python-gguf): Switch to llama-cpp's version.
* gnu/packages/patches/llama-cpp-vulkan-optional.patch: Delete.
* gnu/local.mk: Unregister patch.

Change-Id: Ic297534cd142cb83e3964eae21b4eb807b74e9bc
Signed-off-by: Danny Milosavljevic <dannym@friendly-machines.com>

2025-02-08  gnu: llama-cpp: Prevent undefined behavior.  (Danny Milosavljevic)

* gnu/packages/patches/llama-cpp-vulkan-optional.patch: Modify.

Change-Id: I58816f098a0da2b75cea5f90bda91bcf0bfe60d1

2025-02-02  gnu: llama-cpp: Make the runtime check safer.  (Danny Milosavljevic)

* gnu/packages/patches/llama-cpp-vulkan-optional.patch: Make the runtime check safer.

Change-Id: If72148fb3e8bf500d35c0987126a788ec410cdbd

2025-01-29  gnu: llama-cpp: Enable Vulkan.  (Danny Milosavljevic)

* gnu/packages/patches/llama-cpp-vulkan-optional.patch: New file.
* gnu/local.mk (dist_patch_DATA): Add it.
* gnu/packages/machine-learning.scm (llama-cpp) [source]: Add patch.
[arguments]<#:tests?>: Disable.
<#:configure-flags>: Add "-DGGML_VULKAN=ON".
<#:phases>[patch-paths]: New phase.
[inputs]: Add vulkan-headers, vulkan-loader.
[native-inputs]: Add shaderc, bash.

Change-Id: Ib7a58f5c7f622213f3aaf5abcd701b17eed80f6b

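For orientation, here is a rough, hypothetical sketch of how the changes listed in this entry could look inside gnu/packages/machine-learning.scm, assuming that module's usual imports (packages, gexps, the CMake build system, licenses). Only the patch name, the disabled tests, the "-DGGML_VULKAN=ON" flag, the patch-paths phase name, and the added inputs/native-inputs come from the commit message above; the version, commit tag, hash, metadata fields, and the body of the patch-paths phase are placeholders, not the code that was actually committed.

  ;; Hypothetical sketch, not the committed definition; placeholders are
  ;; marked in comments.
  (define-public llama-cpp
    (package
      (name "llama-cpp")
      (version "0.0.0-bNNNN")                      ;placeholder version
      (source
       (origin
         (method git-fetch)
         (uri (git-reference
               (url "https://github.com/ggerganov/llama.cpp")
               (commit "bNNNN")))                  ;placeholder tag
         (file-name (git-file-name name version))
         (sha256
          (base32
           "0000000000000000000000000000000000000000000000000000")) ;placeholder
         ;; [source]: Add patch.
         (patches (search-patches "llama-cpp-vulkan-optional.patch"))))
      (build-system cmake-build-system)
      (arguments
       (list
        ;; [arguments]<#:tests?>: Disable.
        #:tests? #f
        ;; <#:configure-flags>: Add "-DGGML_VULKAN=ON".
        #:configure-flags #~(list "-DGGML_VULKAN=ON")
        ;; <#:phases>[patch-paths]: New phase.
        #:phases
        #~(modify-phases %standard-phases
            (add-after 'unpack 'patch-paths
              (lambda* (#:key inputs #:allow-other-keys)
                ;; Placeholder body: point a hard-coded interpreter path at
                ;; the store; the file name here is made up for illustration.
                (substitute* "examples/example.sh"
                  (("/bin/bash")
                   (search-input-file inputs "bin/bash"))))))))
      ;; [inputs]: Add vulkan-headers, vulkan-loader.
      (inputs (list vulkan-headers vulkan-loader))
      ;; [native-inputs]: Add shaderc, bash.
      (native-inputs (list bash shaderc))
      (home-page "https://github.com/ggerganov/llama.cpp")
      (synopsis "Placeholder synopsis")
      (description "Placeholder description.")
      (license license:expat)))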