* gnu/packages/machine-learning.scm (llama-cpp): Update to 0.0.0-b5013.
[inputs]: Add curl, glslang, and python-gguf.
[native-inputs]: Replace bash with bash-minimal.
[source, homepage]: Update URLs.
[python-scripts]: Delete phase; rely on upstream to install these scripts.
[fix-tests]: Fix an additional test.
(python-gguf): Switch to llama-cpp's version.
* gnu/packages/patches/llama-cpp-vulkan-optional.patch: Delete.
* gnu/local.mk (dist_patch_DATA): Remove it.
Change-Id: Ic297534cd142cb83e3964eae21b4eb807b74e9bc
Signed-off-by: Danny Milosavljevic <dannym@friendly-machines.com>
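For illustration, the input changes named above might appear in the llama-cpp package definition roughly as follows; this is a sketch only, showing just the items named in the entry (other inputs and fields are omitted, and their exact arrangement is an assumption):

  (inputs
   (list curl glslang python-gguf))   ;inputs added here; pre-existing inputs omitted
  (native-inputs
   (list bash-minimal))               ;bash replaced by bash-minimal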
* gnu/packages/patches/llama-cpp-vulkan-optional.patch: Modify.
Change-Id: I58816f098a0da2b75cea5f90bda91bcf0bfe60d1
* gnu/packages/patches/llama-cpp-vulkan-optional.patch: Make the runtime check
safer.
Change-Id: If72148fb3e8bf500d35c0987126a788ec410cdbd
* gnu/packages/patches/llama-cpp-vulkan-optional.patch: New file.
* gnu/local.mk (dist_patch_DATA): Add it.
* gnu/packages/machine-learning.scm (llama-cpp)
[source]: Add patch.
[arguments]<#:tests?>: Disable.
<#:configure-flags>: Add "-DGGML_VULKAN=ON".
<#:phases>[patch-paths]: New phase.
[inputs]: Add vulkan-headers, vulkan-loader.
[native-inputs]: Add shaderc, bash.
Change-Id: Ib7a58f5c7f622213f3aaf5abcd701b17eed80f6b
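For illustration, the argument and input changes listed above might look roughly like this inside the llama-cpp package definition; the gexp style and the placeholder body of the 'patch-paths phase are assumptions, and only the items named in the entry are shown:

  (arguments
   (list
    #:tests? #f                                   ;tests disabled
    #:configure-flags #~(list "-DGGML_VULKAN=ON") ;other flags, if any, omitted
    #:phases
    #~(modify-phases %standard-phases
        (add-after 'unpack 'patch-paths
          (lambda* (#:key inputs #:allow-other-keys)
            ;; Placeholder: the substitutions this phase performs are not
            ;; given in the log.
            #t)))))
  (inputs (list vulkan-headers vulkan-loader))
  (native-inputs (list shaderc bash))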