author    Danilo M. <danix@danix.xyz>  2026-04-04 13:42:17 +0200
committer Danilo M. <danix@danix.xyz>  2026-04-04 13:42:17 +0200
commit    54e835e12dc0ae538ac95eaf22859cca481077a6
tree      2d5edaf4f2f1ede1d8d0afa3829e91e9c1655f57
parent    a684ccf91608044667f0d2428c243ad65d7fd4a2
llama.cpp-vulkan: update to b8661
-rw-r--r--  README.md                                    | 2
-rw-r--r--  llama.cpp-vulkan/llama.cpp-vulkan.SlackBuild | 2
-rw-r--r--  llama.cpp-vulkan/llama.cpp-vulkan.info       | 6

3 files changed, 5 insertions, 5 deletions
```diff
@@ -33,7 +33,7 @@ Each package lives in its own top-level subfolder:
 |---------|----------|------|-----|---------|--------|
 | hstr | ✅ | not tested | [hstr](https://slackbuilds.org/repository/15.0/system/hstr/) | 3.1 | 3.2 |
 | kitty-bin | ✅ | not tested | ❌ | 0.46.2 | 0.46.2 |
-| llama.cpp-vulkan | ✅ | not tested | ❌ | b8611 | b8611 |
+| llama.cpp-vulkan | ✅ | not tested | ❌ | b8661 | b8661 |
 | qarma | ✅ | not tested | ❌ | 1.1.0 | 1.1.0 |
 | opencode-bin | ✅ | not tested | ❌ | 1.3.13 | 1.3.13 |
diff --git a/llama.cpp-vulkan/llama.cpp-vulkan.SlackBuild b/llama.cpp-vulkan/llama.cpp-vulkan.SlackBuild
index 07c72c1..34c3d00 100644
--- a/llama.cpp-vulkan/llama.cpp-vulkan.SlackBuild
+++ b/llama.cpp-vulkan/llama.cpp-vulkan.SlackBuild
@@ -26,7 +26,7 @@ cd $(dirname $0) ; CWD=$(pwd)
 PRGNAM=llama.cpp-vulkan
 SRCNAM=llama.cpp
-VERSION=${VERSION:-b8648}
+VERSION=${VERSION:-b8661}
 BUILD=${BUILD:-1}
 TAG=${TAG:-_SBo}
 PKGTYPE=${PKGTYPE:-tgz}
diff --git a/llama.cpp-vulkan/llama.cpp-vulkan.info b/llama.cpp-vulkan/llama.cpp-vulkan.info
index 4de8c29..9333b7e 100644
--- a/llama.cpp-vulkan/llama.cpp-vulkan.info
+++ b/llama.cpp-vulkan/llama.cpp-vulkan.info
@@ -1,8 +1,8 @@
 PRGNAM="llama.cpp-vulkan"
-VERSION="b8648"
+VERSION="b8661"
 HOMEPAGE="https://github.com/ggml-org/llama.cpp"
-DOWNLOAD="https://github.com/ggml-org/llama.cpp/archive/b8648/llama.cpp-b8648.tar.gz"
-MD5SUM="4f971e9b14d2480732eecc21d4f75387"
+DOWNLOAD="https://github.com/ggml-org/llama.cpp/archive/b8661/llama.cpp-b8661.tar.gz"
+MD5SUM="97a0d99a27f5204db8b52302e60dd0bf"
 DOWNLOAD_x86_64=""
 MD5SUM_x86_64=""
 REQUIRES=""
```
