llama.cpp
https://github.com/ggerganov/llama.cpp
The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.
- Devel package for openSUSE:Factory
- 7 derived packages
- Links to openSUSE:Factory / llamacpp
Checkout Package
osc -A https://api.opensuse.org checkout science:machinelearning/llamacpp && cd $_
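The checkout line above can be extended into a fuller local workflow. This is a sketch, assuming `osc` is installed and configured with an openSUSE account; the repository and architecture names passed to `osc build` are typical values, not confirmed by this page (list the valid ones for the project with `osc repos`).

```shell
# Check out the package from the science:machinelearning project
osc -A https://api.opensuse.org checkout science:machinelearning/llamacpp
cd science:machinelearning/llamacpp

# Build the package locally against openSUSE Factory for x86_64
# (repository/arch are assumptions; verify with `osc repos science:machinelearning`)
osc build openSUSE_Factory x86_64 llamacpp.spec
```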
Source Files (merged sources derived from linked package)

Filename | Size
---|---
_link | 124 Bytes
_service | 125 Bytes
llamacpp-6269.tar.gz | 24.4 MB
llamacpp.changes | 49.4 KB
llamacpp.spec | 6.13 KB