llama.cpp
https://github.com/ggerganov/llama.cpp
The main goal of llama.cpp is to enable LLM inference with minimal setup and state-of-the-art performance on a wide variety of hardware - locally and in the cloud.
- Links to science:machinelearning / llamacpp
- Has a link diff
Checkout Package
osc -A https://api.opensuse.org checkout home:alucardx:ai/llamacpp && cd $_
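After checkout, the package can usually be built locally with osc. This page does not list the project's build targets, so the repository and architecture below are assumptions; check the real ones with `osc repos` first.

```sh
# Inside the checked-out package directory:
# list the repositories actually configured for home:alucardx:ai,
# then build against one of them (names below are assumptions).
osc repos
osc build openSUSE_Tumbleweed x86_64 llamacpp.spec
```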
Source Files
| Filename | Size |
|---|---|
| 0001-dl-load-path.patch | 482 Bytes |
| _link | 131 Bytes |
| _service | 125 Bytes |
| llamacpp-5889.tar.gz | 24 MB |
| llamacpp.changes | 40.2 KB |
| llamacpp.spec | 4.57 KB |
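Since this package links to science:machinelearning / llamacpp, the _link file and the link diff noted above can be inspected from the command line. A sketch, assuming the same API URL as the checkout command:

```sh
# Print the _link file stored in this package (expected to point at
# science:machinelearning/llamacpp, per the link noted above).
osc -A https://api.opensuse.org cat home:alucardx:ai llamacpp _link

# Show the diff between the link target and this package
# (the "link diff" mentioned above).
osc -A https://api.opensuse.org rdiff science:machinelearning llamacpp home:alucardx:ai llamacpp
```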