Ollama
https://ollama.com
Ollama is a local runtime and CLI for running and managing large language models (LLMs) directly on your machine.
- Devel package for openSUSE:Factory
- 14 derived packages
- Links to openSUSE:Factory / ollama
- Checkout package:
  osc -A https://api.opensuse.org checkout science:machinelearning/ollama && cd $_
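The trailing `cd $_` works because bash and zsh set `$_` to the last argument of the previous command — here, the directory `osc` just checked out. A stand-in illustration that needs no `osc` (the path is an example, not the real checkout):

```shell
# `$_` holds the last argument of the previous command (bash/zsh),
# so `osc checkout ... && cd $_` lands in the fresh checkout.
mkdir -p /tmp/pkg-demo   # stands in for `osc checkout ...`
cd $_                    # $_ == /tmp/pkg-demo
pwd                      # prints /tmp/pkg-demo
```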
Source Files
| Filename | Size |
|---|---|
| _link | 124 Bytes |
| _service | 302 Bytes |
| ollama-0.12.6.tar.gz | 10.8 MB |
| ollama-user.conf | 154 Bytes |
| ollama.changes | 62.1 KB |
| ollama.service | 232 Bytes |
| ollama.spec | 3.69 KB |
| sysconfig.ollama | 1.39 KB |
| vendor.tar.zstd | 5.38 MB |
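The packaged ollama.service is only 232 bytes, so it is likely a minimal unit that runs the server as the dedicated user and pulls environment from sysconfig.ollama. A hypothetical sketch of such a unit — the directives, paths, and ordering here are assumptions, not the packaged file:

```ini
# Hypothetical sketch; the real ollama.service may differ.
[Unit]
Description=Ollama LLM runtime
After=network-online.target

[Service]
User=ollama
EnvironmentFile=-/etc/sysconfig/ollama
ExecStart=/usr/bin/ollama serve

[Install]
WantedBy=multi-user.target
```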
Comments (5)
maybe add ollama to video group?
What is your use case? Can't ollama access the GPU without it?
Yes, ollama needs access to /dev/kfd to utilize the GPU
Done!
thank you!!!
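The fix presumably lives in ollama-user.conf, since systemd-sysusers snippets can both create the service user (`u` lines) and add it to supplementary groups (`m` lines). A hypothetical sketch — the actual file contents may differ:

```
# Hypothetical sysusers.d snippet (cf. ollama-user.conf).
# The `m` line adds the ollama user to the video group so the
# service can open /dev/kfd (the AMD GPU compute device).
u ollama - "Ollama service user" /var/lib/ollama
m ollama video
```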