# Ollama

https://ollama.com

Ollama is a local runtime and CLI for running and managing large language models (LLMs) directly on your machine.
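As a quick orientation, the CLI workflow is pull, run, list; a minimal sketch (the model name is only an example — any model from the Ollama library works the same way):

```shell
# Download a model to the local store (example model name).
ollama pull llama3.2

# Run a one-shot prompt against it.
ollama run llama3.2 "Why is the sky blue?"

# Show the models installed on this machine.
ollama list
```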
- Devel package for openSUSE:Factory
- 12 derived packages
- Links to openSUSE:Factory / ollama
- Checkout package: `osc -A https://api.opensuse.org checkout science:machinelearning/ollama && cd $_`
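The checkout command can be followed by a local build of the package; a minimal sketch (the repository and architecture names are examples — list the valid pairs with `osc repositories`):

```shell
# Check out the package and enter its directory.
osc -A https://api.opensuse.org checkout science:machinelearning/ollama && cd $_

# See which repository/architecture pairs this package builds for.
osc repositories

# Build locally against one of them (example pair shown).
osc build openSUSE_Factory x86_64
```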
## Source Files
| Filename | Size |
|---|---|
| _link | 124 Bytes |
| _scmsync.obsinfo | 149 Bytes |
| _service | 302 Bytes |
| build.specials.obscpio | 284 Bytes |
| ollama-0.12.10.tar.gz | 20.1 MB |
| ollama-user.conf | 129 Bytes |
| ollama.changes | 65.2 KB |
| ollama.service | 232 Bytes |
| ollama.spec | 6.92 KB |
| sysconfig.ollama | 1.39 KB |
| vendor.tar.zstd | 5.36 MB |
## Comments (5)

> maybe add ollama to the video group?

> What is your use case? Can't ollama access the GPU without it?

> Yes, ollama needs access to /dev/kfd to utilize the GPU.

> Done!

> Thank you!
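For anyone hitting the same issue: the fix discussed in the thread (granting the service user access to the AMD GPU device) can be sketched as follows. This assumes the package creates a service user named `ollama` and that `/dev/kfd` is owned by the `video` group — check your system first, as some setups use the `render` group instead:

```shell
# ROCm exposes the GPU through /dev/kfd; see which group owns it.
ls -l /dev/kfd

# Add the ollama service user to that group
# (assumes the packaged default user name "ollama").
sudo usermod -aG video ollama

# Restart the service so the new group membership takes effect.
sudo systemctl restart ollama
```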