Ollama
https://ollama.com
Ollama is a local runtime and CLI for running and managing large language models (LLMs) directly on your machine.
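A typical first session with the Ollama CLI might look like the following sketch (the model name is only an example; any model from the Ollama library works):

```shell
# Pull a model from the Ollama library (model name is an example)
ollama pull llama3.2

# Start an interactive chat with the pulled model
ollama run llama3.2

# List models available locally
ollama list
```

`ollama serve` starts the HTTP API daemon; on openSUSE this package instead runs it via the bundled systemd unit (ollama.service).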
- Devel package for openSUSE:Factory
- 14 derived packages
- Links to openSUSE:Factory / ollama (has a link diff)
Checkout Package

    osc -A https://api.opensuse.org checkout science:machinelearning/ollama && cd $_
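After checkout, the package can be built locally with osc. The repository and architecture below are only examples; `osc repos` lists the actual build targets configured for this package:

```shell
# List the build targets configured for this package
osc repos

# Local build against an example target (adjust repository/arch as needed)
osc build openSUSE_Tumbleweed x86_64
```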
Source Files (merged sources derived from the linked package)
| Filename | Size |
|---|---|
| _link | 124 Bytes |
| _scmsync.obsinfo | 149 Bytes |
| _service | 302 Bytes |
| fix-mlxrunner-tests.diff | 253 Bytes |
| ollama-0.17.6.tar.gz | 22.1 MB |
| ollama-user.conf | 129 Bytes |
| ollama.changes | 76.1 KB |
| ollama.service | 232 Bytes |
| ollama.spec | 7.1 KB |
| sysconfig.ollama | 1.39 KB |
| vendor.tar.zstd | 8.44 MB |
Comments (5)
Maybe add ollama to the video group?
What is your use case? Can't ollama access the GPU without it?
Yes, ollama needs access to /dev/kfd to utilize the GPU.
Done!
thank you!!!
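To verify the situation discussed in the thread above, a quick check of the AMD KFD device node and the service user's supplementary groups might look like this (the `ollama` user name is an assumption, taken from the ollama-user.conf file shipped in this package):

```shell
# Check whether the AMD KFD compute device exists and who may access it
if [ -e /dev/kfd ]; then
  KFD_STATUS="present"
  ls -l /dev/kfd
else
  KFD_STATUS="absent"
  echo "/dev/kfd not present (no AMD GPU / ROCm stack on this machine)"
fi

# Show the service user's groups; the 'ollama' user name is assumed here
id ollama 2>/dev/null || echo "no dedicated ollama user on this system"
```

If `/dev/kfd` is owned by a group such as `video` or `render` and the ollama user is not in it, adding the user to that group is what the comment thread resolved.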